ETL Developer / Data Engineer in Boise, ID

$100K - $150K (Ladders Estimate)

ClickBank

Boise, ID 83702

Industry: Enterprise Technology


5 - 7 years

Posted 70 days ago



We are looking for an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The hire will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.


  • Work with the business intelligence and ETL development teams to extract fact and dimension data from the OLTP system and present it in a format that supports data mining, analysis, predictive modeling, and reporting.

  • Develop and maintain ETL jobs that extract from source systems, transform data based on business rules and requirements, and load data into warehouse structures.
  • Troubleshoot data issues and implement fixes to ensure data accuracy.
  • Investigate production issues by identifying the system states (source systems, infrastructure, etc.) that prevent successful job completion.
  • Develop custom data marts using dimensional modeling.
  • Communicate effectively with both technical and non-technical staff members.


  • Four-year degree in Information Systems or Computer Science, or equivalent work experience.

  • Good understanding of SQL concepts, with at least five years of SQL experience.
  • Minimum of 5 years of working experience in Java.
  • Working experience with ETL or ELT tools such as Talend, Informatica, Denodo, or Lyftron.
  • Working experience with AWS, Azure, Google Cloud, or other cloud data architectures, and with real-time streaming technologies such as Kinesis or Kafka.
  • 2 or more years of experience with Snowflake, Redshift, Postgres, or MySQL databases.
  • Good communication and analytical skills.
  • Work experience in a continuous integration environment with build systems such as Maven, Jenkins, and Docker.
  • Experience using cloud deployment tools.
  • Willingness to work in a fast-paced development environment with scheduled release commitments.
  • Five years of experience working in a Unix environment including working with file editors (vim), shell scripting (bash), and file searching (grep, awk, sed).
  • Experience with source control and related tools (Git, Bitbucket, Stash, Artifactory).


  • Work experience in a data warehouse environment with a strong understanding of the principles of data warehousing (star schema, slowly changing dimensions, etc.).

  • Knowledge of advanced database tuning concepts and SQL optimization.
  • Knowledge of the AWS CLI (S3).
  • Experience working with database standards and conventions.
  • Understanding of big data concepts.
  • Some experience with NoSQL or serverless databases (e.g., Cassandra, DynamoDB).
  • Experience writing unit tests.
  • Experience working in an Agile Scrum development environment.
  • Familiarity with team collaboration tools (JIRA, Confluence, Slack).
  • Experience with data searching using Splunk.
  • Experience collaborating with BI teams that use tools such as Power BI, Domo, and Tableau.

Valid Through: 2019-09-05