Senior Data Engineer in Lehi, UT

$80K - $100K (Ladders Estimates)

Workfront

Lehi, UT 84043

Industry: Information Technology


Less than 5 years

Posted 55 days ago

Come join Workfront, one of the hottest companies in cloud computing as recognized by Forbes magazine! Located in the heart of Silicon Slopes, Workfront is the leader in enterprise work and project management. Meeting our mission to become the authoritative source for work, hundreds of thousands of enterprise users leverage Workfront's SaaS solution to make their work faster and more efficient, taking projects to a higher level.

Job Summary


As part of the Data & Analytics team, the Senior Data Engineer helps leaders and analysts throughout the organization get the data they need to further accelerate our growth. The team helps the business define its analytics needs and translates them into analytics that are both robust and easy for business users to understand. This position works closely with data warehouse engineers and analysts to understand what data needs to be available and how it will be consumed. The Senior Data Engineer will take these requirements and develop clean, efficient code that moves data from source systems into a data lake, enabling timely and actionable analytics across the organization.

Job Responsibilities

  • Enable the business to be more data-driven by building enterprise-class API data connectors to cloud business applications
  • Improve overall robustness and efficiency of existing data pipelines and operations
  • Automate monitoring of data lake objects and associated pipelines
  • Assist with operational support of data lake ETL and storage during normal work hours
  • Design and build integration solutions to meet business analytics needs
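The pipeline-robustness responsibilities above typically involve incremental rather than full loads. As a rough illustration only (all names and the in-memory source are hypothetical; a real connector would call a REST API), a watermark-based incremental extraction step might look like:

```python
# Hypothetical in-memory "source system"; a real connector would page through a REST API.
SOURCE_ROWS = [
    {"id": 1, "updated_at": "2019-08-01T00:00:00"},
    {"id": 2, "updated_at": "2019-08-15T00:00:00"},
    {"id": 3, "updated_at": "2019-09-01T00:00:00"},
]

def extract_incremental(rows, watermark):
    """Return only rows modified after the last successful load (the watermark).

    ISO-8601 timestamps compare correctly as strings, so no parsing is needed here.
    """
    return [r for r in rows if r["updated_at"] > watermark]

# First run: load everything newer than the stored watermark, then advance it
# so the next run picks up only fresh changes.
watermark = "2019-07-31T00:00:00"
batch = extract_incremental(SOURCE_ROWS, watermark)
if batch:
    watermark = max(r["updated_at"] for r in batch)

print(len(batch), watermark)
```

Persisting the watermark between runs (e.g. in a metadata table) is what keeps re-runs idempotent and cheap.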



Job Qualifications

  • 3-5 years of experience building data connectors to REST and SOAP APIs
  • Strong understanding of JSON and XML
  • Extensive background developing data pipelines in Python
  • Past experience working with Linux automation using bash and cron
  • Working knowledge of code repositories (preferably Git)
  • Bachelor's degree in computer science or a related field
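The JSON/XML requirement above amounts to normalizing differently shaped API payloads into one record format. A minimal standard-library sketch (payloads and field names are hypothetical) of parsing both formats into the same shape:

```python
import json
import xml.etree.ElementTree as ET

def records_from_json(payload):
    """Parse a JSON API response into a list of {id, name} dicts."""
    return [{"id": r["id"], "name": r["name"]} for r in json.loads(payload)["items"]]

def records_from_xml(payload):
    """Parse an equivalent XML response into the same record shape."""
    root = ET.fromstring(payload)
    return [
        {"id": int(item.get("id")), "name": item.findtext("name")}
        for item in root.findall("item")
    ]

# Two sources, two wire formats, one normalized output for the data lake.
json_payload = '{"items": [{"id": 1, "name": "alpha"}]}'
xml_payload = '<items><item id="1"><name>alpha</name></item></items>'

assert records_from_json(json_payload) == records_from_xml(xml_payload)
```

Converging on one internal record shape early is what lets downstream pipeline code stay agnostic to whether a source spoke REST/JSON or SOAP/XML.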


Bonus Skills

  • Experience with AWS services such as Redshift, EC2, S3, Athena, etc.
  • Experience with message queues such as Kafka, RabbitMQ, or AWS Kinesis
  • Familiarity with common enterprise APIs like Salesforce, Marketo, Netsuite, etc.
  • Experience working with columnar file types like Parquet or ORC
  • Experience with big data technologies like Presto, Spark, Hadoop, etc.
  • Knowledge of data warehouse methodologies
  • Experience automating data science models and data pipelines

Valid Through: 2019-10-18