Data Engineer, User Research in Seattle, WA

$80K - $100K (Ladders Estimate)

Tableau Software

Seattle, WA 98160

Industry: Enterprise Technology


5 - 7 years

Posted 34 days ago

Are you a Data Engineer with experience in cloud technologies, a passion for data, and a deep care for your end users? The Tableau User Experience Research team is looking for a Data Engineer to help us turn data into actionable insights for product development.

In this role you will be responsible for developing and maintaining systems that enable and automate the movement of data across internal systems. The Data Engineer will work closely with User Researchers, Designers, Engineers, and others to structure data and data flows to support existing and new workloads in the service of User Research activities. We are looking for a Data Engineer who will partner with us through the entire life cycle of research - from data preparation through to helping us deliver insights generated from that data.

Primary Responsibilities...

  • Develop and maintain key features and services that move, store, and enable access to product usage data.
  • Implement data structures using best practices in data modeling and ETL processes.
  • Define and implement monitoring and alerting policies for data solutions.
  • Participate in code reviews and related processes.
  • Work with product managers, engineers, and internal customers to identify and scope requests and define implementation plans.
  • Partner with teams on modeling and analysis problems – including transforming problem statements into analysis problems.

Additional Responsibilities...

  • You are a Recruiter! Tableau hires company builders and, in this role, you will be asked to be on the constant lookout for the best talent to bring on-board to help us continue to build one of the best companies in the world!

Knowledge and Skill Requirements...

  • Experience building and optimizing data pipelines, architectures, and data sets. A successful history of manipulating, processing, and extracting value from large, disconnected data sets.
  • Technical. A technical background in data architecture, data pipeline architecture and development, and data warehousing concepts. Demonstrated experience writing Python code. Demonstrated experience writing complex, highly optimized SQL queries across large data sets. Experience with AWS services such as EC2, S3, IAM roles, ECS, and CloudWatch; familiarity with Git and Snowflake is handy but not required.
  • Interest in, or proficiency with, data science methods and tools.
  • Demonstrated ability to coordinate projects across functional teams, including engineering, IT, product management, marketing, and operations.
  • A willingness to jump in and help when needed, learn and teach new skills, and have the experience and professionalism required to meet objectives.
  • BA/BS in Computer Science, Data Science, or equivalent.
  • 4-6+ years of professional experience in an engineering role.
  • 1-2+ years of ETL development experience.
  • 1+ years of experience working with a cloud platform (such as GCP, AWS, or Azure).

Valid Through: 2019-11-08