Software Engineer, Data Platform

Outreach

Seattle, WA

Industry: Enterprise Technology

Posted 64 days ago

The Role


Data is at the core of Outreach's strategy. It drives our customers and ourselves to the highest levels of success. We use it for everything from customer health scores and revenue dashboards, to operational metrics of our AWS infrastructure, to increasing product engagement, to predictive analytics and causal inference via experimentation. As our customer base continues to grow, we are exploring new ways of leveraging our data to save our customers time and improve their sales efficiency.


About the Team


The mission of the Data Platform team is to accelerate the success of our internal and external customers through trustworthy data analysis and experimentation. As a member of the team, you will be on the ground floor, working directly with the VP of Data Science to define and implement our strategy for delivering data products. You will be responsible for delivering models, data-driven functionality, and end-user features built on those models, all deployed into production. This is a full stack role with strong backend and frontend components.

Your Daily Adventures Will Include

  • Own the core company data pipeline, scaling data processing flows to keep pace with data growth at Outreach
  • Implement systems to track and monitor data integrity, quality, and consistency
  • Develop frameworks and tools that support self-service data pipeline management (ETL)
  • Apply a wide range of big data technologies to improve data processing performance
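
As a flavor of the data-quality work described above, here is a minimal sketch of a batch-level quality check; the record schema (`EngagementEvent`), field names, and thresholds are hypothetical illustrations, not part of this role's actual codebase.

```python
from dataclasses import dataclass
from typing import Optional


# Hypothetical record schema for illustration only.
@dataclass
class EngagementEvent:
    user_id: Optional[str]
    event_type: str
    count: int


def check_quality(events):
    """Compute simple integrity/consistency metrics for a batch of events.

    Returns rates of null user IDs and negative counts, the kinds of
    signals a pipeline monitor might alert on.
    """
    total = len(events)
    null_ids = sum(1 for e in events if e.user_id is None)
    negative_counts = sum(1 for e in events if e.count < 0)
    return {
        "total": total,
        "null_id_rate": null_ids / total if total else 0.0,
        "negative_count_rate": negative_counts / total if total else 0.0,
    }
```

In practice, checks like this would run as a step in a workflow manager (e.g. Airflow) and feed dashboards or alerts rather than return a plain dict.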

Basic Qualifications

  • Extensive experience with Hadoop Ecosystem (MapReduce, Yarn, HDFS, Hive, Spark, Presto, Pig, HBase, Parquet)
  • Proficient in at least one SQL dialect (MySQL, PostgreSQL, SQL Server, Oracle)
  • Good understanding of SQL engine internals and the ability to conduct advanced performance tuning
  • Strong skills in a scripting language (Python, Ruby, Perl, or Bash)
  • Experience with workflow management tools (Airflow preferred)
  • Comfortable working directly with data analysts to bridge business requirements and data engineering