Senior Data Engineer - Infrastructure

OfferUp  •  Bellevue, WA

Industry: Technology  •  5 - 7 years

Posted 80 days ago

We are looking for an experienced Data Engineer to build, operate, and scale OfferUp’s batch and near real-time data processing platforms and tools, which power data-driven capabilities throughout the entire organization. This is a highly visible opportunity to build systems that drive the end-user experience and directly support backend engineers, business intelligence dashboards and reports, data analysts, and data scientists. Building the largest and most responsive mobile marketplace poses unique data challenges that require leveraging the latest developments in data infrastructure. We use open source infrastructure where we can, but are ready to build and share solutions where they don’t exist yet. Come help us do that!

Responsibilities:

  • Be the technical lead on the team, owning the architecture of the data solutions and the data platform.
  • Lead the team by example, mentor other engineers, and help them grow.
  • Drive engineering best practices, set standards, and propose larger projects that may require cross-team collaboration.
  • Design and develop applications to process large amounts of critical information in batch and near real-time to power user-facing features.
  • Influence technical direction for the company, leveraging your prior experience and helping evaluate emerging technologies and approaches.
  • Help bring engineering maturity to a growing team that is at the center of many critical initiatives for the company.

Requirements:

  • 5+ years of professional software development experience
  • Strong background in distributed systems and large-scale data processing
  • Ability to communicate technical information effectively to technical and non-technical audiences
  • Proficiency in Java, Scala and Python
  • Experience leveraging open source data infrastructure projects, such as Apache Spark, Airflow, Kafka, Flink, Samza, Avro, Parquet, Hadoop, Hive, HBase, Phoenix, Presto or Druid
  • Experience building data pipelines and real-time data streams
  • Experience building software in AWS or a similar cloud environment
  • Experience with AWS services such as EMR, Kinesis, Firehose, Lambda, SageMaker, Athena, or Elasticsearch is a big plus
  • Computer Science or Engineering degree required; Master's degree preferred
  • Must be eligible to work in the United States