Senior Big Data Engineer - B2B in San Francisco, CA

$100K - $150K (Ladders Estimates)

Glassdoor • San Francisco, CA 94102

Industry: Consumer Technology • 5 - 7 years

Posted 55 days ago

Looking for your next challenge? How about helping us disrupt the $90B+ talent acquisition market? At Glassdoor, more than 63 million unique users and thousands of business customers are working with us to revolutionize the way phenomenal employees find great jobs and companies they love.

We are looking for a hardworking Sr. Data Engineer to join our growing Data Engineering team. You have significant experience building scalable data platforms that enable business intelligence, analytics, data science, and data products. You have strong, hands-on expertise in a variety of current technologies and a proven ability to craft robust, scalable solutions, and you will work on Glassdoor's next-generation products. You have a passion for continuous quality improvement and thrive in a fast-paced environment. We work on fun data problems and partner with other engineering teams, data scientists, and business stakeholders to deliver end-to-end solutions.


Responsibilities

  • Design and develop big data applications using different technologies.
  • Develop logical and physical data models for big data platforms.
  • Automate workflows using Apache Airflow (see the sketch after this list).
  • Write data pipelines using Apache Hive, Apache Spark, and Apache Kafka.
  • Create solutions on AWS using services such as Lambda and API Gateway.
  • Provide ongoing maintenance and enhancements to existing systems, and participate in rotational on-call support.
  • Learn our business domain and technology infrastructure quickly and share your knowledge freely and actively with others in the team.

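For a concrete sense of the day-to-day work, here is a minimal sketch of the workflow automation described above, assuming Apache Airflow 2.x with Python; the DAG name, schedule, and spark-submit/hive commands are illustrative placeholders, not Glassdoor's actual pipelines:

    # Illustrative Airflow DAG: a daily batch pipeline that runs a Spark
    # ingest job and then refreshes a downstream Hive table. All job names
    # and paths are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_events_pipeline",  # hypothetical pipeline name
        start_date=datetime(2019, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Stage raw events into the data lake with a Spark batch job.
        ingest = BashOperator(
            task_id="spark_ingest",
            bash_command="spark-submit --master yarn jobs/ingest_events.py {{ ds }}",
        )

        # Rebuild the Hive reporting table once ingestion succeeds.
        publish = BashOperator(
            task_id="hive_publish",
            bash_command="hive -f sql/publish_daily_rollup.hql -d run_date={{ ds }}",
        )

        # Task ordering: ingest must finish before publish starts.
        ingest >> publish
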
Key Qualifications

  • 5+ years of hands-on experience with developing data warehouse solutions and data products.
  • 2+ years of hands-on experience developing distributed data processing platforms with Hadoop, Hive, or Spark, plus Airflow or another workflow orchestration solution.
  • 2+ years of hands-on experience modeling and designing schemas for data lakes or RDBMS platforms.
  • Experience with programming languages: Python, Java, Scala, etc.
  • Experience with scripting languages: Perl, Shell, etc.
  • Practical experience working with, processing, and managing large data sets (multi-TB/PB scale).
  • Exposure to test driven development and automated testing frameworks.
  • Background in Scrum/Agile development methodologies.
  • Capable of delivering on multiple challenging priorities with little supervision.
  • Excellent verbal and written communication skills.
  • Bachelor's Degree in Computer Science or equivalent experience.

Nice To Have

  • Experience building machine learning pipelines or data products.
  • Familiarity with AWS or GCS technologies.
  • Passion for, or past contributions to, open-source engineering projects.


Valid Through: 2019-10-18