Can you design and develop a next-generation, real-time data pipeline that can handle billions of events a day? Are you excited about building self-service analytics around event streams? Do you love solving complex problems in a multi-functional environment? Do you want to be part of a team that is passionate about data and helps other teams make data-driven decisions? If so, we are looking for you.
About the Role
- Lead the event stream analytics initiative and create the foundation for event data collection at Zillow Group.
- Design and build a highly scalable and responsive platform to collect data across all brands and all devices (mobile apps, desktop …).
- Work with teams across Zillow Group to drive adoption of the new platform.
About the Team
We build the pipelines and processes responsible for daily ingestion of terabytes of data. We productionalize intelligent, data-driven systems to help Zillow capture strategic opportunities in the market. Our work enriches Zillow’s unparalleled living database of all homes and hundreds of millions of customers and empowers teams downstream to build analytics tools and products to delight our users.
- Small team = big impact. Engineering teams are highly decentralized to create the small-team speed and autonomy of a start-up environment, backed by big-company resources.
- Fast-moving, developer-driven organization full of brilliant and ambitious people.
Who You Are
- Experience building and shipping highly scalable clickstream data pipelines and analytics systems on distributed data systems and cloud platforms (AWS/Azure/GCP).
- Experience with Java, Scala, Python, etc.
- Solid experience with streaming technologies such as AWS Kinesis or Kafka.
- Experience building batch, real-time, and streaming analytics pipelines with data from event streams, NoSQL stores, and APIs.
- Experience with batch and stream processing technologies such as Spark, Kafka Streams, Samza, etc.
- CS degree or equivalent experience.