Architect and operate high-quality, large-scale, multi-geo data pipelines that drive business decisions.
- Redesign data pipelines using applicable DBR features, incorporating external tools where necessary, to achieve better reliability and tighter SLAs.
- Establish conventions and new APIs for logging feature usage for PM use cases.
- Define clear SLAs for each production data pipeline.
- Improve test coverage (90%+) for data pipelines; establish best practices and frameworks for unit, functional, and integration tests.
- Build CI and deployment processes and best practices for the production data pipelines.
- Reduce overall alert noise and increase responsiveness by rethinking the current alert categories and priorities.
- Design schemas for financial, sales and support data in the data warehouse.
- Experience building, shipping, and operating multi-geo data pipelines at scale.
- Experience working with and operating workflow or orchestration frameworks, including open-source tools like Airflow and Luigi or commercial enterprise tools.
- Experience with large-scale messaging systems such as Kafka or RabbitMQ, or commercial systems.
- Excellent communication skills (writing, conversation, presentation); consensus builder.
- Strong analytical and problem-solving skills.
- Passion for data engineering and for enabling others by making their data easier to access.
- Medical, dental, vision
- 401k Retirement Plan
- Unlimited Paid Time Off
- Catered lunch (every day), snacks, and drinks
- Gym reimbursement
- Employee referral bonus program
- Awesome coworkers
- Maternity and paternity plans