We are looking for Data Engineers who will help make our next decade just as revolutionary as our past. If you're one of the super-talented who thrive on change, aren't afraid to take risks, love to make a difference, and want to cultivate a better world, you're the right fit. Come grow with us.
The Data Engineer for Enterprise Data Management at Chipotle will assist in building an innovative enterprise data solution leveraging both cloud and on-premises technology. You will be responsible for building data engineering solutions and processes to enable analytics, business intelligence, MDM, and mobility. This role will be responsible for creating and maintaining new data pipelines to leverage data at scale.
WHAT YOU'LL DO
Chipotle is seeking a highly motivated, results-oriented individual with strong data engineering skills in the Microsoft stack and cloud technologies for the Enterprise Data and Business Intelligence function. This role will influence future processes and architecture to make a difference at the enterprise level. You will be part of an exciting journey at Chipotle. Responsibilities include:
- Design, develop, and maintain scalable data pipelines
- Develop data ingestion and integration processes (REST, SOAP, SFTP, MQ, etc.)
- Assist in technology discovery and implementation, both on-premises and in the cloud (e.g., Azure or AWS), to build solutions for future systems
- Develop high-performance scripts in SQL, Python, etc. to meet enterprise data, BI, and analytics needs
- Incorporate standards and best practices into engineering solutions
- Manage code versions in source control and coordinate changes across the team
- Participate in architecture design and discussions
- Provide logical and physical data design, and database modeling
- Be part of the Agile team, collaborating to help shape requirements
- Coordinate with senior resources to solve complex data issues around data integration, unusable data elements, unstructured data sets, and other data processing incidents
- Support the development and design of the internal data integration framework
- Work with system owners to resolve source data issues and refine transformation rules
- Partner with enterprise teams, data scientists, and architects to define requirements and solutions
WHAT YOU'LL BRING TO THE TABLE
- Have a B.A./B.S. and 2+ years of relevant work experience, or an equivalent combination of education and experience
- Hands-on experience with the Microsoft stack (SSIS, SQL, etc.)
- Possess strong analytical skills with the ability to analyze raw data, draw conclusions, and develop actionable recommendations
- Experience with the Agile development process preferred
- Proven track record of excellence, with past projects consistently delivered successfully
- Hands-on experience with Azure Data Factory V2, Azure Databricks, SQL Data Warehouse or Snowflake, Azure Analysis Services, and Cosmos DB
- Experience with Python or Scala
- Understanding of continuous integration and continuous deployment on Azure
- Experience with large-scale data lake or data warehouse implementation on any of the public clouds (AWS, Azure, GCP)
- Have excellent interpersonal and written/verbal communication skills
- Manage financial information in a confidential and professional manner
- Be highly motivated and flexible
- Effectively handle multiple projects simultaneously and pay close attention to detail
- Have experience in a multi-dimensional data environment