The Climate Corporation’s mission is to help the world’s farmers sustainably increase their productivity with digital tools. The Data and Analytics team is focused on creating competitive advantage for Climate and our customers through novel data infrastructure, metrics, insights and data services. We are a small but rapidly growing analysis and engineering team that builds and leverages state-of-the-art analytics systems. Our work informs decisions and direction for our business, while also impacting our products. We are looking for a Data Engineer to not only build data pipelines to efficiently and reliably move data across systems, but also to build the next generation of data tools to enable us to take full advantage of this data. In this role, your work will broadly influence the company's products, data consumers and analysts. We are looking for a candidate with knowledge of data warehousing and experience with ETL tools.
What You Will Do:
- Help design and build a Business Analytics Warehouse. Build and maintain the core data model, ETL / ELT, core data metrics and data quality.
- Rapidly prototype new analytics views and work directly with stakeholders across multiple functions (Science, Marketing, Sales, Risk, Finance, Product)
- Champion data warehousing best practices
- Build systems to answer business questions in a timely fashion and expand our product features
- Architect, build and launch new data models that provide intuitive analytics to business users
- Develop infrastructure to report on key metrics, recommend changes and predict future results
- Work closely with other departments to gather new data and leverage existing data to make our products better for us and our users
- Build data expertise and own data quality for the pipelines you build
- Design and develop new systems and tools that enable teams to consume and understand data faster
- Provide expert advice and education to business users on the usage and interpretation of data systems
What You Bring:
- B.S. or B.A. in Computer Science/Applications or a related field
- 5+ years of experience with SQL and with dynamic or static programming languages, as applied to ETL/ELT tools (Informatica, Kettle, Talend, etc.)
- Experience with shell scripting in a Linux/Unix environment
- Experience with dimensional data modeling
- Experience with schema design in data warehouses, working directly with SQL to profile data and generate analytics
- Experience developing infrastructure in an AWS cloud environment to report on key metrics and recommend changes
- Experience with relational databases or NoSQL
- Experience developing models, Explores and dashboards in Looker
- Willingness to provide 24x7 on-call support one week per month
- 5+ years of experience with dimensional data modeling and schema design in data warehouses
- 5+ years of scripting experience
- Experience with massively parallel processing (MPP) relational databases is a big plus (Vertica, Redshift, Teradata, MemSQL)
- Experience working in a cloud deployment such as AWS is a plus
- Excellent communication skills, including the ability to identify and communicate data-driven insights