As a Data Platform Engineer at Hailify, you will architect and implement cloud-native data pipelines and infrastructure to enable analytics and machine learning on large, rich datasets. You will be responsible for ensuring datasets are appropriately sourced, extracted, cleansed/transformed, validated/reconciled, and made available for analytical and machine learning purposes.
- Architect a system: Design and propose a data reporting environment across all of Hailify’s data systems, potentially including data warehouses, data lakes, master data management systems, etc.
- Build a data pipeline: Scope, define, and document data warehousing and extract/transform/load (ETL/ELT) requirements, then carry them through implementation and testing. Provide production support to ensure stability and efficiency.
- Reconcile/QA data: Develop automated systems to monitor data integrity/quality and data lineage. Validate and reconcile large data sets.
- Find data platform solutions: Lead the research, design, and implementation of maintainable, scalable, reusable and performant software solutions that meet functional and non-functional requirements and that are aligned with our strategic direction.
- Strive for continuous improvement: Contribute to the continuous improvement of our overall data infrastructure. Identify ways data can be more accurate, trustworthy, and timely. Provide thought leadership around best practices and emerging concepts in the data analytics domain. Find opportunities for automating operational database activities to enable the organization to scale and reduce costs.
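As a flavor of the automated data-quality and reconciliation work described above, here is a minimal sketch in plain Python. The record shapes, field names (`id`, `amount`), and check logic are hypothetical illustrations, not Hailify's actual tooling or stack:

```python
# Hypothetical sketch of automated data-quality checks: validating a batch
# of records and reconciling a source extract against its loaded target.

def validate_records(records, required_fields):
    """Return (index, issue) pairs for rows failing basic integrity checks."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(records):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        rid = row.get("id")
        if rid in seen_ids:
            issues.append((i, f"duplicate id {rid}"))
        seen_ids.add(rid)
    return issues

def reconcile(source, target, amount_field="amount"):
    """Compare row counts and control totals between source and target."""
    return {
        "row_count_match": len(source) == len(target),
        "total_match": sum(r[amount_field] for r in source)
                       == sum(r[amount_field] for r in target),
    }

source = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": 5.5},
    {"id": 2, "amount": 3.0},   # duplicate id -> flagged
    {"id": 4, "amount": None},  # missing amount -> flagged
]
target = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]

print(validate_records(source, ["id", "amount"]))
print(reconcile(target, target))
```

In production, checks like these would typically run inside the pipeline orchestrator after each load, with failures alerting rather than printing.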
- 5+ years of experience in data management, particularly in ETL/ELT, data integration, data engineering, data design and/or data modeling
- Experience in building, improving, and maintaining complex ETL and/or next-gen ELT pipelines
- Excellent SQL skills, including writing stored procedures, queries, views, user-defined functions, cursors, and common table expressions
- Keen understanding of data structures, data modeling, and software architecture
- Experience building cloud-native applications and with supporting technologies, patterns, and practices, including AWS/GCP/Azure, Docker, CI/CD, DevOps, and microservices
- Working knowledge of programming or scripting languages (e.g., Python)
- Degree in Computer Science, Engineering, Information Technology, or a similar field preferred
- Comfortable working during India (primary) / United States Eastern (secondary) business hours