This role functions as a core member of an agile team at the intersection of enterprise services, analytics, and our data platform. The team is responsible for the care and growth of the infrastructure and solutions that provide insight from our data platform and other sources. They enable and develop big data and batch/real-time analytical solutions that leverage emerging technologies. The candidate will assist in designing, building, and testing data ingestion and ETL programs from a variety of source systems (both structured and unstructured).
- BS in Computer Science, Engineering or other relevant combination of training and education
- 3-5 years' experience configuring and developing big data solutions in a cloud environment (AWS, Cloudera, etc.)
- 2-5 years' experience in software development and coding in one of the following languages: C#, Python, Node.js, Java, etc.
- Extensive experience with Agile methodologies
- Strong Data Modeling skills
- Experience in designing and operating Data related solutions (NoSQL, Relational, Graph, etc.)
- Proficient in Big Data technologies like Spark, Hadoop, etc.
- Good working experience with core AWS services, including Amazon Redshift, AWS Glue, and other data storage tools
- Excellent communication and interpersonal skills.
Nice to Have
- AWS certifications are a huge plus
- Experience developing complex technical and ETL programs within an Enterprise Data Platform
- Familiarity with Docker and container-based solutions
- Strong analytical skills