Ready to make an impact? If so, read on!
As a Big Data Developer, you will help deliver GEICO’s Business Intelligence efforts. You will be responsible for managing existing data extraction jobs and will also play a vital role in building new data pipelines from a variety of structured and unstructured sources into Hadoop. You will work with heterogeneous data models, mapping, and transformation, as well as various object-relational (O/R) mapping tools, and should be familiar with Data as a Service (DaaS).
Would you like to join this innovative team? If so, do you meet these qualifications?
• Bachelor’s degree is required
• A minimum of 3 years of experience with HBase, Hive, and MapReduce (MRv1/MRv2) is required
• A minimum of 2 years of experience working with Hadoop
• Experience with Apache Spark, Storm, and Kafka is preferred
• Experience in integrating heterogeneous applications is required
• Experience working with a Systems Operations department to resolve a variety of infrastructure issues
• Experience designing and supporting RESTful Web Services is required
• Knowledge and understanding of SDLC and Agile/Scrum procedures, processes, and practices is preferred