AIG is seeking a Big Data Hadoop ETL developer to join a team designing, developing, and deploying ETL solutions on a big data platform for the IT Life, Health and Disability pillar of the organization.
The candidate will be part of the Information Management team responsible for architecting and developing a Data Integration Hub. The hub enables AIG to leverage data as an asset by providing a single authoritative view of the business and a layer of separation between our complex data sources and our data consumers, allowing each data layer to evolve independently.
- Architect, design, construct, test, tune, and deploy ETL infrastructure based on Hadoop-ecosystem technologies.
- Work closely with administrators, architects, and application teams to ensure applications are performing well and within agreed-upon SLAs.
- Work closely with Management and Data Science teams to achieve company business objectives.
- Collaborate with other technology teams and architects to define and develop solutions.
- Deploy ETL code that aligns with the ETL target state architecture standards and development standards.
- Research and experiment with emerging ETL technologies and tools related to Big Data.
- Contribute to the Big Data open source ecosystem.
- Work with the team to establish and reinforce disciplined software development processes, standards, and error-recovery procedures, ensuring a high degree of data quality.
- Maintain, tune, and support the ETL platform on a day-to-day basis to ensure high availability.
Qualifications desired:
- Experience within the Life Insurance industry preferred.
- Excellent technical and organizational skills.
- Strong communication and leadership skills.
- Proficiencies with Agile development practices.
- Experience with and strong understanding of Data Warehousing and Big Data Hadoop ecosystems.
- Experience translating functional and technical requirements into technical specifications and design.
- Knowledge of and experience with the full life cycle from ELT into the Data Lake to ETL for the data-servicing layer.
- Experience with ELT/ETL batch, real-time, streaming, and messaging.
- Experience with Talend (required); Ab Initio and Informatica/Data Exchange a plus.
- Experience with Hadoop MapReduce loading into a Data Warehouse.
- Experience with HBase, Cassandra, DynamoDB, CouchDB a plus.
- Experience with RDBMS technologies and SQL; Oracle and SQL Server a plus.
- Experience with NoSQL platforms a plus.
- Experience with cloud solutions, including administration and deployment of data in AWS or Azure, a plus.
- Working knowledge of ACORD modeling a plus.
- Working knowledge of web technologies and protocols (Java, NoSQL, JSON, REST, JMS) a plus.
Background and experience desired:
- The position calls for a seasoned IT professional with senior leadership qualities and a minimum of 5 years of experience in IT.
- 3+ years of experience building and managing complex ETL infrastructure solutions.
- 1+ years of experience with distributed, highly scalable, multi-node environments utilizing Big Data ecosystems.
- Bachelor's degree in Information Technology or related field preferred, or equivalent work experience.