IT - Big Data Developer II - IV

Cincinnati Financial

Fairfield, OH

8 - 10 years

Posted 267 days ago

This job is no longer available.

Description

 

The Cincinnati Insurance Companies' IT department is currently seeking a developer III-IV to analyze, design, code, test and deploy software programs and applications. This includes researching, designing, documenting and modifying software specifications throughout the production lifecycle.

 

Responsibilities

  • Hadoop development and implementation
  • design, build, install, configure and support Hadoop
  • develop detailed system design documentation
  • write high-performance, reliable and maintainable code
  • create extract, transform and load (ETL) processes and programs in the appropriate environments
  • load data from disparate data sets
  • create source-to-target maps and transformation rules for data mart creation, aggregates and summary tables
  • collect appropriate metadata and document procedures
  • create reconciliation procedures and data checks to ensure data quality
  • design, develop and apply best practices and document ETL processes, including audit, balancing and controls
  • review schedules, objectives and priorities with the service manager
  • assist with routine department work as required
  • provide production support, including on-call rotation and technical assistance to end users and IT staff
  • suggest improvements to systems and procedures
  • mentor less experienced associates

 

Qualifications

 

Requirements:

  • at least seven years of software development experience
  • at least five years of experience with Java, including core Java programming, JavaScript, shell scripting and a good knowledge of Java design patterns
  • at least two years of experience with Hadoop, preferably the Hortonworks Data Platform (HDFS, YARN, Oozie and Hive)
  • hands-on experience extracting data from DB2 and MS SQL Server using Sqoop
  • hands-on experience writing Hive-based ETL jobs
  • knowledge of the HDFS file system and its commands
  • working knowledge of UNIX and proficiency in shell scripting
  • experience designing and developing exception handling, data standardization procedures and quality assurance controls
  • experience implementing and refining required business transformation rules and logic
  • experience analyzing source systems for data profiling and creating source-to-target mapping specifications
  • progressive experience in conceptual, logical and physical data modeling, and in the design, development and delivery of data-centric IT solutions, including relational databases, data warehouses and ETL/ELT processes

Preferred Skills:

  • Altova XML Spy
  • version control using TFS
  • scheduling Linux/UNIX shell scripts in Control-M
  • Apache Hue
  • Zeppelin
  • SQL tools such as SQuirreL and DBeaver
  • data modeling
  • insurance domain

1800231