Hadoop Data Engineer

Capgemini • IL

Industry: Accounting, Finance & Insurance • Experience: Less than 5 years

Position Type: Permanent/Full-time

Duties & Responsibilities:
• Design and develop cutting-edge analytics applications leveraging Big Data technologies: Hadoop, NoSQL, and in-memory data grids
• Automate deployment and configuration using open-source frameworks
• Act as the subject matter expert for Big Data platforms and technologies
• Work across IT teams to ensure code quality, performance, and scalability of deployed data products
• Perform other duties and/or special projects as assigned
Qualifications/Requirements:
• Bachelor's degree with a minimum of 2 years of IT experience in a quantitative field (such as Engineering, Computer Science, Statistics, or Econometrics); or, in lieu of a degree, a High School Diploma/GED and 5 years of experience in a quantitative field with programming (Java/J2EE) and data management experience
• Minimum 2 years of experience deploying BI and analytics solutions using Big Data technologies (such as MapReduce, Kafka, and HBase) in complex, large-scale environments, preferably 20 TB+
• Minimum 2 years of experience in at least 3 of the following: Pig, Sqoop, MapReduce, Kafka, Spark, Java
Desired Characteristics:
• Experience with the Hortonworks Hadoop 2.4.x distribution
• Demonstrated excellent planning and organizational skills
• Engaging personality with experience collaborating across teams of internal and external technical staff, business analysts, and software support and operations staff
• Experience with Agile project management methods and practices
• Familiarity with traditional BI solution architecture encompassing ETL, complex event processing (CEP), data warehousing (DW), and BI reporting (preferably in a Unix/Linux and Oracle environment)