Big Data Lead

Confidential Company  •  Atlanta, GA

8 - 10 years experience  •  IT Consulting/Services

Posted on 10/12/17 by Cynet Systems

We are looking for a Big Data Lead for our client in Atlanta, GA.

Job Title: Big Data Lead

Job Location: Atlanta, GA

Job Type: Contract – 12 Months / Contract to Hire / Direct Hire

Job Description:

  • Years of experience: 10 years
  • Primary Skills: Apache NiFi, Kafka, Flume, Sqoop, Apache Atlas, Hive, HDFS, HBase and Spark (Hortonworks HDP and HDF preferred).
  • Lead the Big Data engineering team, with specialization in data ingestion (from 100+ source systems in batch and near real-time), egestion, and governance.
  • Participate in collaborative software and system design and development of the new NCR Data Lake on Hortonworks HDP and HDF distributions.
  • Manage own learning and contribute to technical skill building of the team.
  • Inspire and cultivate the engineering mindset and systems thinking.
  • Gain deep technical expertise in the data movement patterns, practices and tools.
  • Play active role in Big Data Communities of Practice.
  • Put the minimal system needed into production.

Required Qualifications

  • Bachelor’s degree or higher in Computer Science or a related field.
  • Good understanding of distributed computing and big data architectures.
  • Passion for software engineering and craftsman-like coding prowess.
  • Proven experience in developing Big Data solutions in Hadoop Ecosystem using Apache NiFi, Kafka, Flume, Sqoop, Apache Atlas, Hive, HDFS, HBase and Spark (Hortonworks HDP and HDF preferred).
  • Experience with at least one of the leading CDC (Change Data Capture) tools like Informatica PowerCenter. 
  • Development experience with at least one NoSQL database. HBase or Cassandra preferred.
  • Polyglot development (4-5 years+): Capable of developing in Java and Scala, with a good understanding of functional programming, SOLID principles, concurrency models, and modularization.
  • DevOps: Appreciates the CI and CD model and always builds to ease consumption and monitoring of the system. Experience with Maven (or Gradle or SBT) and Git preferred.
  • Experience in Agile development, including Scrum and other lean techniques.
  • Should believe in the "You Build It! You Ship It! You Run It!" philosophy.
  • Personal qualities such as creativity, tenacity, curiosity, and passion for deep technical excellence.

Desired Qualifications

  • Experience with Big Data migrations/transformations programs in the Data Warehousing and/or Business Intelligence areas.
  • Experience with ETL tools like Talend, Pentaho, Attunity, etc.
  • Knowledge of Teradata, Netezza, etc.
  • Good grounding in NoSQL data stores such as Cassandra, Neo4j, etc.
  • Strong knowledge of computer algorithms.
  • Experience with workload orchestration and automation tools like Oozie, Control-M etc.
  • Experience in building self-contained applications using Docker, Vagrant, and Chef.
