Big Data Framework Developer

Geneva, IL

Industry: IT Consulting/Services

Experience: Less than 5 years

Posted 390 days ago by Deb Ash

JOB DESCRIPTION

• Minimum 1 year of experience building and deploying Java applications

• Minimum 1 year of experience building and coding applications using at least two Hadoop components: MapReduce, HDFS, HBase, Pig, Hive, Spark, Sqoop, Flume, etc.

• Minimum 1 year of coding experience in one of the following: Python, Pig, Hadoop Streaming, HiveQL

• Minimum 1 year of experience implementing relational data models

• Minimum 1 year of experience with traditional ETL tools and RDBMSs

• Bachelor's degree, or a minimum of 3 years of IT/programming experience

• Deliver large-scale programs that integrate processes with technology to help clients achieve high performance.
• Design, implement, and deploy custom applications on Hadoop.
• Implement complete Big Data solutions, including data acquisition, storage, transformation, and analysis.
• Design, implement, and deploy ETL processes to load data into Hadoop.


Salary: $110K–$120K