Big Data Framework Developer
• Minimum 1 year of experience building and deploying Java applications
• Minimum 1 year of experience building and coding applications using at least two Hadoop components – MapReduce, HDFS, HBase, Pig, Hive, Spark, Sqoop, Flume, etc.
• Minimum 1 year of coding experience in at least one of the following: Python, Pig, Hadoop Streaming, HiveQL
• Minimum 1 year of experience implementing relational data models
• Minimum 1 year of experience with traditional ETL tools and RDBMSs
• Minimum of a Bachelor’s degree or 3 years of IT/programming experience
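As a rough illustration of the Hadoop Streaming and Python skills listed above, the sketch below simulates a streaming word-count job locally: a mapper emits (word, 1) pairs and a reducer sums counts over sorted keys, mirroring the map → shuffle/sort → reduce flow. Function names here are illustrative, not part of any Hadoop API.

```python
import sys
from itertools import groupby

# Mapper stage: read input lines and emit (word, 1) pairs,
# as a Hadoop Streaming mapper would write "word\t1" to stdout.
def map_lines(lines):
    for line in lines:
        for word in line.strip().lower().split():
            yield (word, 1)

# Reducer stage: Hadoop Streaming delivers mapper output sorted by key,
# so consecutive identical keys can be summed with groupby.
def reduce_pairs(pairs):
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield (word, sum(count for _, count in group))

if __name__ == "__main__":
    # Local simulation of the map -> shuffle/sort -> reduce pipeline.
    lines = ["big data big wins", "data pipelines"]
    counts = dict(reduce_pairs(sorted(map_lines(lines))))
    print(counts)  # {'big': 2, 'data': 2, 'pipelines': 1, 'wins': 1}
```

In a real cluster, the mapper and reducer would be separate scripts reading stdin and writing stdout, passed to the `hadoop jar hadoop-streaming.jar` command via `-mapper` and `-reducer`.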
- Deliver large-scale programs that integrate processes with technology to help clients achieve high performance.
- Design, implement, and deploy custom applications on Hadoop.
- Implement complete Big Data solutions, including data acquisition, storage, transformation, and analysis.
- Design, implement, and deploy ETL processes to load data into Hadoop.