Sr Hadoop Developer / Admin

ArrowCore Group  •  Atlanta, GA

8 - 10 years experience  •  Consulting

$120K - $130K
Posted on 02/23/18 by Rohit Pandey

Sr Hadoop Developer / Admin

Direct Hire/Fulltime Role

Atlanta, GA

 

We are looking for an experienced Senior Hadoop Developer/Admin to help us develop solutions that cater to the data analytics needs of our products.

 

Tasks include:

•        Design, Document, Develop and unit test Streaming Applications with Spark Streaming, HBase, Cassandra and Kafka APIs

•        Design, Document, Develop and unit test Batch processing applications with Spark, Hive, HBase, Cassandra, Map-Reduce, HDFS and Hadoop frameworks and APIs

•        Design, Document, Develop and unit test search and indexing solutions with Solr/Elasticsearch APIs and tools

•        Design, Document, Develop and unit test Data models and applications for time-series data analytics solutions on streaming data

•        Work closely with Business and Technical Product owners on requirements and use cases

•        Design, Document, Develop and unit test data ingestion applications to ingest data from various source systems and third-party applications using Spark, Sqoop, Flume, NiFi, Hive, HBase, Cassandra, HDFS, Hadoop and third-party APIs

•        Design and Document solutions for Streaming, Batch, Ingestion, Indexing and Search use cases

•        Document Unit Test cases for applications and deployment plans for code migration

•        Partner with the Admin teams on deployments and warranty support activities

•        Partner with QA on Integration, System and Performance testing of applications

•        Read and write code down to a white-box level and provide clear solutions

•        Develop and unit test batch scripts using UNIX, Java, JavaScript, Spark, Hive, Pig and HBase for batched data movement

•        Design, Document, Develop and unit test analytics applications using Spark SQL, Pig, Hive, HDFS, HBase and Impala

•        Develop and unit test scripts using Java, JavaScript, UNIX, etc., for local file processing and file transfer processes

•        Participate actively in day-to-day agile processes

 

Required Qualifications:

•        Experience: 7-10 years of experience with Data Analytics platforms using Big Data technologies of the Hadoop ecosystem; strong experience in real-time analytics using the Spark stream-processing framework with Kafka message queues is mandatory

•        OS: Linux, Windows

•        Coding languages: Java, Scala, Spark, Spark Streaming with proficiency in RDDs, DataFrames and Datasets, Map-Reduce

•        NoSQL: Knowledge of HBase and Cassandra is strongly preferred

•        SQL interfaces: Spark SQL, Hive, CQL, Impala

•        Batch Querying: Hive, Pig, Impala

•        Search and Indexing: Solr and Elasticsearch

•        Message Queues: Kafka

•        Scripting: JavaScript, UNIX, Pig

•        Databases: Oracle, MySQL

•        Code Management: Jenkins, SVN, Git

•        Agile Tools: Jira

•        Proven analytical skills

•        Knowledge of business process development and improvement

•        Other preferred qualifications: Knowledge of Sqoop, Flume, NiFi, R and Python is preferred

 

Education:

•        B.S. or M.S. and 10+ years of experience

 
