A Hadoop Developer is responsible for the actual coding and programming of Hadoop applications (Sqoop, YARN, Flume, Spark with Scala, HDFS). The exact responsibilities depend on your domain or sector; some of the duties below will apply and others may not.
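As a rough illustration of the day-to-day coding this involves, here is a minimal Spark (Scala) sketch that reads event logs from HDFS, aggregates them, and writes the result back. The paths, column names, and application name are hypothetical, not taken from any particular project.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.count

object EventCounts {
  def main(args: Array[String]): Unit = {
    // Spark session; the cluster master is supplied by spark-submit.
    val spark = SparkSession.builder()
      .appName("EventCounts")
      .getOrCreate()

    // Hypothetical HDFS input: CSV event logs with a header row.
    val events = spark.read
      .option("header", "true")
      .csv("hdfs:///data/raw/events/*.csv")

    // Count events per type and write the result back to HDFS as Parquet.
    events.groupBy("event_type")
      .agg(count("*").as("n_events"))
      .write
      .mode("overwrite")
      .parquet("hdfs:///data/curated/event_counts")

    spark.stop()
  }
}
```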
In this role, the selected candidate will develop and recommend strategies and specifications for technical data solutions based on analysis of business goals, objectives, needs, and the existing data systems infrastructure.
Code, test, modify, debug, document, and implement Ab Initio graphs using the GDE (Graphical Development Environment) and EME, and develop shell scripts to automate graph execution in a UNIX environment.
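A minimal sketch of the UNIX shell automation described above, assuming the graph has already been deployed from the GDE as a runnable .ksh script; the directory layout, graph name, and log location are all hypothetical.

```sh
#!/bin/ksh
# Minimal wrapper to automate a deployed Ab Initio graph under UNIX.
# RUN_DIR, LOG_DIR, and the graph name are hypothetical placeholders.

RUN_DIR=/apps/abinitio/run           # directory holding deployed .ksh scripts
LOG_DIR=/apps/abinitio/logs
GRAPH=load_customer_dim.ksh          # graph deployed from the GDE as a ksh script

TS=$(date +%Y%m%d_%H%M%S)
LOG_FILE=$LOG_DIR/${GRAPH%.ksh}_$TS.log

# Execute the deployed graph, capturing stdout/stderr for support review.
"$RUN_DIR/$GRAPH" > "$LOG_FILE" 2>&1
RC=$?

if [ $RC -ne 0 ]; then
    echo "Graph $GRAPH failed with exit code $RC; see $LOG_FILE" >&2
    exit $RC
fi

echo "Graph $GRAPH completed successfully; log: $LOG_FILE"
```

Such a wrapper is typically scheduled (for example via cron or an enterprise scheduler) so that graph runs are unattended and their logs are retained for troubleshooting.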
In this role, you will be responsible for designing, developing, and maintaining customer-facing software, and you will be involved in everything from project conception to algorithmic design, code implementation, maintenance, and improvement.
Develop the necessary project storyboards, functional specifications, and systems and end-user documentation by meeting with customers and clients, capturing the requirements they specify, and translating those requirements into technical documentation and specifications.
This position is responsible for developing enterprise-class web applications and for the installation, configuration, and support of web applications and web services, following best practices and industry standards.
You will contribute significantly to identifying best-fit architectural solutions for one or more projects, develop application designs, and provide regular support and guidance to project teams on complex coding, issue resolution, and execution.