Hadoop Developer in Charlotte, NC

$100K - $150K (Ladders Estimates)

Bank of America Corporation • Charlotte, NC 28202

Industry: Finance & Insurance • Experience: 5 - 7 years

Posted 29 days ago

Job Description:

Position Summary

Hadoop Developer able to work on one or more projects in the Hadoop data lake, owning technical deliverables as required by the business.

Essential Duties and Responsibilities:

The following is a summary of the essential functions for this job. Other duties, both major and minor, that are not mentioned below may be performed, and specific activities may change from time to time.

1. Sound understanding of and experience with the Hadoop ecosystem (Cloudera). Able to understand and explore the constantly evolving tools within the Hadoop ecosystem and apply them appropriately to the problems at hand.

2. Experience working with a Big Data implementation in a production environment

3. Experience with HDFS, MapReduce, Hive, Impala, and Linux/Unix technologies is mandatory

4. Experience with Flume/Kafka/Spark is an added advantage

5. Experience in Unix shell scripting is mandatory

6. Able to analyze existing shell script, Python, or Perl code to debug issues or enhance it (a brief illustrative sketch of this kind of glue code follows this list)

7. Sound knowledge of relational databases (SQL) and experience with large SQL-based systems.

8. Strong IT consulting experience across data warehousing engagements, handling large data volumes and architecting big data environments.

9. Deep understanding of algorithms, data structures, performance optimization techniques and software development in a team environment.

10. Benchmark and debug critical issues with algorithms and software as they arise.

11. Lead and assist with the technical design/architecture and implementation of the big data cluster in various environments.

12. Able to guide and mentor the development team, for example in creating custom common utilities/libraries that can be reused across multiple big data development efforts.

13. Exposure to ETL tools (e.g. DataStage) and NoSQL databases (HBase, Cassandra, MongoDB)

14. Work with line of business (LOB) personnel, external vendors, and internal Data Services team to develop system specifications in compliance with corporate standards for architecture adherence and performance guidelines.

15. Provide technical resources to assist in the design, testing and implementation of software code and infrastructure to support data infrastructure and governance activities.

16. Support multiple projects with competing deadlines
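
As referenced in item 6, the sketch below illustrates the kind of shell/Python glue code used on a Hadoop edge node: a small Python wrapper that runs a HiveQL statement through the hive CLI and propagates a failing exit code so a calling shell script or scheduler can trap it. This is a minimal sketch only; it assumes the hive command is on the PATH, and the database, table, and partition names are hypothetical.

    #!/usr/bin/env python3
    """Minimal sketch: run a HiveQL statement via the hive CLI and check its exit code."""
    import subprocess
    import sys


    def run_hive_query(query: str) -> str:
        """Execute a HiveQL statement with `hive -e` and return its stdout.

        Raises subprocess.CalledProcessError on a non-zero exit code, so the
        caller can handle failures the same way a shell wrapper would.
        """
        result = subprocess.run(
            ["hive", "-e", query],
            capture_output=True,
            text=True,
            check=True,
        )
        return result.stdout


    if __name__ == "__main__":
        # Hypothetical example: count the rows loaded into one partition.
        sample_query = (
            "SELECT COUNT(*) FROM example_db.transactions "
            "WHERE load_date = '2019-10-14'"
        )
        try:
            print(run_hive_query(sample_query).strip())
        except subprocess.CalledProcessError as err:
            # Surface the Hive error and propagate the failing exit code
            # so an upstream scheduler or shell wrapper can detect it.
            sys.stderr.write(err.stderr)
            sys.exit(err.returncode)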


Required Skills and Competencies:

1. Bachelor's degree in a technical or business-related field, or equivalent education and related training

2. Seven years of experience with data warehousing architectural approaches and a minimum of four years in big data (Cloudera)

3. Exposure to and strong working knowledge of distributed systems

4. Excellent understanding of client-service models and customer orientation in service delivery

5. Ability to grasp the 'big picture' for a solution by considering all potential options in the impacted area

6. Aptitude to understand and adapt to newer technologies

7. Assist in the evaluation of new solutions for integration into the Hadoop Roadmap/Strategy

8. Ability to motivate internal and external resources to deliver on project commitments

9. The desire to learn new soft and technical skills, and to coach, mentor, and train peers throughout the organization

10. The ability to work with teammates in a collaborative manner to achieve a mission

11. Presentation skills to prepare and present to large and small groups on technical and functional topics

Desired Skills:

1. Previous experience in the financial services industry

2. Broad BofA technical experience, a good understanding of existing testing/operational processes, and an open mind about how to enhance them

3. Understanding of industry trends and relevant application technologies

4. Experience in designing and implementing analytical environments and business intelligence solutions

Valid Through: 2019-11-13