Job Summary
We are looking for bright, driven, and talented individuals to join our team of passionate and innovative software engineers. In this role, you will use your experience with Java, Ignite, Big Data, and Streaming technologies to build a lending platform based on a data lake.
Job Duties
- Developing and deploying distributed computing Big Data applications using Apache Spark
- Processing large amounts of data using Java (see the illustrative Spark sketch after the Must Haves item below)
- Leveraging DevOps techniques and practices like Continuous Integration
- Help drive cross-team design and development via technical leadership and mentoring
- Work with business partners to develop business rules and business rule execution
- Perform process improvement and re-engineering with an understanding of technical problems and solutions as they relate to the current and future business environment
- Design and develop innovative solutions for demanding business situations
- Analyze complex distributed production deployments and make recommendations to optimize performance
Must Haves
- At least 8-10 years of professional, hands-on work experience in Java/J2EE; strong knowledge of Object-Oriented Analysis and Design, Software Design Patterns, and Java coding principles
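For illustration only (not a requirement of the role): a minimal sketch, assuming a Parquet-based data lake and Spark's Java API, of the kind of distributed batch job the duties above describe. The class name, paths, and column names are hypothetical placeholders.

```java
// Illustrative only: a minimal Spark batch job in Java that aggregates
// loan records from a data lake path. All names are hypothetical.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.avg;
import static org.apache.spark.sql.functions.col;

public class LoanAggregation {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("LoanAggregation")
                .getOrCreate();

        // Read Parquet files from the data lake (path is a placeholder).
        Dataset<Row> loans = spark.read().parquet("s3a://data-lake/loans/");

        // Aggregate average loan amount per product as a simple example
        // of distributed batch processing.
        Dataset<Row> byProduct = loans
                .groupBy(col("product_type"))
                .agg(avg(col("loan_amount")).alias("avg_amount"));

        // Write results back to the data lake for downstream consumers.
        byProduct.write().mode("overwrite")
                .parquet("s3a://data-lake/reports/avg_by_product/");

        spark.stop();
    }
}
```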
Job Responsibilities
- Experience with Core Java/J2EE development
- 3 or more years of experience with the Hadoop Stack is a plus
- 3 or more years of experience with distributed computing frameworks such as Apache Spark and Hadoop, working with large data sets
- Expertise in SQL and ETL batch processing
- Proficient in Unix/Linux (CentOS)
- Worked on full-lifecycle project implementations of applications with Java on Spark (see the illustrative ETL sketch after this list)
- Strong initiative with the ability to identify areas of improvement with little direction
- Team player excited to work in a fast-paced environment
- Agile experience preferred
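As a rough illustration of the SQL and ETL batch processing called out above (a sketch only, assuming Spark SQL with a Hive metastore; the table, path, and column names are hypothetical), an extract-transform-load step in Java might look like the following.

```java
// Illustrative only: a small ETL-style batch step in Java using Spark SQL.
// Table, path, and column names are hypothetical placeholders.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DailyLoanEtl {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("DailyLoanEtl")
                .enableHiveSupport()   // assumes a Hive metastore is available
                .getOrCreate();

        // Extract: raw events from the data lake (path is a placeholder).
        spark.read().json("s3a://data-lake/raw/loan_events/")
                .createOrReplaceTempView("loan_events");

        // Transform: filter and reshape with plain SQL.
        Dataset<Row> cleaned = spark.sql(
                "SELECT loan_id, customer_id, CAST(amount AS DOUBLE) AS amount, event_date " +
                "FROM loan_events WHERE status = 'APPROVED'");

        // Load: append into a partitioned warehouse table.
        cleaned.write()
                .mode("append")
                .partitionBy("event_date")
                .saveAsTable("lending.approved_loans");

        spark.stop();
    }
}
```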
Good to have
- Experience with Elasticsearch and Spark is a plus
- Experience with database and ETL development
- Proficiency with MapR Hadoop distribution components and custom packages is a huge plus
- Proven understanding and related experience with Hadoop, HBase, Hive, Pig, Sqoop, Flume, and/or MapReduce (a brief HBase sketch follows this list)
- Excellent RDBMS (Oracle, SQL Server) knowledge for development using SQL and PL/SQL
- Solid UNIX OS and Shell Scripting skills
- Bachelor's degree in Computer Science, Data Processing, or equivalent
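For the Hadoop-ecosystem items above, a minimal sketch (illustration only) of writing and reading one record with the HBase Java client; the table name, column family, row key, and values are hypothetical.

```java
// Illustrative only: writing and reading one record with the HBase Java
// client. The table name, column family, and row key are hypothetical.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseLoanExample {
    public static void main(String[] args) throws Exception {
        Configuration config = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(config);
             Table table = connection.getTable(TableName.valueOf("loans"))) {

            // Write one cell: row key "loan-123", column family "d", qualifier "amount".
            Put put = new Put(Bytes.toBytes("loan-123"));
            put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("amount"), Bytes.toBytes("2500.00"));
            table.put(put);

            // Read the same cell back.
            Result result = table.get(new Get(Bytes.toBytes("loan-123")));
            String amount = Bytes.toString(
                    result.getValue(Bytes.toBytes("d"), Bytes.toBytes("amount")));
            System.out.println("amount = " + amount);
        }
    }
}
```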