Spark Scala Developer

Nityo Infotech  •  Charlotte, NC and Malvern, PA

Less than 5 years experience  •  Financial Services

$80K - $110K
Posted on 11/03/17 by Akanksha Jakhmola

Skill set needed:

·        3+ years of development with Spark 1.6 and 2.10, specifically remediating code to the newer version

·        Hands-on experience with back-end programming, specifically Scala and Python.

·        Experience working with EMR, Hive, and S3.

·        Knowledge of shell scripting.

·        Ability to write ETL jobs in Spark.

·        Ability to diagnose and resolve performance and memory issues.

·        Good knowledge of database structures, theories, principles, and practices.

·        Hands-on experience with PostgreSQL.

·        Familiarity with data loading tools such as Sqoop.

·        Analytical and problem-solving skills applied to the Big Data domain.

·        Ability to write high-performance, reliable, and maintainable code.

·        Proven understanding of RDS or other columnar databases.

·        Good grasp of multi-threading and concurrency concepts.

·        Knowledge of Atlassian products is preferred.

·        Experience moving legacy data warehouse data into S3 on the AWS Cloud.

·        Experience providing best practices, supervision, and support for enterprise analytics models deployed on an AWS-based data lake.

