Senior Big Data Developer

5 - 7 years experience  •  IT Consulting/Services

Salary depends on experience
Posted on 09/22/17
Seattle, WA


Currently we are looking for a Senior Big Data Developer for our Seattle, WA office to make the team even stronger.

This position will be a key on-site resource for our clients, supporting the design and development of leading-edge solutions for EPAM.

We do not believe in matching against a list of buzzwords - we look for smart people with good general programming skills and experience with Big Data platforms and technologies in the cloud, as we believe that clever developers can learn new technologies quickly and well.


Responsibilities

  • Work with the rest of the team to design, develop & manage end-to-end engineering solutions for business opportunities using existing on-premises or new Cloud-based technology platforms;
  • Tenaciously keep the Big Data infrastructure (Hadoop and peripheral tools) operational across various environments in data centers & the Cloud;
  • Deploy and manage applications built on AWS services such as EMR, Redshift, and DynamoDB;
  • Lift and shift on-premises Big Data applications/tools to the Cloud;
  • Work with the team to build Big Data platform solutions for different business needs;
  • Develop scripts and glue code to integrate multiple software components;
  • Proactively monitor environments and drive troubleshooting and tuning;
  • Demonstrate deep knowledge of Hadoop & Cloud technologies to troubleshoot issues;
  • Evaluate and build different compute frameworks for all tiers of Cloud technologies.


Requirements

  • 5+ years of relevant experience, preferably in e-commerce and Big Data technologies;
  • Strong programming skills in at least one programming language – Scala, Java, or Python;
  • Experience implementing the full life cycle of massive datasets, including ETL, cleaning, data analysis, and deployment in the Cloud;
  • Experience processing large amounts of structured and unstructured data from various sources using technologies such as the Hadoop ecosystem, MapReduce programming, Hive, Spark, and more;
  • Experience with a wide variety of AWS technologies such as S3, EMR, Redshift, Lambda, Aurora, SNS, and EC2;
  • Experience with relational databases such as Postgres and with NoSQL databases like MongoDB or Cassandra;
  • Experience with Linux and hands-on experience with shell scripting;
  • Agile development methodologies, including Scrum, code reviews, and pair programming;
  • Object-oriented design and development;
  • Performance and scalability tuning, algorithms, and computational complexity;
  • Open source libraries and tools such as Spring, Maven, Guava, Apache Commons, Eclipse, Git, Jira, Jenkins;
  • Unit testing;
  • All things Linux (bash scripting, grep, sed, awk, etc.);
  • MS/BS degree in computer science or a related discipline is nice but not essential.