These are the characteristics we strive for in our own work, and we would love to hear from candidates who embody them:
- Desire to work collaboratively with your teammates to come up with the best solution to a problem
- Demonstrated experience and ability to deliver results on multiple projects in a fast-paced, agile environment
- Excellent problem-solving and interpersonal communication skills
- Strong desire to learn and share knowledge with others
The following qualifications and technical skills will position you well for this role:
- MS/BS in Computer Science or a related technical discipline
- Strong programming experience, Scala preferred
- Experience working with Big Data streaming services such as Kinesis, Kafka, etc.
- Experience working with Big Data streaming frameworks such as NiFi, Spark Streaming, Flink, etc.
- Experience working with NoSQL data stores such as HBase, DynamoDB, etc.
- Experience building domain-driven microservices
- Experience provisioning RESTful APIs to enable real-time data consumption
- Experience with performance and scalability tuning
The following skills and experience are also relevant to our environment; they are nice to have but not required:
- Experience in Python or Java
- Experience working with Hadoop and Big Data processing frameworks such as Spark, Hive, etc.
- Experience with SQL and SQL analytic functions
- Experience working in a public cloud environment, particularly AWS
- Familiarity with practices like Continuous Development, Continuous Integration, and Automated Testing
- Familiarity with infrastructure tools such as AWS CloudFormation and automation tools such as Jenkins
- Agile/Scrum Application Development experience
- An interest in artificial intelligence and machine learning