SnapLogic is seeking senior software engineers for our Distributed Data Processing Team in San Mateo, California.
We are seeking a detail-oriented candidate with outstanding technical abilities and accomplishments, a strong passion for technology and software craftsmanship, and a willingness to “go the extra mile” to ensure the highest-quality experience for our customers.
What You’ll Do:
- Contribute to the development of the distributed processing engine that powers the data transformation capabilities of SnapLogic’s data integration offering.
- Quickly debug complex distributed computing issues, identify root causes, and validate fixes with unit and integration tests.
- Cycle between projects in weeks rather than years, working closely with customers to harden an early-stage product as it achieves product-market fit.
- Keep on top of emerging trends and technologies in the field of distributed data processing, including open source products.
- Take a look at our latest interview with one of your future teammates: https://snaplogic.box.com/s/8faxwz9vjvaolj52or7ggc6ozxc0b5i8
What We’re Looking For:
- You have extensive (5+ years of) experience with C++ or Java in multithreaded environments, with a thorough understanding of object-oriented programming and software complexity.
- You have experience building distributed software systems (file systems, messaging systems, databases, integration systems, etc.) for enterprise customers.
- You appreciate the level of code quality required for successful enterprise products and their long-term maintenance, and you always write code with testability in mind.
- You have a demonstrated history of mentoring team members, contributing to agile processes, and leading knowledge transfer sessions.
- You have excellent communication skills, are comfortable working with little supervision, and have a preference for taking the initiative.
- You have a customer-centric view of products and processes.
- Bachelor’s or Master’s degree in Computer Science, Computer Engineering, Electrical Engineering, or a related field.
- Experience developing a data integration framework or product (e.g., Informatica, Talend).
- Knowledge of Hadoop, Storm, or Spark internals.
- Contributions to open source projects like HDFS, Spark, Flink, etc.
- Knowledge of Big Data integration patterns, data lakes and data warehouses.