Be a leader on our small, dedicated team by helping us deliver high-quality, high-performing products. Develop creative solutions to problems, collaborate with awesome teammates, and help us build a world-class analytics platform. We are a global company and appreciate people with global awareness and knowledge (languages other than English are a bonus).
Some tasks a Senior Data Engineer will perform:
- Work closely with the leadership team to establish and maintain the technical and functional direction for the analytics product portfolio
- Design and develop software and data pipelines for managing premium video data collected from various devices
- Implement big data best practices for data warehouse/data lake creation, provisioning, and consumption
- Participate in and lead discussions dealing with architectures, specifications, requirements, testing and design reviews
- Implement your own and others' designs, write code, write and perform unit tests and integrate into our distributed video security data flow system
- Develop new algorithms and software; analyze, review, and re-architect current designs to create new capabilities and to improve performance, efficiency, and sustainability
- Estimate and plan development tasks, improve development processes and tools to meet corporate targets
- Work with other departments to resolve problems and facilitate product development efforts
- Mentor other software engineers
- Share technical knowledge and skills throughout the department
- Proactively suggest changes to the products, processes or internal tools to improve efficiency
- Raise technical risks to engineering management
- Provide tier 3 technical support as needed
- Other duties as assigned
This position reports to a technical manager within the development team.
Requirements:
- 6 or more years of ETL/data engineering work experience
- 3 years of big data development experience
- Design and coding experience in at least one of the following languages: Java, Python, Pig, or Scala (more than one language is a big plus)
- RDBMS development experience (administration experience a bonus)
- Experience developing in a distributed computing environment, such as Hadoop (HDFS, Hive, MapReduce), Kafka, Spark, etc.
- Cloud infrastructure development experience (AWS)
- Continuous integration and source repository management experience (GitHub, Jenkins)
- Excellent communication skills including documentation
- Bachelor's degree in computer science or related field or equivalent experience
Each of the following would be a plus:
- NoSQL databases (e.g., search, columnar, graph)
- Message queues (e.g., JMS, Kafka)
- Containerization and orchestration (e.g., Docker, Kubernetes, Mesos)
- Data visualization tools and software such as Looker, Tableau, BusinessObjects, OBIEE, etc.