The Computer Vision Cloud Tools team at Magic Leap, Inc. is currently in search of a Technical Engineering Lead.
The Engineering Lead will directly manage a team in Sunnyvale and coordinate globally with teams in Europe and Israel. In addition to hands-on technical development, you will be responsible for building, and delivering on, a roadmap for the team in partnership with internal computer vision customers. You will own cloud tools spanning data processing, analysis, storage, and accessibility. You will work on cutting-edge computer vision problems that require unique software and data infrastructure solutions, and the tools you develop will drive key engineering decisions and help guide the work of our Computer Vision and Machine Learning teams.
Qualified candidates need to be self-starters and able to operate in a highly dynamic environment.
- Provide leadership for the CV Cloud Tools team.
- Build large-scale distributed systems for content and data processing.
- Design and implement complex big data systems with a focus on collecting, parsing, cleaning, managing, analyzing, and visualizing large data sets to turn information into insights.
- Maintain high standards of data integrity, quality, and security.
- Develop data pipelines and RESTful services that are distributed, robust and highly performant.
- Work with other data teams to integrate data from different sources into deep learning pipelines.
- Act as a subject matter expert and mentor junior developers.
- 7+ years of professional experience building software products.
- 3+ years of experience leading teams.
- Extensive experience in maintaining high data integrity and quality with relational databases.
- Strong knowledge in REST API design and message queues.
- Strong programming skills in Python and/or C++.
- A proven track record of successful design and implementation of APIs and high-performance service-oriented architectures.
- Solid OOP and software design skills to create software that's extensible, reusable and meets desired architectural objectives.
- Experience integrating with a variety of SQL and NoSQL databases such as MySQL, PostgreSQL, MongoDB, Cassandra and Redis.
- Experience with Docker, including container management and deployment.
- Comfortable with Linux, shell-scripting, and Git.
- Expert in data warehousing solutions and proficient in designing efficient and robust ETL workflows.
- Experience building large-scale data processing systems using MapReduce or frameworks such as Spark and Hive.
- Familiarity with message broker architectures such as RabbitMQ, ZeroMQ, and Kafka.
- Experience processing terabytes or petabytes of data daily.
- Experience deploying and scaling high-traffic services in private and public clouds such as AWS and Google Cloud.