Please note that visa sponsorship is not available for this position.
We are excited to consider remote engineers for multiple opportunities on our Telemetry Data Platform teams! Remote team members will be expected to work out of their home office.
Are you interested in joining engineering teams working on systems that ingest billions of data points per minute, serve billions of web requests per day, and process millions of messages per second at petabyte scale? If so, New Relic's Telemetry Data Platform, the largest multi-tenant observability platform in the world, has multiple mid-level software engineering positions open.
- Large organizational scale (80+ engaged and productive engineering teams)
- Large technical scale (several billion events ingested per minute)
- Employee resource groups to support a diverse workforce
- 17,000+ passionate and paying customers
- 10 weeks of paid parental leave covering both adoption and foster placement
- Work on a product built by engineers for engineers
What You’ll Do
Software Engineers are responsible for the entire software development lifecycle, including deployment, operation, and maintenance. This requires an understanding of the patterns and tradeoffs inherent in building scalable systems, and a belief that observability, operability, and reliability should be built in from the beginning.
- 2+ years of experience building software in a compiled backend language such as Java, Go, or Rust.
- 2+ years of professional experience deploying and shipping software in a production environment.
- Knowledge of the fundamentals required to build and operate highly available software at scale, including data structures, architectural patterns, and distributed systems.
- A desire to work as part of a team that values curiosity, efficiency, and quality, and strives to strike a balance between thoroughness and delivery.
- A willingness to be on-call for the software you build and a genuine desire to learn from mistakes.
- Our architecture is built around Apache Kafka, and every one of our services interacts with Kafka in one way or another. Experience with Kafka or other data pipeline technologies is a plus, but not required.