Job Description:
The Lockheed Martin Analytics Center of Excellence (ACE) is looking for a talented Big Data Engineer to help build out and tune data processing pipelines and scenarios for both integration and interactive analysis of large data sets in distributed systems.
The ideal candidate will be equal parts technologist and evangelist: capable of communicating the innovation and value proposition of big data processing, storage, and analysis to a variety of IT and business stakeholders, and of building out the solutions that make the art of the possible a reality.
In this role, you’ll be involved throughout the lifecycle of a big data/analytics project: engaging with end users to understand the desired goals and outcomes of the solution, helping to determine the optimal technical approach from data sourcing to end-user consumption, and helping to educate internal and external stakeholders on how to access the data and incorporate it into their business rhythms.
You can expect to support the ACE in the construction of technical prototypes, demos, and internal analytics community education, as well as supporting a variety of Lockheed Martin business areas and functions to help advance their big data initiatives.
The position requires self-motivated individuals with a hands-on, entrepreneurial spirit and exceptional drive, and the ability to work in a fast-paced, multi-disciplinary environment.
Limited travel (25% or less) is required for this role. Work location at a major LM EBS facility is preferred, but virtual applicants will be considered.
US Citizenship Required
- Experience transferring data from various sources into Hadoop using open source tools (Sqoop, Flume) or commercial ETL tools (Informatica, SAP Data Services)
- Experience designing and implementing optimal data storage options (such as file formats and compression options)
- Experience applying processing logic in Hadoop and/or Spark (MapReduce and abstracted alternatives)
- Experience exposing data to external consumers using open source SQL-like tools (e.g., Hive and Impala)
- Experience developing Big Data solutions leveraging Java, Scala, Python, or R
- Experience with various methods for job orchestration (such as Oozie) and monitoring in a big data environment
- A demonstrated understanding of business fundamentals and how big data analytics can create value for business stakeholders
- Experience installing, configuring, and tuning big data environments. Experience with Cloudera’s distribution is preferred
- Experience with open source message brokers (e.g., Kafka) for real-time data feeds
- Experience with open source search platforms (e.g., Solr) for real-time indexing and search capabilities
- Experience with open source non-relational distributed databases (HBase, MongoDB, Neo4j, and Cassandra) for storage and retrieval
- Experience with architecting solutions using AWS service offerings, including S3 and EMR
- Ability to translate technical results to business centric, actionable insights
- Excellent written, verbal, and presentation skills