This role will also be responsible for designing integrated solutions for real-time streaming sources such as social media, OTT systems, and IoT devices. This person will provide technical leadership and mentor others on the team. This role is expected to have experience designing and delivering scalable solutions using Hadoop-ecosystem technologies (Spark, Kafka, Flume, Oozie, HBase) on cloud (AWS) infrastructure.
Design and develop highly scalable data pipelines that incorporate complex transformations and efficient code. Data will need to flow to and from unstructured and relational systems for analytic processing.
You will surface these datasets in real time to mission-critical products and business applications throughout the company. You will be among the earliest adopters of bleeding-edge data technologies and features, working directly with infrastructure teams to integrate them into your services at scale.
Responsible for building tools and infrastructure to gather metadata from our service components and Hadoop clusters, then store, transform, and analyze this data to meet important customer and internal needs.
The Big Data Solution Architect is responsible for mapping business requirements to systems and technical requirements for the overall design and implementation of information management solutions that address specific business problems and achieve expected business benefits.
The Enterprise Data Architect will be responsible for the architectural oversight of Lam Research's Enterprise Data Warehousing, Big Data, and Predictive Analytics strategy across its transactional, operational, and analytical applications.