Big Data Hadoop Architect
11 - 15 years experience • IT Consulting/Services
Major Consulting company
Green Card holders or US Citizens ONLY
**Do not apply unless you are a Green Card holder or a US Citizen**
• Bachelor's degree
• 10+ years' experience, including implementation of more than 10 data projects across the technologies listed below
• 5+ years' end-to-end experience in Big Data projects, with at least 2 full project implementations (PoC experience alone does not count)
• Experience architecting end-to-end hybrid data ecosystems. Should have experience with traditional ETL tools, RDBMS, MPP, and BI tools, and a clear view of how these integrate with Big Data environments
• Strong Programming Skills in Java / Scala / Python
• Hands-on development and design expertise in Big Data/Hadoop: Spark, Storm, Spark Streaming, Kafka, Flume
• Experience ingesting data from multiple sources, including mainframes, Teradata, and unstructured sources, using batch, messaging, and streaming patterns that support extremely large volumes
• Experience designing and developing solutions for Data Quality, Metadata Management, Information Lifecycle Management, Data Reconciliation, Data Wrangling, and User Consumption Management
• Expertise in one of the commercially available Hadoop distributions (CDH, MapR, or Hortonworks), with a deep understanding of the internals: data security, integration, and performance tuning
• Implementation experience with NoSQL databases: HBase, Cassandra, or MongoDB
• Experience integrating Analytics Platforms such as Python / SAS / R with Big Data Platforms
• Conducting JAD sessions with users over PoCs/PoTs
• Designing and building systems on NoSQL and data streaming platforms, with a thorough understanding of NoSQL and Hadoop physical platform selection
• Exposure/Implementation of Data initiatives in Cloud (AWS, Azure, GCP)
• Experience in data modeling (3NF, dimensional modeling, etc.), building reporting semantic layers, and data visualization
• Text Analytics implementations
• NLP, AI, and Machine Learning use case implementations
• Full-text search implementations with Solr, Elasticsearch, etc.
• Modeling experience in NoSQL database technologies such as Cassandra, HBase, MongoDB, etc.
Hands-on Big Data/Hadoop Architect with experience implementing Data Lakes, data ingestion/consumption frameworks, metadata-driven development, and BI reporting and visualization using next-gen tools on Big Data/Hadoop.
The role supports Data Transformation programs from presales through architecture and design to implementation.
The candidate should have strong design/architecture knowledge of data processing frameworks (e.g., Spark, Storm), messaging frameworks (e.g., Kafka, Flume), and NoSQL stores. The candidate is expected to work with CxOs and Application Directors (C-1) to assess platform needs, build technical capability, and advise/direct the developers enabling the solutions.
The candidate should be able to converse with senior architects and senior functional SMEs on solving real-world business problems with detailed technical and functional solutions.
• Working with CxOs, C-1s, and Application Directors to understand use cases and business capabilities (excellent presentation and whiteboarding skills required)
• Building the technical architecture, design deliberations, product selection, and execution roadmap for Data Transformation programs
• Leading the definition of technical solutions for client requests such as RFPs, RFQs, and proposals; coming up with next-generation transformative ideas and proactively taking them to the client
• Working with internal, third-party, and client teams to understand the problem statement and drive toward a solution