Big Data/Hadoop Architect
Regular Full-time, Permanent
Open to US Citizens and Green Card holders only
- Perform architecture design, data modeling, and implementation of the Big Data platform and analytic applications
- Analyze the latest Big Data analytic technologies and their innovative applications in both business intelligence analysis and new service offerings
- Stand up and expand data-as-a-service collaborations with partners in the US and other international markets
- Apply deep learning capabilities to improve understanding of user behavior and data
- Develop highly scalable and extensible Big Data platforms which enable collection, storage, modeling, and analysis of massive data sets
- Over 8 years of engineering and/or software development experience
- Hands-on experience in Apache Big Data Components/Frameworks
- Deep technical expertise in Spark, Hive, Impala, Kudu
- Over 3 years' experience with Python (PySpark) and Scala
- Strong DevOps skills and the ability to guide the client in the physical deployment of clusters; must be an expert in Maven, GitHub, and Jenkins
- Strong data modeling skills; the candidate must have end-to-end experience in at least two Big Data data warehouse projects
- Expertise in real-time data streaming using Kafka and Spark Streaming; able to deploy and monitor Kafka clusters
- Strong expertise in developing data-as-a-service platforms
- Significant experience consuming data from third-party data APIs
- 3+ years of experience developing datasets for Tableau developers and data scientists
- Experience in architecture and implementation of large and highly complex projects
- Deep understanding of cloud computing infrastructure and platforms
- History of working successfully with cross-functional engineering teams
- Demonstrated ability to communicate highly technical concepts in business terms and articulate the business value of adopting Big Data technologies
- Bachelor's Degree