Big Data Solution Architect
We have an urgent job opening for the position of Big Data Solution Architect. Please take a look at the job description below and send me a copy of your updated resume, along with the best time and contact details to reach you.
Title: Big Data Solution Architect
Essential skills, knowledge & experience:
This includes the ability to understand and communicate the following at an executive level, both internally and externally:
- Hands-on experience designing and implementing distributed architecture systems at terabyte/petabyte scale using open-source software
- Experience with the full Hadoop solution life cycle: requirements analysis, platform selection, future-state application and enterprise architecture design, testing, and solution deployment
- Expert knowledge in modern distributed architectures and compute / data analytics / storage technologies on AWS
- Knowledge of programming and scripting languages such as Java, Python, Perl, Ruby, and Linux shell
- Understanding of architectural principles and design patterns using frameworks such as Hadoop / Spark and/or AWS EMR
- Knowledge of SQL (MS SQL, PostgreSQL, MySQL) and NoSQL databases (HBase, DynamoDB, Cassandra)
- Knowledge of technical solutions, design patterns, and code for applications in Hadoop.
- Experience architecting and building data warehouse and BI systems, including ETL (Informatica, Talend)
- Experience in performance troubleshooting, SQL optimization, and benchmarking.
- Software Development Lifecycle (SDLC) experience
- AWS Architecture / Azure Architecture experience, ideally with the appropriate vendor certification
- Understanding of hybrid cloud solutions and experience integrating public cloud into traditional hosting/delivery models
- Experience as principal technical lead on at least one major project.
Desirable skills, knowledge & experience:
- AWS or Azure trained / certified architect – e.g. AWS Certified Solutions Architect – Associate or Professional level
- AWS Redshift experience
- Oozie, Flume, ZooKeeper, Sqoop and/or R
- Hands-on experience designing, developing, and maintaining software solutions in a Hadoop production cluster
- Experience implementing architectural governance and proactively managing issues and risks throughout the delivery lifecycle
- Good familiarity with the disciplines of enterprise software development, such as configuration & release management, source code & version control, and operational considerations such as monitoring and instrumentation
- Experience in consulting or service provider roles (internal or external)
- Strong knowledge of software development methodologies such as Agile/Scrum and Kanban; broad understanding of the enterprise project lifecycle
- AWS certification in any of the following: Solutions Architect, Developer, or SysOps Administrator
- A good degree-level education is highly desirable
- Team lead and programme or project management experience
- Experience using database technologies such as Oracle and MySQL is required; understanding of NoSQL databases such as MongoDB is preferred
- ISO27001/2 certification
- Redshift / Data warehouse experience
- Extensive automation experience with either Chef or Puppet
- Experience designing or implementing data warehouse solutions is highly preferred
- DevOps experience