Your key responsibilities
We are currently seeking a Manager to lead Big Data projects. Managers are typically responsible for managing standalone projects or elements of multiple client engagements. Their duties include service delivery, identifying future sales opportunities, and supporting practice development activities in the Digital Transformation practice. This professional will provide leadership to employees, manage and motivate teams with diverse skills and backgrounds, consistently deliver quality client services by monitoring progress, and demonstrate in-depth technical capabilities and professional knowledge.
Skills and attributes for success
- A Manager is expected to maintain long-term client relationships, build networks, and cultivate business development opportunities.
- Provide high quality client services by directing daily progress of engagement work, informing engagement manager of engagement status, and managing staff performance.
- Expected to help recruit, develop and retain strategy professionals, coach and mentor team members, deliver strategy-related training and take on other practice management tasks as needed (e.g., development of practice offerings).
To qualify for the role you must have
- A Bachelor's degree in Computer Science, Engineering, Information Systems Management, Accounting, Finance or a related field.
- A minimum of 5 years of post-bachelor's work experience.
- Must have experience assisting clients with strategic Big Data initiatives, preferably focused on financial services.
- Must have presentation skills, including the ability to create PowerPoint decks that communicate solution architecture to various stakeholders.
- Hands-on experience architecting Big Data applications using Hadoop-ecosystem technologies such as Spark, MapReduce, YARN, HDFS, Hive, Impala, Oozie, HBase, Elasticsearch, and Cassandra.
- Experience working with Business Intelligence teams, Data Integration developers, Data Scientists, Analysts, and DBAs to deliver a well-architected and scalable Big Data & Analytics ecosystem.
- Experience using Neo4j or an understanding of graph databases.
- Strong experience with event stream processing technologies such as Spark Streaming, Storm, Akka, and Kafka.
- Experience with at least one programming language (Java, Scala, Python).
- Extensive experience with at least one major Hadoop platform (Cloudera, Hortonworks, MapR).
- A proven track record of architecting distributed solutions that handle very high data volumes (petabytes).
Ideally, you'll also have
- Strong troubleshooting and performance tuning skills.
- Experience with SQL and scripting languages (such as Python, R).
- Deep understanding of cloud computing infrastructure and platforms.
- A good understanding of Big Data design patterns.