Within FSO IT Advisory, the Enterprise Architecture competency works collaboratively with clients to enable disruptive innovation through transformational design solutions. The team supports our clients in modernizing systems, processes, and products to enhance business and customer-facing platforms and services, supporting high performance in a rapidly changing and competitive environment. The team leverages microservices, cloud, omni-channel integration, API management, and open-source technologies to enable end-to-end automation with safe and sound architecture at optimal cost.
Your key responsibilities
We are currently seeking a Senior to help lead Big Data projects. Seniors are typically responsible for managing standalone projects or elements of multiple client engagements. Their duties include service delivery, identifying future sales opportunities, and supporting practice development activities in the Digital Transformation practice. This professional will provide leadership to employees, manage and motivate teams with diverse skills and backgrounds, consistently deliver quality client services by monitoring progress, and demonstrate in-depth technical capabilities and professional knowledge.
Skills and attributes for success
- A Senior is expected to maintain long-term client relationships and networks, and to cultivate business development opportunities.
- Provide high-quality client services by directing the daily progress of engagement work, keeping the engagement manager informed of engagement status, and managing staff performance.
To qualify for the role you must have
- A Bachelor's degree in Computer Science, Engineering, Information Systems Management, Accounting, Finance or a related field.
- A minimum of 3 years of post-bachelor's work experience
- Must have experience assisting clients with strategic Big Data initiatives, preferably focused on financial services.
- Must have presentation skills: the ability to create PowerPoint decks that communicate solution architecture to various stakeholders.
- Hands-on experience architecting Big Data applications using Hadoop and related technologies such as Spark, MapReduce, YARN, HDFS, Hive, Impala, Oozie, HBase, Elasticsearch, and Cassandra.
- Experience working with Business Intelligence teams, Data Integration developers, Data Scientists, Analysts, and DBAs to deliver a well-architected and scalable Big Data & Analytics ecosystem
- Experience using Neo4j, or an understanding of graph databases
- Strong experience with event stream processing technologies such as Spark Streaming, Storm, Akka, and Kafka
- Experience with at least one programming language (Java, Scala, Python)
- Extensive experience with at least one major Hadoop platform (Cloudera, Hortonworks, MapR)
- A proven track record of architecting distributed solutions handling very high volumes of data (petabytes)
Ideally, you'll also have
- Strong troubleshooting and performance tuning skills.
- Experience with SQL and scripting languages (such as Python, R)
- Deep understanding of cloud computing infrastructure and platforms.
- A good understanding of Big Data design patterns