Director, Big Data Software Engineer in McLean, VA

KPMG

McLean, VA 22101

Industry: Finance & Insurance

11 - 15 years

Posted 55 days ago

Innovate. Collaborate. Shine. Lighthouse – KPMG's Center of Excellence for Advanced Analytics – combines applied data science, AI, and big data architecture capabilities. Here, you'll work with a diverse team of sophisticated data and analytics professionals to explore solutions for clients in a platform-diverse environment. This means your ability to find answers is limited only by your creativity in leveraging a vast array of techniques and tools. Be a part of a high-energy, diverse, fast-paced, and innovative culture that delivers with the agility of a tech startup and the backing of a leading global consulting firm. For you, that translates into the chance to work on a wide range of projects – covering technologies and solutions from AI to optimization – and the power to have a real impact in the business world. So, bring your creativity and pioneering spirit to KPMG Lighthouse.

Responsibilities:

  • Lead a technical team to rapidly architect, design, prototype, implement, and optimize architectures to tackle the Big Data and Data Science needs of a variety of Fortune 1000 corporations and other major organizations; Design, maintain, and oversee the operational process to develop a modular code base that solves real-world problems; Conduct regular peer code reviews to ensure code quality and compliance with industry best practices
  • Work in cross-disciplinary teams with KPMG industry professionals to understand client needs and ingest rich data sources (social media, news, internal/external documents, emails, financial data and operational data)
  • Research, experiment with, and utilize leading Big Data methodologies (Hadoop, Spark, Kafka, Netezza, SAP HANA, and AWS) with cloud, on-premises, and hybrid hosting solutions; Oversee a technical team to provide proficient documentation and operating guidance for users of all levels
  • Lead a technical team to architect, implement, and test data processing pipelines, and data mining/data science algorithms on a variety of hosted settings (AWS, Azure, client technology stacks, and KPMG's own clusters)
  • Translate advanced business analytics problems into technical approaches that yield actionable recommendations across multiple, diverse domains; Communicate results and educate others through design and build of insightful visualizations, reports, and presentations; Develop skills in business requirement capture and translation, hypothesis-driven consulting, work stream and project management and client relationship development
  • Help drive the process for pursuing innovations, target solutions, and extendable platforms for Lighthouse, KPMG, and clients; Participate in developing and presenting thought leadership, and assist in ensuring that the Lighthouse technology stack incorporates and is optimized for specific technologies; Promote the KPMG brand in the broader data analytics community

Qualifications:

  • Minimum of ten years of big data experience with multiple programming languages and technologies, three years as a lead and three years at a management level or PhD with minimum five years of big data experience and familiarity with the end-to-end sales process
  • Bachelor's degree or Master's degree from an accredited college/university in Computer Science, Computer Engineering, or related field
  • Ability to manage complex engagements and interface with senior-level management internally as well as with clients; Ability to communicate complex technical concepts succinctly to non-technical colleagues and to understand and manage interdependencies between all facets of a project; Ability to lead client presentations; Must have demonstrated advanced proficiency in complex, mature, and sophisticated D&A technologies and solutions; Ability to mentor others and publish whitepapers or articles on complex D&A technologies or solutions
  • Market-leading proficiency with multiple large scale and/or distributed processing methodologies (Hadoop, Storm, Spark); Skilled ability to rapidly ingest, transform, engineer, and visualize data, both for ad hoc and product-level (e.g., automated) data & analytics solutions
  • Market-leading fluency in several programming languages (Python, Scala, or Java), with the ability to pick up new languages and technologies quickly; Understanding of cloud and distributed systems principles (such as load balancing, networks, scaling, and in-memory vs. disk storage); Experience with large-scale big data methods (MapReduce, Hadoop, Spark, Hive, Impala, or Storm); Ability to work efficiently in a Unix/Linux or .NET environment, with experience using source code management systems such as Git and SVN; Strong knowledge of programming methodologies (version control, testing, QA) and development methodologies (Waterfall and Agile)
  • Experience with object-oriented design, coding, and testing patterns, as well as experience engineering software platforms (commercial or open source) and large-scale data infrastructures; Familiarity with different architecture patterns such as event-driven, SOA, microservices, functional programming, and Lambda; Capability to architect highly scalable distributed systems using different open source tools
  • Ability to travel up to eighty percent of the time; Applicants must be currently authorized to work in the United States without the need for visa sponsorship now or in the future

Valid Through: 2019-11-12