Associate, AI Engineer in Chicago, IL

KPMG • Chicago, IL 60601

Industry: Finance & Insurance • Experience: Less than 5 years

Posted 56 days ago

Innovate. Collaborate. Shine. Lighthouse – KPMG's Center of Excellence for Advanced Analytics – combines applied data science, AI, and big data architecture capabilities. Here, you'll work with a diverse team of sophisticated data and analytics professionals to explore solutions for clients in a platform-diverse environment. This means your ability to find answers is limited only by your creativity in leveraging a vast array of techniques and tools. Be a part of a high-energy, diverse, fast-paced, and innovative culture that delivers with the agility of a tech startup and the backing of a leading global consulting firm. For you, that translates into the chance to work on a wide range of projects – covering technologies and solutions from AI to optimization – and the power to have a real impact in the business world. So, bring your creativity and pioneering spirit to KPMG Lighthouse.

Responsibilities:

  • Work in multi-disciplinary and cross-functional teams to translate business requirements into artificial intelligence goals and solution architecture; Rapidly iterate models and results to refine and validate the approach across deployment options, including KPMG-hosted, client-hosted, laptop, cloud, and container environments
  • Work in a fast-paced and dynamic environment with both virtual and face-to-face interactions, utilizing structured approaches to solving problems, managing risks, and documenting assumptions; communicate results and educate others through insightful visualizations, reports, and presentations; Build ingestion processes to prepare, extract, and annotate a rich variety of unstructured data sources (social media, news, internal/external documents, images, video, voice, emails, financial data, and operational data)
  • Design, develop, and maintain artificial intelligence-enabled managed services (APIs) with a team of Data Scientists, Software Engineers, and Project Managers; Architect, implement, and test data processing pipelines (e.g., Hadoop and Spark) and data mining/data science algorithms in a variety of hosted settings (AWS, Azure, GCP, or KPMG's own clusters)
  • Develop automated reporting for API and system health (process, memory, response time) utilizing leading processes for software development and analytics
  • Translate advanced technical architectures into production systems and contribute to the continual maintenance and testing of processes, APIs, and associated user interfaces; Build continuous integration and automated deployment environments; Develop containers (Docker) to ensure that APIs and processing pipelines can be easily deployed across a variety of hardware and software architectures

Qualifications:

  • Minimum two years of prior experience working in teams of data & analytics professionals to deliver business-driven analytics projects using big data methods across multiple programming languages and technologies; Direct experience with, or a close working relationship with, DevOps engineering; Multidisciplinary backgrounds preferred
  • Bachelor's or Master's degree from an accredited college/university in Computer Science, Computer Engineering, or a related field, with a good understanding of object-oriented design and design patterns
  • Ability to work with local and international teams to understand available resources and constraints around data, architecture, platforms, tools, processes, and security; Provide support and resolve problems using excellent problem-solving skills and verbal/written communication
  • Understanding of cloud and distributed systems principles, including load balancing, networking, scaling, and in-memory vs. disk storage; Experience with large-scale big data methods, such as MapReduce, Hadoop, Spark, Hive, Impala, or Storm; Familiarity with agile software development practices and testing strategies, and solid unit testing skills
  • Fluency in several programming languages (Python, Scala, or Java), with the ability to pick up new languages and technologies quickly; Ability to work efficiently in a Unix/Linux environment, with experience using source code management systems like Git; Experience with cloud computing and virtualization, persistence technologies (both relational and NoSQL), and multi-layered distributed applications
  • Ability to travel up to eighty percent of the time; Applicants must be currently authorized to work in the United States without the need for visa sponsorship now or in the future

Valid Through: 2019-11-12