Slalom is proud to be a Premier AWS Partner, Microsoft Gold Partner, Google Premier Partner, 5x Tableau Partner of the Year, and 2018 Snowflake Partner of the Year.
The Data & Analytics practice at Slalom is a full-service data practice with competencies across information strategy, modern data architecture, data visualization, and data science. We are seeking a Cloud Data Engineer to join our local Seattle consulting team: someone passionate about developing innovative solutions that help organizations drive strategic business outcomes and enable data-driven insights. Slalom Cloud Data Engineers are cloud data platform engineering specialists responsible for client delivery, complex solutioning, and knowledge management.
- Work as part of a team to design and develop cloud data solutions at local Seattle clients
- Deliver on the technical scope of projects & demonstrate thought leadership at clients as well as internally at Slalom
- Gather technical requirements, assess client capabilities and analyze findings to provide appropriate cloud solution recommendations and adoption strategy
- Research, analyze, recommend and select technical approaches for solving challenging and complex development and integration problems
- Assist in designing multi-phased cloud data strategies and implementation roadmaps
- Analyze, architect, design, and actively develop cloud data warehouses, data lakes, and other cloud-based data solutions
- Design and develop scalable data ingestion frameworks to transform a variety of datasets
- Serve as a subject matter expert in a cloud platform for the larger Slalom practice and contribute back to the community
- 4+ years of data engineering and/or data warehousing experience
- 2+ years of deep experience building cloud data solutions (Azure, AWS, GCP, Snowflake)
- Experience migrating from an on-premises data platform to a cloud data platform
- Deep experience designing and deploying end-to-end solutions with a cloud platform's analytic services, including storage, permissions, private cloud, database services, virtual machines, and parallel processing technologies
- Experience with big data application development and/or cloud data warehousing (e.g., Hadoop, Spark, Redshift, Snowflake, Azure SQL DW, BigQuery)
- Working knowledge of agile development, including DevOps concepts
- Experience with cloud SDKs and programmatic access services
- Proficient in a programming language relevant to the cloud platform (e.g., Python, Java, C#, or Unix shell scripting)
- Proficient in SQL
- Working experience with version control platforms (e.g., Git)
- Strong communication skills
- One or more cloud certifications
- Consulting experience
- Expert programming skills in Python and a software development background
- Experience writing "infrastructure as code" deployments (e.g., ARM templates, CloudFormation, Terraform)
- Understanding of cloud strategies and best practices, extending into cloud networking, cloud security, encryption, private cloud configuration, and overall cloud governance approaches
- Strong background in data warehousing concepts, ETL development, data modeling, metadata management, and data quality