$100K – $150K
As a Data Engineer for the American Red Cross, you will be part of a Data Management team that is modernizing and transforming our data and reporting capabilities across multiple verticals, including Biomedical Services, by implementing a modernized data architecture.
The Data Engineer will design and develop highly scalable and extensible data platforms that enable the collection, storage, distribution, modeling, and analysis of large data sets from numerous channels. This position requires an innovative software engineer who is passionate about data and data quality. The ideal candidate will possess strong data management and API integration experience and the ability to develop scalable data pipelines that make data management and analytics/reporting faster, more insightful, and more efficient.
• Develop, test, document and maintain scalable data pipelines.
• Build out new data integrations including APIs to support continuing increases in data volume and complexity.
• Establish and follow data governance processes and guidelines to ensure data availability, usability, consistency, integrity, and security.
• Build and implement scalable solutions that align to our data governance standards and architectural road map for data integrations, data storage, reporting, and analytic solutions.
• Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
• Design and develop data integrations and a data quality framework. Write unit/integration/functional tests and document work.
• Design, implement, and automate deployment of our distributed system for collecting and processing streaming events from multiple sources.
• Perform the data analysis required to troubleshoot and help resolve data-related issues.
• Serve as tech lead when needed.
• Education: 4-year college degree or equivalent combination of education and experience. Prefer academic backgrounds in Computer Science, Mathematics, Statistics, or a related technical field.
• 7–10 years of relevant work experience in analytics, data engineering, business intelligence, or a related field.
• Experience with or knowledge of Agile software development methodologies.
• Experience writing and optimizing SQL queries in a business environment with large-scale, complex datasets.
• Experience developing integrations across multiple systems and APIs.
• Experience creating ETL and/or ELT jobs.
• Excellent problem solving and troubleshooting skills.
• Process-oriented with strong documentation skills.
• Proficient with coding in Python.
• Experience with AWS technologies (e.g., Redshift, RDS, S3, EMR, EC2, Kinesis) is a plus.
• Experience with data warehouse technologies is a plus.
• Experience designing data schemas and operating SQL/NoSQL database systems is a plus.
• Experience with Big Data tools like Spark, Hadoop, Kafka, etc. is a plus.
Valid through: 7/16/2020