Essential Job Functions:
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using SQL and cloud 'big data' technologies.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep SFFCU and Member data secure and compliant with NCUA and other regulatory policies.
- Create data tools for analytics and data science team members that help them build and optimize our analytics capabilities, positioning us as an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data ecosystems.
- Apply strong analytical, problem-solving, and conceptual skills.
- Whenever possible, provide opportunities to serve and teach others.
- Be accountable for staying up to date with data storage technologies in a collaborative environment.
- All team members must comply with regulatory compliance and assigned training requirements, including but not limited to BSA regulations corresponding to their specific job duties. Failure to do so may result in disciplinary and other employment-related actions.
Knowledge, Education, and Experience:
- Bachelor's degree in Computer Science or MIS, MCDBA certification, or equivalent experience
- 6-10 years' experience in data engineering
- 6-10 years' experience with ETL/ELT and data integration tools
- Working knowledge of cloud technologies
- Working knowledge of BI & data visualization tools