About the role:
We are looking for an experienced, high-performing individual to join our Data and Analytics Delivery team as a Senior ETL Application Designer. This senior technical leadership role is responsible for the design, build, deployment, and support of data integration, BI reporting and analytics solutions, and application integration solutions using Informatica, Python, AWS Glue, and supporting tools and services. The DBIS delivery team is involved in exciting projects to move data into our Enterprise Data Lake using leading-edge technologies, leveraging our AWS Cloud-based data platform for advanced analytics and data science.
What will you do?
- Provide leadership on the construction and maintenance of robust and efficient data applications and reusable frameworks
- Mentor and guide other data developers across various locations to ensure all code follows applicable standards and is efficient and easily maintainable
- Translate detailed requirements into technical specifications for development
- Provide high level solution options and estimates for project proposals, and detailed work estimates in support of assigned work
- Deliver solutions according to Systems Development Life Cycle (SDLC) methodology for either waterfall or agile projects
- Provide consultation for the evaluation of data and software systems
- Develop and manage effective working relationships with other departments, groups, or personnel with whom work must be coordinated
What do you need to succeed?
- Understanding of and hands-on expertise with the AWS cloud computing environment, including experience building ETL with AWS Glue (Python), Step Functions, and Lambda
- At least 2 years of experience with Python script development using PySpark and Python libraries such as NumPy and pandas
- Experience with creating complex data frames/structures in Hadoop for data integration and complex calculations
- Extensive experience in developing solutions for data warehouse loads and system integrations using ETL tools such as Informatica PowerCenter
- At least 3 years of experience with Big Data, including knowledge of MapReduce, HDFS, Tez, Hive, Pig, and Spark
- Experience with production implementation change management processes
- Experience with project management and the software development life cycle (SDLC), both waterfall and agile (5+ years)
- Excellent documentation skills, including technical writing, Visio, PowerPoint, and flowcharting (5+ years)
- Strong communication and analytical skills, including conceptual, requirements interpretation, solution creation and problem-solving abilities
- Excellent collaboration and leadership skills, with a proven ability to adapt to challenges, coach, and mentor
What's in it for you?
- Competitive salary and bonus structure influenced by market range data
- Pension, stock and savings programs to help build and enhance your future financial security
- A common sense dress code, where you dictate how you dress based on your day
- An environment of continuous learning and improvement
- The opportunity to move along a variety of career paths with amazing networking potential