The Marketing Analytic Systems data pipeline delivers value to various business partners by providing tools and capabilities to analyze large data sets for research, analytics, campaign management, reporting, and tactical/strategic decision making. Our systems development team is looking for a Developer with a passion for working with data and building solutions that support our Analytic Systems solution stack, which includes Google Cloud Platform (GCP), SAS (Linux environment), and Tableau Server/Desktop.
The successful candidate will have extensive experience as a data engineer or ETL developer building and automating data transformation and loading procedures. Strong knowledge of and experience using BigQuery, SQL, Cloud Composer, Apache Airflow, and SAS to conduct data profiling/discovery, data modelling, and process automation is required. The candidate must be comfortable working with data from multiple sources: Hadoop, DB2, Oracle, and flat files. The projects are detail intensive, requiring the accurate capture and translation of data requirements (both tactical and analytical needs) and validation of the working solution. We work in a highly collaborative environment, partnering closely with cross-functional team members: Business Analysts, Product Managers, Data Analysts, and Report Developers.
- Design, develop, and implement end-to-end solutions on Google Cloud Platform (BigQuery, Cloud Composer, Apache Airflow); translate business requirements into technical design plans.
- Automate, deploy, and support solutions scheduled via cron or Control-M. Deployment includes proper error handling, dependency controls, and necessary alerts. Triage and resolve production issues and identify preventive controls.
- Build rapid prototypes or proof of concepts for project feasibility.
- Document technical design specifications explaining how business and functional requirements are met. Document operations run book procedures with each solution deployment.
- Identify and propose improvements for analytics eco-system solution design and architecture.
- Participate in product support activities such as patches and release upgrades. Provide validation support for Google Cloud and SAS products, including any changes to other infrastructure, systems, or processes that impact the Analytics infrastructure.
- Participate in full SDLC framework using Agile/Lean methodology.
- Support non-production environments with the Operations and IT teams.
- Consistently demonstrate regular, dependable attendance and punctuality.
- Manage offshore resources' assignments and tasks.
- Perform other duties as assigned.
Qualifications and Competencies:
- Bachelor's degree in Computer Science/Engineering, Analytics, or Statistics, or equivalent work experience.
- 6+ years of work experience in Data Engineering, ETL Development and Data Analytics.
- 6+ years of hands-on experience using SQL and a scripting language such as Unix shell or Python.
- 5+ years of hands-on experience developing on a Linux platform.
- 4+ years of hands-on experience working with traditional RDBMSs such as Oracle and DB2.
- 3+ years of hands-on experience working with HDFS, Tez, MapReduce, and Sqoop.
- 1+ years of hands-on experience working with cloud technologies.
- 1+ years of hands-on experience with a scripting language such as Python, or with SAS (Base SAS, SAS Macro, and SAS/STAT).
- Excellent written and verbal communication and presentation skills.
- Ability to collaborate with internal and cross-functional teams.
- Must be able to work independently with minimal supervision; strong strategic thinking and organizational planning skills.
- Experience with Spark, PySpark, Zeppelin, and Jupyter Notebook is preferred.
- Demonstrated experience implementing and automating ETL processes on large data sets.
- Experience with report development and supporting data requirements for reporting.
- Knowledge of Hadoop/Big Data architecture and operational workings is preferred.
- Ability to multi-task and meet deadlines.
- Ability to work with diverse teams and multiple technologies.