$80K — $100K *
Builds, enhances, and maintains an Enterprise Data Warehouse and Data Lake that serves the reporting and analytical needs of VNSNY as well as its IT consumers. Participates in the Systems Delivery Life Cycle for Data Integration projects, including preparing detailed technical design documents from functional requirement documents, designing and developing the Extract, Transform, and Load (ETL) framework, establishing DB application schemas and developing database code, participating in unit and system testing of end-to-end integration, and performing release management and production support as needed.
Education: Bachelor’s degree in Computer Science, Information Systems, or a related field required.
Experience: Minimum of three years of experience with a Bachelor’s degree, or one year of experience with a Master’s degree in one of the above majors, in Data Integration projects working on ETL (or ELT) frameworks for Data Warehouse projects, preferably using Informatica 10.x, required. Experience with performance tuning techniques for ETL code as well as the underlying databases required. Experience with database development for data warehouse projects, including writing SQL queries and PL/SQL functions, procedures, and packages on both leading open-source and commercial database platforms, required. Familiarity with NoSQL databases and general Big Data concepts and technologies preferred. Experience with RESTful Web Services required. Experience with Unix/Linux shell and Python scripting required. Experience with advanced scheduling tools such as Control-M, AutoSys, or Tivoli required. Working knowledge of data modeling, and an understanding of the logical, conceptual, and physical modeling aspects of data warehouse projects, required. Familiarity with cloud-related technologies (Snowflake, RDS, S3, Lambda, EC2, etc.) as they relate to data integration required. Working knowledge of an analytics platform/visualization tool such as Oracle OBIEE, MicroStrategy, or Tableau required.
Valid through: 12/11/2020