There are two positions available on our team of ETL and BI Developers at the Kansas City National Security Campus. The team is responsible for working with internal business customers to identify business requirements, developing and administering big data ingestion processes on our Hortonworks Hadoop platform, and supporting/interfacing with our Legacy ETL (Talend), Data Virtualization (Denodo), and Enterprise Analytics (MicroStrategy) platforms. Both positions will focus on our growing ETL and Big Data ingestion needs, with opportunities to work on MicroStrategy.
This position can be located anywhere in the Kansas City Metro area.
Summary of Duties:
- Translate business requirements and design, develop, document, test, monitor, troubleshoot, and administer our data environment using our ETL/Data Virtualization/BI Reporting/Big Data Management tool sets
- Participate in maintenance and upgrades for our platforms and tool sets
- Contribute to the support and resolution of incidents across all data platforms, including a weekly on-call rotation
- Maintain knowledge of new technologies and methods in your area of expertise and hold knowledge-sharing sessions with the team
- Communicate and accurately document all architectural decisions, plans, goals, functional requirements, and open items and issues to key stakeholders, including management, development teams, and business areas
- Build and maintain strong peer relationships within the team and across the organization
- Participate in short- and long-term planning sessions with customers to improve business processes, and work closely with the Technical Review Board and architects to ensure all systems are in line with IT long-term strategy
- Proactively analyze existing information systems and applications to identify weaknesses and opportunities for improvement; may lead the evaluation and selection process for new application packages
- Manage customers' priorities of projects and requests with the help of the Data Team Lead
- Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs, and advise on options, risks, and cost vs. benefits
- Participate in data governance initiatives, maintain metadata (data definitions, relationships, sources, and consumers), and contribute to the company’s Data strategies and roadmaps
- Adhere to quality standards and procedures; review code for quality assurance, check compliance with data architecture standards, and conduct unit testing to ensure code meets specifications
You Must Have:
- U.S. citizenship, in order to obtain and maintain a U.S. Dept. of Energy "Q" level security clearance
- Bachelor's degree in Information Systems, Computer Science, Computer Engineering, Engineering, or Physical Science; in lieu of a degree, a minimum of 4 years of SQL experience and/or 4 years of Big Data ingestion experience
- Minimum 2 years of SQL experience (ability to develop, modify, troubleshoot and optimize complex queries)
- Minimum 2 years of experience developing with ETL toolsets and/or Data Virtualization toolsets (e.g. Talend Integration Suite, Informatica, Denodo)
- Minimum 2 years of experience working in Oracle or SQL Server or Teradata databases (e.g. table creation, insert, update, deletes, indexing, optimization)
- Experience with Apache NiFi and/or Waterline Data and/or Attunity Replicate
- Experience or knowledge of a major programming language (e.g. Java, Python, C#)
- Experience or knowledge of a scripting language (e.g. Linux shell, Windows batch, Perl, VBS)
- Experience working in a Red Hat Enterprise Linux and/or Windows environment
- Knowledge of data governance, data lineage, metadata management processes, and software development lifecycle
- Excellent problem solving, design, coding and debugging skills
- Experience working with mobile teams
Must have or be eligible for a security clearance due to contractual requirements.