Job Description
We are looking for a Senior Ab Initio Solutions Engineer to design and build Ab Initio-based solutions across the Data Integration, Governance & Quality domains for our customer programs. The individual will work with both Business Analysts and prospective Application Managers to gather requirements, perform implementations, and guide production rollouts, while creating highly optimized batch and real-time applications and ensuring the overall success of their programs. These programs are high-visibility, fast-paced key initiatives that generally aim to acquire and curate data and metadata from internal and external sources, provide analytical insights, and integrate with the customer’s critical systems.
- Technical Stack: Ab Initio 3.5.x or 4.0.x software suite – Co>Op, EME, BRE, Conduct>It, Express>It, Metadata Hub, Query>It, Control>Center
- Ab Initio 3.5.x or 4.0.x frameworks – Acquire>It, DQA, Spec-To-Graph, Testing Framework
- Big Data – Cloudera or Hortonworks Hadoop, Hive, YARN
- CI/CD – Jenkins, Chef, Terraform
- Cloud – AWS or Azure
- Databases – Oracle 11g/12c, Teradata, MongoDB, Snowflake, Cassandra
- Others – JIRA, ServiceNow, Linux 6/7/8, SQL Developer, AutoSys, and Microsoft Office
- Job Duties: Design and build highly optimized Ab Initio graphs (both continuous and batch) and Conduct>It plans, and integrate them with the broader portfolio of Ab Initio software.
- Build end-to-end Ab Initio applications involving generic graphs that leverage PDL metaprogramming.
- Build generic Express>It templates with built-in BRE rulesets for data processing applications, allowing business users to configure and build pipelines rapidly.
- Design web-service and RESTful models and build web-service graphs accordingly.
- Perform schema customizations on Metadata Hub metamodels.
- Build view customizations on the Metadata Hub and enable users with intuitive UI options.
- Build import extractor graphs for the Metadata Hub to scan heterogeneous technical platforms and acquire technical metadata.
- Build applications that integrate with heterogeneous data sources – Hadoop, Hive, AWS S3, MongoDB, Cassandra.
- Build applications leveraging frameworks – Acquire>It, Spec-To-Graph, Data Quality Assessment.
- Build automation pipelines for Continuous Integration & Delivery (CI/CD), leveraging the Testing Framework and JUnit modules and integrating with Jenkins, JIRA, and/or ServiceNow.
- Build custom Query>It subgraphs for cataloging data from different sources.
- Build generic parsers for XML, JSON & YAML documents, including hierarchical models (see the parsing sketch after this list).
- Build data lake and warehouse data models and design patterns, leveraging various Ab Initio components to develop complex applications.
- Write SQL scripts, tune database performance, analyze relational models, and perform data migrations.
- Identify performance bottlenecks in graphs and optimize them.
- Ensure the Ab Initio code base is appropriately engineered to maintain current functionality, adheres to performance optimization and interoperability standards and requirements, and complies with client IT governance policies.
- Produce technical design documents, use cases, test cases, and user manuals for various projects.
- Conduct bug fixes, code reviews, and unit, functional, and integration testing.
- Act as a liaison to clients on highly technical and complex requests, and serve as a technical advisor to junior team members.
- Provide input on, support, and review project estimates.
- Optimize data-access performance by choosing the appropriate native Hadoop file formats (Avro, Parquet, ORC, etc.) and compression codecs (see the file-format sketch after this list).
- Participate in the agile development process, and document and communicate issues and bugs relating to data standards.
- Pair with other data engineers to develop analytic applications leveraging Big Data technologies: Hadoop, NoSQL, and in-memory data grids.
- Challenge and inspire team members to achieve business results in a fast-paced and quickly changing environment.
- Perform other duties and/or special projects as assigned.
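
The parsing duty above would normally be implemented with Ab Initio's own DML and XML/JSON components; purely as an illustration of the format-dispatching logic involved, here is a minimal Python sketch (the function names and sample documents are hypothetical, and PyYAML is assumed for YAML support):

```python
import json
import xml.etree.ElementTree as ET

import yaml  # PyYAML, assumed available


def xml_to_dict(elem):
    """Recursively convert an ElementTree node into nested dicts/strings."""
    children = list(elem)
    if not children:
        text = (elem.text or "").strip()
        return text if text else dict(elem.attrib)
    node = dict(elem.attrib)
    for child in children:
        # Repeated tags collect into a list, preserving hierarchy.
        node.setdefault(child.tag, []).append(xml_to_dict(child))
    return node


def parse_document(text, fmt):
    """Dispatch on format and return a normalized hierarchical structure."""
    if fmt == "json":
        return json.loads(text)
    if fmt == "yaml":
        return yaml.safe_load(text)
    if fmt == "xml":
        root = ET.fromstring(text)
        return {root.tag: xml_to_dict(root)}
    raise ValueError(f"unsupported format: {fmt}")


# Hypothetical documents, each reduced to a plain nested structure:
print(parse_document('{"customer": {"id": "1"}}', "json"))
print(parse_document("customer:\n  id: '1'", "yaml"))
print(parse_document("<customer><id>1</id></customer>", "xml"))
```

The design point is a single downstream record model regardless of source format, which is what makes such a parser generic.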
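Likewise for the file-format duty: format and codec choice trades scan speed against storage and CPU (e.g., Snappy for fast decompression, gzip or zstd for tighter ratios). Below is a minimal sketch, assuming pyarrow is installed and using made-up sample data, that compares the on-disk footprint of Parquet under several codecs:

```python
import os

import pyarrow as pa
import pyarrow.parquet as pq

# Made-up sample table standing in for a real feed.
table = pa.table({
    "id": list(range(100_000)),
    "status": ["OPEN", "CLOSED"] * 50_000,
})

for codec in ("none", "snappy", "gzip", "zstd"):
    path = f"/tmp/sample_{codec}.parquet"
    pq.write_table(table, path, compression=codec)  # codec chosen per workload
    print(f"{codec:>8}: {os.path.getsize(path):,} bytes")
```

The same decision shows up when graphs land data on HDFS or S3: pick a splittable columnar format for scan-heavy access and a codec matched to the read/write profile.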
- Qualifications: Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, or Econometrics) and a minimum of 5 years of experience.
- Minimum of 7 years of experience designing, building, and deploying Ab Initio-based applications.
- Minimum of 5 years of experience in Data Integration and/or Governance projects.
- Expertise in handling complex, large-scale Data Lake and Warehouse environments.
- Hands-on experience writing complex SQL queries and exporting and importing large amounts of data using utilities.
- Ability to build abstracted, reusable code components.
- Excellent verbal communication skills.
- You’re the one we’re looking for if you:
- Take the lead.
- Communicate and coordinate across various teams.
- Are comfortable tackling new challenges and new ways of working.
- Are ready to move from traditional methods to agile ones.
- Are ready to define your career path.
- Are comfortable challenging your peers and leadership team.
- Can prove yourself quickly and decisively.