Job Description
Languages: SQL, Unix Shell
Tools: Ab Initio software suite - Co>Operating System, EME, BRE, Conduct>It, Express>It, Metadata Hub, Query>It, Control>Center
Ab Initio frameworks - Acquire>It, DQA, Spec-To-Graph, Testing Framework
Databases: Db2, Oracle, MySQL, Teradata, and MongoDB
Big Data: Cloudera Hadoop, Hive
Dev Methodologies: Agile, Waterfall
Others: JIRA, ServiceNow, Linux 6/7/8, SQL Developer, AutoSys, and Microsoft Office
- Total of 7+ years of IT experience, predominantly in the Data Integration/Data Warehouse area
- Must have 6+ years of ETL Design and Development experience using Ab Initio
- 2+ years of Data Integration project experience on Hadoop Platform, preferably Cloudera
- Experience with Ab Initio EME, GDE, and parallelism techniques; able to build graphs using data, pipeline, and component parallelism.
- Ability to design and build Ab Initio graphs (both continuous and batch) and Conduct>It plans, and to integrate them with the broader Ab Initio product portfolio.
- Thorough understanding of the Metadata Hub metamodel and the ability to analyze it.
- Build graphs interfacing with heterogeneous data sources - Oracle, Snowflake, Hadoop, Hive, AWS S3.
- Parse XML, JSON & YAML documents including hierarchical models.
- Build and implement data acquisition and transformation/curation requirements in a data lake or warehouse environment and demonstrate experience in leveraging various Ab Initio components.
- Build Control Center Jobs and Schedules for process orchestration.
- Build BRE rulesets for reformat, rollup & validation use cases.
- Ability to identify performance bottlenecks in graphs and optimize them.
- Rigor in code quality, automated testing, and other engineering best practices; ability to write reusable code components.
- Ability to unit test the code thoroughly and to troubleshoot issues in production environments.
- Manage and develop code for Extract, Transform, and Load (ETL) routines using Ab Initio.
- Ensure the Ab Initio code base is appropriately engineered to maintain current functionality, and that new development adheres to performance optimization practices, interoperability standards and requirements, and client IT governance policies.
- Build automation pipelines for Continuous Integration and Delivery (CI/CD), leveraging the Testing Framework and JUnit modules and integrating with Jenkins, JIRA, and/or ServiceNow.
- Successful implementation of Ab Initio CDC (Change Data Capture) in a Data Integration/ETL project.
- Demonstrable experience with Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, and others.
- Excellent SQL, UNIX shell scripting, and performance tuning skills.
- Technical experience in design (mapping specifications, HLD, LLD) and development (coding, unit testing) using Ab Initio.
- Some Java development experience is nice to have.
- Knowledge of Agile/Waterfall Development practices is required.
- Must be able to work independently and support other junior developers as needed.
- Keep abreast of Ab Initio best practices and new features to continuously improve development processes.
Location
Cincinnati, OH
Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence, and proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.