A.C.Coy Company

Data Warehousing Engineer, Ab Initio/Teradata

contract

Location: United States (Virginia)

Job Level

Senior / Lead

Tech Stack

Cloud, ETL, Kafka, Linux, Shell Scripting, SQL, Unix

About the role

  • Responsible for the development and maintenance of applications providing ETL services in support of the Data Asset Services group.
  • Tasks must be completed under the direction, oversight, and prioritization of the Data Technology Services Manager.
  • Ab Initio ETL Coding in GDE.
  • Ab Initio Metadata Hub Lineage.
  • Ab Initio TRMC.
  • ANSI SQL and Teradata SQL extensions.
  • Teradata SQL Assistant (a.k.a. QueryMan) for EDS Support.
  • Teradata Utilities for EDS Support: BTEQ, FastLoad, MultiLoad, FastExport, TPump.
  • Utilize UNIX commands and concepts to navigate source code directories and error logs, perform impact analysis assessments, edit code, and version files.
  • Utilize Linux shell scripting to read and create driver scripts.
  • Ensure software standards are met.
  • Analyze user needs and software requirements to determine feasibility of design within time and cost constraints.
  • Confer with systems analysts, engineers, programmers and others to design systems and to obtain information on project limitations and capabilities.
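
To illustrate the kind of work the role describes, here is a minimal sketch of a Linux driver script that wraps a Teradata BTEQ batch run with logging and exit-code handling. All names (job name, logon string, log path) are hypothetical placeholders, not details from this posting:

```shell
#!/bin/sh
# Hypothetical ETL driver script: runs a BTEQ batch step and logs output.
# The job name, logon credentials, and log directory are illustrative only.
set -eu

SCRIPT_NAME="load_daily_sales"              # hypothetical job name
LOG_DIR="${LOG_DIR:-/tmp}"                  # default location for illustration
LOG_FILE="$LOG_DIR/${SCRIPT_NAME}.$(date +%Y%m%d).log"

run_bteq() {
    # BTEQ reads its commands from stdin; the .LOGON line is a placeholder.
    bteq <<'EOF' >> "$LOG_FILE" 2>&1
.LOGON tdprod/etl_user,password
SELECT CURRENT_DATE;
.QUIT
EOF
}

if command -v bteq >/dev/null 2>&1; then
    if run_bteq; then
        echo "$SCRIPT_NAME: OK"
    else
        echo "$SCRIPT_NAME: FAILED (see $LOG_FILE)" >&2
        exit 1
    fi
else
    # Allows the sketch to run on machines without Teradata client tools.
    echo "$SCRIPT_NAME: bteq not installed; skipping"
fi
```

In practice such drivers are invoked by a scheduler, with the exit code deciding whether downstream steps run.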

Requirements

  • Bachelor's degree in computer science, software engineering, or a relevant field is desired.
  • 10+ years of software development experience.
  • Working knowledge of Ab Initio for data extraction, transformation, and loading (ETL), particularly involving Teradata as a source and/or target.
  • Hands-on experience developing graphs, plans, and PSETs, as well as creating and debugging test processes.
  • Working knowledge of how MicroStrategy and other OLAP tools interface with Teradata.
  • Strong understanding of major Data Warehouse modeling techniques (Third Normal Form and Dimensional Modeling) and their impact on ETL performance, structured query reporting, and unstructured data analysis.
  • Experience with Web API development.
  • Proficiency in storing, reading, and analyzing streaming data using technologies such as Kafka, MQSeries, or JMS.
  • Working knowledge of cloud technologies, including development and architecture.
  • Experience in designing and tuning BI metrics, dashboards, and reports for optimal performance.
  • Familiarity with PDL (Parameter Definition Language) scripting.
  • Excellent communication skills.
  • Must be able to obtain a Public Trust clearance.
  • All candidates must be US citizens or have permanent resident status (Green Card).
  • Candidate must have lived in the United States for the past 5 years.
  • Cannot have more than 6 months of travel outside the United States within the last 5 years.
  • Military Service excluded.