qode.world

Senior Data Engineer

Full-time

Location Type: Hybrid

Location: Cleveland, Ohio / Pennsylvania, United States


About the role

  • Experience: 8–10 years in Data Engineering and Data Analysis.
  • Informatica Expertise: Strong hands-on experience in Informatica PowerCenter/IDQ for ETL design, development, and optimization.
  • PySpark Development: Advanced skills in PySpark for large-scale data processing, transformation, and analytics.
  • Hadoop Ecosystem: Solid working knowledge of Hadoop technologies (HDFS, Hive, Sqoop, MapReduce).
  • Programming Skills: Proficiency in Python and Kafka for streaming and batch data pipelines.
  • Database & Modeling: Strong understanding of database concepts, data design, data modeling, and ETL workflows.
  • ETL Lifecycle: Experience in analyzing, designing, and coding ETL programs including data extraction, ingestion, quality checks, normalization, and loading.
  • Agile Delivery: Hands-on experience with Agile methodology and Jira for project delivery.
  • Client Interaction: Proven ability in client-facing roles, with strong communication and leadership skills to coordinate across the SDLC.
  • Preferred Skills: Exposure to AWS data components and analytics. Familiarity with machine learning models and AI concepts. Experience with data modeling tools such as Erwin.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Informatica PowerCenter, Informatica IDQ, PySpark, Hadoop, HDFS, Hive, Sqoop, MapReduce, Python, Kafka
Soft Skills
communication, leadership, client-facing work, coordination