Tiger Analytics

GenAI Data Engineer

Full-time

Location Type: Remote

Location: Connecticut, United States

About the role

  • Design, build, and maintain data pipelines, data integration processes, and data infrastructure.
  • Collaborate closely with data scientists, analysts, and other stakeholders to ensure efficient data flow and support data-driven decision making across the organization.

Requirements

  • Design and implement robust data pipelines that ingest, process, and store unstructured data formats at scale within **Snowflake** and **GCP**.
  • Leverage Snowflake’s unstructured data capabilities (Directory Tables, Scoped URLs, Snowpark) to make "dark data" queryable and actionable (a Snowpark sketch follows this list).
  • Build and maintain cloud-native ETL/ELT processes using BigQuery, Cloud Storage, and Dataflow, ensuring seamless integration between GCP and Snowflake (a Beam pipeline sketch follows this list).
  • Integrate AI tools (OCR, NLP entity extraction, Document AI) into the engineering flow, rather than relying on LLMs alone, to transform unstructured blobs into structured insights (a Document AI sketch follows this list).
  • Tune complex SQL queries and Python-based processing jobs to handle petabyte-scale environments efficiently.
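
As a rough illustration of the Snowflake bullet above, the sketch below uses Snowpark for Python to refresh a stage's directory table and build scoped URLs for the files it tracks. The stage name (docs_stage) and the connection parameters are placeholders, not details taken from this posting.

```python
# Minimal Snowpark sketch: expose unstructured files in a stage via its
# directory table and generate scoped URLs for downstream consumers.
# The stage "docs_stage" and connection parameters are illustrative placeholders.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# Refresh the directory table so newly landed files become visible.
session.sql("ALTER STAGE docs_stage REFRESH").collect()

# List files tracked by the directory table and build scoped URLs,
# which grant temporary, governed access to each unstructured file.
rows = session.sql(
    """
    SELECT
        relative_path,
        size,
        BUILD_SCOPED_FILE_URL(@docs_stage, relative_path) AS scoped_url
    FROM DIRECTORY(@docs_stage)
    """
).collect()

for row in rows:
    print(row["RELATIVE_PATH"], row["SIZE"], row["SCOPED_URL"])

session.close()
```

Downstream jobs could join these paths with metadata tables or hand the scoped URLs to parsing UDFs, which is one way to make previously dark files queryable.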
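
For the BigQuery/Cloud Storage/Dataflow bullet, here is a minimal Apache Beam sketch that reads newline-delimited JSON from Cloud Storage and appends it to a BigQuery table; it can be submitted to Dataflow with --runner=DataflowRunner. The bucket, project, dataset, table, and schema are hypothetical.

```python
# Minimal Apache Beam sketch: read newline-delimited JSON from Cloud Storage
# and load it into BigQuery. Bucket, project, dataset, table, and schema are
# placeholders; run with --runner=DataflowRunner to execute on Dataflow.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run() -> None:
    options = PipelineOptions()  # picks up --project, --region, --runner, etc.
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/raw/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.documents",
                schema="doc_id:STRING,source:STRING,ingested_at:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```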
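
For the AI-tool integration bullet, a hedged sketch of one way to call Google Cloud Document AI from Python to turn a PDF into extracted text and entities. The project, location, and processor IDs are illustrative assumptions.

```python
# Minimal Document AI sketch: send a local PDF to a Document AI processor and
# read back the extracted text plus detected entities. The project, location,
# and processor IDs below are illustrative placeholders.
from google.cloud import documentai


def extract_document(path: str) -> None:
    client = documentai.DocumentProcessorServiceClient()

    # Fully qualified processor resource name (placeholder values).
    processor_name = client.processor_path(
        "example-project", "us", "example-processor-id"
    )

    with open(path, "rb") as handle:
        raw_document = documentai.RawDocument(
            content=handle.read(), mime_type="application/pdf"
        )

    result = client.process_document(
        request=documentai.ProcessRequest(
            name=processor_name, raw_document=raw_document
        )
    )

    document = result.document
    print(document.text[:500])  # first 500 characters of extracted text
    for entity in document.entities:
        print(entity.type_, entity.mention_text, entity.confidence)


if __name__ == "__main__":
    extract_document("sample.pdf")
```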

Benefits

  • Significant career development opportunities exist as the company grows. The position offers a unique opportunity to be part of a small, challenging, and entrepreneurial environment, with a high degree of individual responsibility.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data pipelines, data integration, ETL, ELT, SQL, Python, unstructured data, data processing, data storage, cloud-native
Soft Skills
collaboration, communication, problem-solving, data-driven decision making