Spiral Scout

AI Data Engineer

Contract

Location: Remote • 🇺🇸 United States

Job Level

Mid-Level · Senior

Tech Stack

AWS · Azure · Cloud · ETL · Google Cloud Platform · Python · SQL

About the role

  • Build and maintain data pipelines (Python, SQL, ETL, APIs) to prepare structured and unstructured data for AI workflows.
  • Translate client problems into orchestrated AI workflows, balancing automation and human-in-the-loop design.
  • Configure multi-agent logic (planner/worker, feedback loops) using LangChain, Wippy, n8n, Zapier, Make, or custom Python code.
  • Prototype and ship proofs of concept: onboarding bots, quoting assistants, presales flows, and project-management helpers.
  • Facilitate scoping workshops with stakeholders to clarify requirements and design workflows.
  • Collaborate with engineers and product leads to create reusable AI workflow templates and automation patterns.

Requirements

  • Strong data engineering experience: Python + SQL, pipelines, ETL, API integrations.
  • Experience preparing data for AI: cleaning, structuring, and connecting multiple sources.
  • Hands-on with AI tools: ChatGPT, Claude, LangChain, n8n, Zapier, Make, AutoGen, or similar.
  • Understanding of LLM orchestration beyond prompt engineering.
  • Systems thinking: ability to design workflows with multiple agents, branching logic, loops, and state.
  • Strong communication skills: able to explain AI/data concepts and lead workshops with technical and non-technical stakeholders.
  • Experience with agent-based frameworks (LangGraph, CrewAI, AutoGen).
  • Experience designing human-in-the-loop workflows (customer support, onboarding, quoting, project management).
  • Experience prototyping with low-code/no-code platforms (Zapier, Airtable, Streamlit, custom GPTs).
  • Familiarity with cloud platforms (AWS, GCP, Azure).
  • Exposure to ML practices (fine-tuning, RAG, evaluation, multimodal inputs).

Benefits

  • Start with a paid pilot project to evaluate collaboration.
  • Work at the cutting edge of AI workflows and data-driven automation.
  • Fully remote, flexible schedule.
  • Fast-moving, innovation-driven culture where ideas quickly turn into practice.
  • Pathway to extended part-time or full-time engagement.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Python · SQL · ETL · APIs · data pipelines · AI workflows · data cleaning · data structuring · agent-based frameworks · ML practices
Soft skills
communication skills · systems thinking · workshop facilitation · collaboration · problem-solving