Kong Inc.

Senior Data Engineer

Employment Type: Full-time

Location Type: Remote

Location: California, United States

Salary

$125,000 - $175,000 per year

About the role

  • Design, build, and maintain ETL/ELT pipelines using Fivetran + Snowflake integrations to ingest data from a variety of sources into our Snowflake data warehouse
  • Develop and manage robust data models in Snowflake, ensuring data is structured for performance, reliability, and ease of use by analysts and business stakeholders
  • Use Hightouch to operationalize data by syncing warehouse data to downstream CRM, marketing, and sales tools
  • Monitor pipeline health, troubleshoot data quality issues, and implement alerting to proactively catch failures
  • Document data models, pipelines, and lineage to support a culture of data literacy and self-service analytics
  • Integrate Claude and other LLMs directly with our Snowflake data warehouse, enabling AI-powered querying, summarization, and insight generation on top of live revenue data
  • Build and maintain data sources, semantic layers, and search services within Snowflake Cortex and connected AI platforms
  • Design and deploy AI agents that can reason over structured and unstructured revenue data to support go-to-market workflows
  • Architect and manage multi-step agent workflows, coordinating across tools, APIs, and data sources to automate complex analytical and operational tasks
  • Evaluate and implement orchestration frameworks (e.g., LangChain, LlamaIndex, or custom solutions) best suited to our use cases
  • Run rigorous evaluations of AI tools, models, and platforms to determine the best solution for each use case (e.g., Snowflake Cortex vs. Claude vs. Gemini vs. custom fine-tuned models)
  • Develop evaluation frameworks covering quality, latency, cost, and security to inform build vs. buy decisions and guide our overall AI roadmap
  • Stay current on the rapidly evolving AI landscape and proactively recommend new tools or approaches as the space matures
  • Implement and manage row-level security (RLS) in Snowflake to ensure AI tools only surface data that users are authorized to see
  • Maintain and evolve role-based access controls (RBAC) alongside new RLS policies
  • Contribute to data governance practices, including access controls, PII handling, and schema management
  • Partner with Data, Security, and Legal teams to establish AI data governance standards and guardrails
  • Monitor and manage AI credit consumption across Snowflake Cortex, API usage, and other platforms to keep spending within budget
  • Identify and implement optimizations (caching, prompt tuning, model selection, and query efficiency improvements) to reduce cost without sacrificing quality
  • Build reporting to give stakeholders visibility into AI spend and usage trends
  • Partner with Revenue Operations, Finance, and Sales to understand data needs and translate them into scalable engineering solutions
  • Collaborate across technical and non-technical teams to deliver data and AI solutions that directly influence revenue strategy
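The evaluation responsibilities above (scoring AI tools on quality, latency, and cost to inform build-vs-buy decisions) can be sketched in miniature. Everything in this example, from the `EvalResult` container to the stub model call, is an invented illustration rather than Kong's actual tooling; a real harness would call Cortex, Claude, or Gemini endpoints and use richer quality metrics than exact-match.

```python
import time
from dataclasses import dataclass

# Hypothetical sketch: `EvalResult` and `evaluate_model` are illustrative
# names, not part of any Snowflake or vendor API.

@dataclass
class EvalResult:
    model: str
    quality: float    # fraction of answers matching the expected output
    latency_s: float  # mean seconds per call
    cost_usd: float   # total spend across the eval set

def evaluate_model(name, call_model, cases, cost_per_call):
    """Score one model on quality, latency, and cost over labeled cases."""
    correct, elapsed = 0, 0.0
    for prompt, expected in cases:
        start = time.perf_counter()
        answer = call_model(prompt)
        elapsed += time.perf_counter() - start
        # Exact-match quality; real evals would use graded or LLM-judged scoring.
        correct += int(answer.strip().lower() == expected.strip().lower())
    return EvalResult(
        model=name,
        quality=correct / len(cases),
        latency_s=elapsed / len(cases),
        cost_usd=cost_per_call * len(cases),
    )

if __name__ == "__main__":
    # Stub "model" standing in for a Cortex / Claude / Gemini call.
    cases = [("2+2?", "4"), ("capital of France?", "Paris")]
    result = evaluate_model(
        "stub", lambda p: "4" if "2+2" in p else "Paris", cases, cost_per_call=0.001
    )
    print(result.model, result.quality, round(result.cost_usd, 4))
```

Running the same `cases` against each candidate model and comparing the resulting `EvalResult` rows is one simple way to make quality/latency/cost trade-offs explicit.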

Requirements

  • 3+ years of experience in a data engineering, analytics engineering, or AI/ML engineering role
  • Hands-on experience with Snowflake, including data modeling, query optimization, Cortex Analyst, Cortex Search, semantic layers, and security model (RBAC, RLS)
  • Proficiency with Fivetran for pipeline orchestration and connector management
  • Experience with Hightouch or similar reverse ETL tools for syncing data to operational systems
  • Experience integrating LLMs (Claude, GPT-4, or similar) into production data workflows via API
  • Familiarity with agent orchestration frameworks and patterns (e.g., LangChain, LlamaIndex, CrewAI, or custom implementations)
  • Strong understanding of AI/LLM evaluation methodologies: you know how to measure whether an AI solution is actually working
  • Experience with prompt engineering, retrieval-augmented generation (RAG), and/or fine-tuning
  • Strong SQL and Python skills
  • A security-first mindset with experience managing data access controls in cloud data platforms
  • Strong communication skills and the ability to collaborate across technical and non-technical teams
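The row-level security idea that runs through the requirements above can be sketched in plain Python. The role-to-region mapping and record shape below are invented for illustration; in Snowflake this would be enforced with a native row access policy on the table, not in application code, so that AI tools only ever see authorized rows.

```python
# Hypothetical mapping of warehouse roles to the regions they may see.
ROLE_REGIONS = {
    "amer_sales": {"AMER"},
    "emea_sales": {"EMEA"},
    "rev_ops": {"AMER", "EMEA", "APAC"},
}

def visible_rows(rows, role):
    """Return only the rows a role is authorized to see (RLS in miniature)."""
    allowed = ROLE_REGIONS.get(role, set())  # unknown roles see nothing
    return [r for r in rows if r["region"] in allowed]

if __name__ == "__main__":
    rows = [
        {"account": "Acme", "region": "AMER", "arr": 120_000},
        {"account": "Globex", "region": "EMEA", "arr": 80_000},
    ]
    print([r["account"] for r in visible_rows(rows, "emea_sales")])  # ['Globex']
```

The same filter-before-answer principle applies when wiring an LLM to live revenue data: the query layer, not the model, decides which rows a user's role can reach.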
Benefits
  • Healthcare benefits
  • 401(k) plan
  • Short- and long-term disability benefits
  • Basic life and AD&D insurance

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
ETL, ELT, data modeling, query optimization, SQL, Python, prompt engineering, retrieval-augmented generation, data access controls, AI evaluation methodologies
Soft Skills
strong communication, collaboration, troubleshooting, documentation, data literacy, proactive recommendation, organizational skills, analytical thinking, stakeholder engagement, cross-functional teamwork