Staff Data Engineer

Workiva

Full-time

Location Type: Remote

Location: Remote • 🇺🇸 United States


Salary

💰 $129,000 - $207,000 per year

Job Level

Lead

Tech Stack

Kafka, Python, SQL

About the role

  • Architect and build high-performance data solutions that directly power customer-facing features, utilizing the internal data platform (Snowflake, dbt, Kafka).
  • Lead the design and delivery of complex data projects independently, from initial discovery with stakeholders to production deployment and monitoring.
  • Partner with Product Managers and Business Leaders to translate customer needs into technical data requirements, ensuring "Data-as-a-Product" excellence.
  • Work embedded with Application Engineering teams to advocate for upstream data quality and ensure application architectures support downstream data needs.
  • Act as the "Lead Customer" for our internal Data Platform team—identifying gaps in the platform’s capabilities and contributing to its strategic roadmap based on product requirements.
  • Establish and evangelize standards for data modeling, observability, and performance within the product domain.
  • Design resilient, production-grade pipelines using DLT and Snowpipe that handle enterprise-scale workloads with low latency.
  • Own the domain’s dbt layer, ensuring code is modular, tested, and optimized for high-performance serving in Snowflake.
  • Guide and elevate Senior and Mid-level engineers on the team through code reviews, design docs, and technical coaching.

Requirements

  • 8+ years of experience in Data Engineering, with a strong emphasis on building solutions for customer-facing products or applications.
  • Independent Execution: Proven ability to lead large-scale projects from concept to completion with minimal supervision, navigating ambiguity across multiple teams.
  • Strategic Communication: Exceptional ability to evangelize data strategy to non-technical stakeholders and influence application engineers on data best practices.
  • Mastery of the Stack: Deep expertise in Snowflake, dbt, and Kafka for building real-world, high-value data products.
  • Software Mindset: Strong Python and SQL skills, with a focus on building reusable libraries, automation, and CI/CD-driven workflows.
  • Product Sense: Experience working in a SaaS product environment, understanding how data latency and accuracy impact the end-user experience.
  • Data Modeling: Advanced understanding of modeling for both analytical (OLAP) and application-support (high-concurrency) use cases.

Benefits

  • A discretionary bonus typically paid annually
  • Restricted Stock Units granted at time of hire
  • 401(k) match and comprehensive employee benefits package

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills

Data Engineering, Snowflake, dbt, Kafka, Python, SQL, Data Modeling, CI/CD, DLT, Snowpipe

Soft skills

Independent Execution, Strategic Communication, Technical Coaching, Project Leadership, Stakeholder Engagement, Data Advocacy, Ambiguity Navigation, Influencing Skills, Collaboration, Mentorship