Wayvia (formerly PriceSpider)

Data Engineer III

Full-time

Location Type: Remote

Location: United States

Salary

💰 $130,000 - $160,000 per year

About the role

  • Lead the design, development, testing, deployment, maintenance, and improvement of advanced data engineering solutions and pipelines
  • Prioritize, manage, and deliver multiple high-impact projects
  • Establish data quality standards: automated tests, anomaly detection, and freshness SLAs
  • Write and maintain dbt models, tests, and documentation to ensure data quality and consistency
  • Identify infrastructure improvements to reduce cost, improve reliability, and eliminate single points of failure
  • Build and govern canonical dbt models in Snowflake using dimensional modeling principles, ensuring consistent metric definitions and entity relationships across Looker and the MCP server for both BI and AI consumers
  • Own dbt model governance: standardize modeling patterns, enforce testing and documentation requirements, and manage the project structure across domains
  • Define and maintain a consistent semantic layer (metric definitions, business logic, and entity relationships), ensuring alignment across BI tools, the MCP server, and AI consumers regardless of the tooling layer
  • Support Snowflake and Looker administration alongside other data platform team members
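
The freshness-SLA duty above could be approached along these lines; a minimal sketch in Python, where the table names, SLA windows, and the `check_freshness` helper are all hypothetical, not part of the posting:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical per-table freshness SLAs (illustrative values only).
FRESHNESS_SLAS = {
    "orders": timedelta(hours=1),
    "product_catalog": timedelta(hours=24),
}

def check_freshness(table: str, last_loaded_at: datetime,
                    now: Optional[datetime] = None) -> bool:
    """Return True if the table's latest load falls within its SLA window."""
    now = now or datetime.now(timezone.utc)
    return (now - last_loaded_at) <= FRESHNESS_SLAS[table]

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
fresh = datetime(2024, 1, 1, 11, 30, tzinfo=timezone.utc)  # 30 minutes old
stale = datetime(2024, 1, 1, 9, 0, tzinfo=timezone.utc)    # 3 hours old

print(check_freshness("orders", fresh, now))  # True: within the 1-hour SLA
print(check_freshness("orders", stale, now))  # False: 3h exceeds the 1-hour SLA
```

In practice a check like this would run inside an orchestrator and page on failure; dbt's built-in source freshness checks cover similar ground declaratively.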

Requirements

  • BS degree in Computer Science or a related field, or equivalent practical experience
  • 5+ years of data engineering experience, with a track record of designing, building, and maintaining ELT pipelines and dimensional data models in production environments
  • Expertise in processing and integrating large-scale data from multiple sources
  • Advanced SQL skills, including complex joins and common-table expressions
  • Strong hands-on Snowflake experience (prior admin experience a plus, but not required)
  • Proficiency with dbt: production-grade model development, testing, documentation, and project governance
  • Proficient coding skills in Python, Java, or similar languages
  • Proficiency with Git-based version control and CI/CD pipelines for data infrastructure
  • Familiarity with workflow orchestration platforms such as Airflow, Prefect, or Dagster
  • Strong sense of data ownership: you treat data as a product and hold a high bar for quality and reliability
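
To illustrate the common-table-expression skill listed above, here is a small self-contained sketch using Python's built-in sqlite3 module; the schema and data are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme', 120.0),
        (2, 'acme', 80.0),
        (3, 'globex', 50.0);
""")

# The CTE (WITH clause) computes per-customer totals; the outer
# query then filters on that aggregate.
rows = conn.execute("""
    WITH customer_totals AS (
        SELECT customer, SUM(amount) AS total
        FROM orders
        GROUP BY customer
    )
    SELECT customer, total
    FROM customer_totals
    WHERE total > 100
""").fetchall()

print(rows)  # [('acme', 200.0)]
```

The same WITH-clause syntax works in Snowflake, where CTEs are the usual building block for dbt models.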

Benefits
  • Flexible work-from-home arrangements
  • 401K Match
  • Flexible vacation
  • Medical/Dental/Vision
  • 16 weeks of paid parental leave (US)
  • Technical stipend
  • Professional development programs
  • Wellness programs

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, ELT pipelines, dimensional data models, SQL, Snowflake, dbt, Python, Java, Git, CI/CD
Soft Skills
project management, data ownership, quality assurance, communication, leadership
Certifications
BS degree in Computer Science