Tech Stack
Tools & technologies: AWS, Cloud, DynamoDB, Grafana, Python, SQL, Tableau, Terraform
About the role
Key responsibilities & impact
- Build and maintain ELT pipelines that ingest data from source systems.
- Co-own the mapping and migration of source data into the new 3NF EDW, ensuring data integrity, reducing redundancy, and maintaining automated unit and data tests.
- Develop data observability processes and monitoring dashboards to track pipeline health, freshness, and data quality across Databricks.
- Build new data and AI-powered tooling to improve the productivity of the data engineering team and broaden self-service data access for Route employees and external partners.
- Help harden the Integration Pipeline by automating deployment of shared staging and production infrastructure for new pipelines and managing dependency updates for dbt and CI templates.
- Support the full migration from Snowflake to Databricks, targeting completion by end of Q2 2027, including reporting services and ingest/egress jobs.
- Coordinate with engineering, analytics, product, and business teams to define and prioritize data requirements and ensure end-to-end data lifecycle coverage for existing and new products.
- Champion data democratization, help establish a company-wide data retention policy, and expand the foundation for a self-service Silver layer (EDW) that serves as a single source of truth.
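The observability and data-quality work described above can be sketched as a minimal freshness check. This is a hypothetical illustration (the function, table names, and SLA threshold are assumptions, not the company's actual tooling; real pipelines here run on Databricks):

```python
from datetime import datetime, timedelta, timezone

def check_freshness(last_loaded_at: datetime, max_lag: timedelta) -> bool:
    """Return True if the most recent load is within the allowed lag (SLA)."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_lag

# A table loaded 2 hours ago checked against a 6-hour SLA passes...
recent = datetime.now(timezone.utc) - timedelta(hours=2)
assert check_freshness(recent, timedelta(hours=6))

# ...while one loaded 12 hours ago fails and would trigger an alert.
stale = datetime.now(timezone.utc) - timedelta(hours=12)
assert not check_freshness(stale, timedelta(hours=6))
```

In practice a check like this would feed a monitoring dashboard (e.g. Grafana) or a pager alert rather than a bare assertion.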
Requirements
What you’ll need
- 4+ years of formal, professional data engineering experience
- 3+ years of SQL, fluency in complex transformations, window functions, query optimization
- 2+ years of Python: data pipeline development, scripting, testing, and package management (Poetry)
- 2+ years of experience with AWS data-related services (e.g., S3, RDS, DMS, DynamoDB)
- 1+ years of experience using Databricks, our primary development platform for this role
- Experience using Terraform and GoLang
- Experience with PagerDuty, Grafana, and Tableau preferred
- Understanding of third normal form (3NF) data modeling and when to apply it
- Knowledge and application of data theory
- Working knowledge of data security practices and least-privilege access standards
- Experience with data access controls in cloud environments (IAM roles, catalog permissions, etc.)
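The SQL fluency called for above (complex transformations and window functions) can be illustrated with a small, self-contained example using Python's built-in `sqlite3`. The schema and data are invented for illustration only:

```python
import sqlite3

# In-memory database with a toy orders table (hypothetical schema,
# not the company's actual data model).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', 10), ('alice', 30), ('bob', 20), ('bob', 5);
""")

# Per-customer running total via a window function:
# SUM(...) OVER (PARTITION BY ... ORDER BY ...)
rows = conn.execute("""
SELECT customer, amount,
       SUM(amount) OVER (PARTITION BY customer ORDER BY amount) AS running
FROM orders
ORDER BY customer, amount
""").fetchall()

for row in rows:
    print(row)
# ('alice', 10.0, 10.0)
# ('alice', 30.0, 40.0)
# ('bob', 5.0, 5.0)
# ('bob', 20.0, 25.0)
```

The same `PARTITION BY` / `ORDER BY` pattern carries over directly to warehouse SQL dialects such as Databricks SQL or Snowflake.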
Benefits
Comp & perks
- We cover 95%–100% of health insurance premiums for you and your family
- Remote or hybrid work arrangements
- Unlimited PTO
- 401(k) matching
- Formalized growth opportunities
- Learning & development
- DEI programs & events
- So much more.
ATS Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
SQL, Python, Data pipeline development, Data testing, Data modeling, Data theory, Data security practices, Terraform, GoLang, Automated unit tests
Soft Skills
Collaboration, Communication, Data democratization, Problem-solving, Prioritization
