
Data Engineer
Career Opportunities International
Employment Type: Full-time
Location Type: Remote
Location: Remote • Vermont • 🇺🇸 United States
Job Level
Mid-Level, Senior
Tech Stack
Airflow, Cloud, Python, SQL
About the role
- Architect, build, and maintain next-generation data pipelines
- Design and build robust, scalable ELT pipelines to ingest data into Snowflake
- Own the dbt project structure, developing complex SQL-based data models
- Manage the Snowflake environment for cost-efficiency and performance
- Champion data integrity and implement observability tools
- Mentor junior engineers and establish best practices for SQL and version control
Requirements
- Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related technical field (or equivalent practical experience)
- 3+ years of professional experience in data engineering and database development with medical and prescription claims data
- 2+ years of hands-on experience specifically with Snowflake (architecture, Snowpipe, Streams/Tasks, and security)
- 1+ years of production experience with dbt (developing packages, macros, and incremental models)
- SQL Mastery: Expert-level SQL skills with the ability to write complex, highly optimized queries
- Programming: Proficiency in Python for scripting, custom connectors, or orchestration tasks
- Orchestration: Experience with workflow orchestration tools (e.g., Airflow, Fivetran, dbt, or dbt Cloud)
- Version Control: Strong familiarity with Git flows and CI/CD pipelines for data (e.g., GitHub Actions, GitLab CI)
Benefits
- Health insurance
- Retirement plans
- Flexible work arrangements
- Professional development
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data pipelines, ELT pipelines, SQL, dbt, Snowflake, Python, workflow orchestration, Git, CI/CD
Soft skills
mentoring, best practices, data integrity, observability