
Senior Data Engineer – Snowflake, Azure
Streamline
Full-time
Location Type: Remote
Location: United States
About the role
- Design, develop, and deploy Azure Functions and broader Azure data services to extract, transform, and load data into Snowflake data models and marts.
- Implement automated data quality checks, monitoring, and alerting to ensure accuracy, completeness, and timeliness across all pipelines.
- Optimize workloads to reduce cloud hosting costs, including right-sizing compute, tuning queries, and leveraging efficient storage and caching patterns.
- Build and maintain ELT/ETL workflows and orchestration to integrate multiple internal and external data sources at scale.
- Design data pipelines that support both near real-time streaming data ingestion and scheduled batch processing to meet diverse business requirements.
- Collaborate with engineering and product teams to translate requirements into robust, secure, and highly available data solutions.
Requirements
- Strong expertise with Azure data stack (e.g., Azure Functions, Azure Data Factory, Event/Service Bus, storage) and Snowflake for analytical workloads.
- Proven experience designing and operating production data pipelines, including CI/CD, observability, and incident response for data systems.
- Advanced SQL and performance tuning skills, with experience optimizing transformations and Snowflake queries for cost and speed.
- Solid programming experience in Python or similar for building reusable ETL components, libraries, and automation.
- Experience with streaming and batch ingestion patterns (e.g., Kafka, Spark, Databricks) feeding Snowflake.
- Familiarity with BI and analytics tools (e.g., Power BI, Grafana) consuming Snowflake data models.
- Background in DevOps practices, including containerization, CI/CD pipelines, and infrastructure-as-code for data platforms.
- Experience with modern data transformation tools (e.g., dbt) and data observability platforms for monitoring data quality, lineage, and pipeline health.
- Ability to adapt to a fast-paced and dynamic work environment.
- Self-motivated and able to work independently with minimal supervision, taking initiative to drive projects forward.
- Expert-level problem-solving skills with the ability to diagnose complex data pipeline issues and architect innovative solutions.
- Proven ability to integrate and analyze disparate datasets from multiple sources to deliver high-value insights and drive business impact.
- Strong attention to detail and a proven ability to manage multiple priorities and deadlines.
- Passionate about staying current with emerging data engineering technologies and best practices, driving innovation to enhance product capabilities and maintain competitive advantage.
- Experience developing and architecting SaaS platforms with a focus on scalability, multi-tenancy, and cloud-native design patterns.
Benefits
- A challenging and rewarding role in a dynamic and international environment.
- Opportunity to be part of a growing company with a strong commitment to innovation and excellence.
- A supportive and collaborative team culture that values personal growth and development.
- Competitive compensation and benefits package.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Azure Functions, Azure Data Factory, Snowflake, SQL, Python, Kafka, Spark, Databricks, dbt, CI/CD
Soft Skills
problem-solving, self-motivated, adaptability, initiative, attention to detail, time management, collaboration, innovation, independence, communication