Streamline

Senior Data Engineer – Snowflake


Full-time


Location Type: Remote

Location: Remote • 🇺🇸 United States


Job Level

Senior

Tech Stack

Azure • Cloud • ETL • Grafana • Kafka • Python • Spark • SQL

About the role

  • Design, develop, and deploy Azure Functions and broader Azure data services to extract, transform, and load data into Snowflake data models and marts.
  • Implement automated data quality checks, monitoring, and alerting to ensure accuracy, completeness, and timeliness across all pipelines.
  • Optimize workloads to reduce cloud hosting costs, including right-sizing compute, tuning queries, and leveraging efficient storage and caching patterns.
  • Build and maintain ELT/ETL workflows and orchestration to integrate multiple internal and external data sources at scale.
  • Design data pipelines that support both near real-time streaming data ingestion and scheduled batch processing to meet diverse business requirements.
  • Collaborate with engineering and product teams to translate requirements into robust, secure, and highly available data solutions.
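To make the data-quality responsibility above concrete, here is a minimal, dependency-free sketch of the kind of automated check a pipeline (for example, an Azure Function) might run on a batch before loading it into Snowflake. The function name, field names, and staleness window are illustrative assumptions, not part of Streamline's actual codebase.

```python
from datetime import datetime, timedelta, timezone

def check_batch_quality(rows, required_fields, max_age=timedelta(hours=1)):
    """Run basic completeness and timeliness checks on a batch of records.

    Returns a list of human-readable issue strings; an empty list means
    the batch passes and can be loaded downstream (e.g. into Snowflake).
    """
    issues = []
    now = datetime.now(timezone.utc)
    for i, row in enumerate(rows):
        # Completeness: every required field must be present and non-null.
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            issues.append(f"row {i}: missing {missing}")
        # Timeliness: flag events older than the allowed staleness window.
        ts = row.get("event_ts")
        if ts is not None and now - ts > max_age:
            issues.append(f"row {i}: stale event_ts {ts.isoformat()}")
    return issues
```

In a production pipeline, a non-empty result would typically feed the monitoring and alerting layer (e.g. metrics scraped by Grafana) rather than being inspected by hand.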

Requirements

  • Strong expertise with Azure data stack (e.g., Azure Functions, Azure Data Factory, Event/Service Bus, storage) and Snowflake for analytical workloads.
  • Proven experience designing and operating production data pipelines, including CI/CD, observability, and incident response for data systems.
  • Advanced SQL and performance tuning skills, with experience optimizing transformations and Snowflake queries for cost and speed.
  • Solid programming experience in Python or similar for building reusable ETL components, libraries, and automation.
  • Experience with streaming and batch ingestion patterns (e.g., Kafka, Spark, Databricks) feeding Snowflake.
  • Familiarity with BI and analytics tools (e.g., Power BI, Grafana) consuming Snowflake data models.
  • Background in DevOps practices, including containerization, CI/CD pipelines, and infrastructure-as-code for data platforms.
  • Experience with modern data transformation tools (e.g., dbt) and data observability platforms for monitoring data quality, lineage, and pipeline health.

Benefits

  • A challenging and rewarding role in a dynamic and international environment.
  • Opportunity to be part of a growing company with a strong commitment to innovation and excellence.
  • A supportive and collaborative team culture that values personal growth and development.
  • Competitive compensation and benefits package.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Azure Functions • Azure Data Factory • Snowflake • SQL • Python • Kafka • Spark • Databricks • dbt • data quality checks
Soft skills
collaboration • problem-solving • communication • incident response • observability