Senior Data Engineer

Lean Tech

full-time

Location: 🇺🇸 United States

Job Level

Senior

Tech Stack

AWS, Azure, Cloud, Docker, ETL, Google Cloud Platform, Hadoop, Kubernetes, Python, Spark, SQL, Tableau

About the role

  • Design, build, and optimize advanced ETL/ELT pipelines and data integration workflows across the Microsoft Azure ecosystem
  • Leverage Azure Data Factory, Microsoft Fabric, Azure Data Lake, and SQL Server to modernize, migrate, and maintain data systems
  • Implement and maintain a medallion architecture (bronze/silver/gold) to structure data for downstream analytics (see the sketch after this list)
  • Establish data quality assurance processes: validate, monitor, troubleshoot, and performance-tune data pipelines
  • Integrate and manage REST APIs and data flows within the Microsoft environment
  • Contribute to pipeline orchestration and monitoring using ADF, Azure Databricks, Azure Functions, and Logic Apps
  • Develop and use advanced SQL and Python (including the Azure SDK for Python) to process and automate data workflows
  • Document data models, pipeline architecture, and integrations to support transparency and scalability
  • Contribute to ongoing data platform re-architecture and modernization initiatives
  • Collaborate in Agile teams with Data Scientists, BI teams, and stakeholders; participate in sprint planning and daily stand-ups
  • Implement systems and best practices to monitor data quality, ensure data integrity, and manage data security
  • Automate data collection, processing, and reporting to improve operational efficiency
  • Stay current with evolving Azure data and analytics technologies and industry best practices
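
For context on the medallion and Databricks bullets above, the following is a minimal, hypothetical sketch of a bronze-to-silver step in PySpark, as it might run on Azure Databricks. The storage account, container names, and column names are illustrative placeholders, not details from this posting.

from pyspark.sql import SparkSession, functions as F

# Illustrative only: paths and column names below are placeholders.
spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Read raw landed data from the bronze layer of the lake.
bronze = spark.read.json("abfss://bronze@<storage-account>.dfs.core.windows.net/orders/")

# Cleanse and conform: deduplicate, enforce types, drop obviously bad rows.
silver = (
    bronze.dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .filter(F.col("order_id").isNotNull())
)

# Persist the curated result to the silver layer for downstream gold aggregates.
silver.write.format("delta").mode("overwrite").save(
    "abfss://silver@<storage-account>.dfs.core.windows.net/orders/"
)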

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or a related field
  • 5+ years of experience in data engineering or backend systems with a strong data focus
  • Advanced expertise in designing, building, and optimizing ETL/ELT pipelines for data integration
  • Strong proficiency in SQL and Python for data processing and automation; experience with the Azure SDK for Python (see the sketch after this list)
  • Hands-on experience with the Microsoft Azure ecosystem: Microsoft Fabric, Azure Data Factory, Azure Data Lake, and SQL Server
  • Practical experience with data pipeline orchestration, monitoring, and data quality assurance using Azure Data Factory, Azure Databricks, Azure Functions, and Logic Apps
  • Working familiarity with medallion architecture (bronze/silver/gold) and modern data warehouse practices
  • Applied knowledge of API integration, specifically REST APIs, within the Microsoft ecosystem
  • Experience documenting data models, pipelines, and integrations
  • Proficiency in version control (Git) and Agile tools (Jira, Confluence); experience participating in Agile ceremonies
  • Strong analytical and problem-solving skills, including performance tuning and troubleshooting of large-scale data systems
  • Experience implementing and maintaining data infrastructure and data quality monitoring, including ensuring data integrity and security
  • Experience automating data collection, processing, and reporting tasks to improve operational efficiency
  • Nice to have: familiarity with Azure Machine Learning, Spark/Hadoop, AWS/GCP, Docker/Kubernetes, DevOps practices, and Power BI/Tableau; the Microsoft Certified: Azure Data Engineer Associate certification; mentoring experience
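
To ground the Azure SDK for Python requirement above, here is a minimal, hypothetical sketch of landing a local extract in Azure Data Lake Storage Gen2 using the azure-identity and azure-storage-file-datalake packages. The account name, container, and file paths are placeholders, not details from this posting.

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Authenticate with whatever credential is available (managed identity, az login, etc.).
credential = DefaultAzureCredential()
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=credential,
)

# Upload a local extract into the bronze container of the lake (placeholder paths).
file_system = service.get_file_system_client("bronze")
file_client = file_system.get_file_client("orders/2024-01-01/orders.json")
with open("orders.json", "rb") as data:
    file_client.upload_data(data, overwrite=True)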