
Senior Data Engineer – SQL Migrations
Auxo Commercial
Full-time
Location Type: Hybrid
Location: New York City, New York, United States
Salary
💰 $140,000 - $160,000 per year
About the role
- Architect and lead enterprise-scale SQL Server 2012 to Azure SQL/Fabric Lakehouse migration initiatives.
- Design comprehensive migration strategies including phased approaches, cutover plans, rollback procedures, and zero-downtime deployment patterns.
- Own the design and optimization of complex ETL/ELT processes that transform legacy relational schemas into modern lakehouse data models.
- Architect and develop Lakehouse-native data pipelines using Azure Data Factory, Databricks, and Microsoft Fabric.
- Design and own APIs and composable data services built on Fabric Lakehouse.
- Architect data governance frameworks, data quality systems, and metadata management practices for migrated data environments.
- Lead technical assessments of legacy SQL Server 2012 environments and establish best practices for Lakehouse architecture, data modeling, and security.
- Mentor junior engineers on data migration methodologies, Azure platform services, and data engineering patterns.
Requirements
- Bachelor's Degree in Computer Science or related field; Azure Cloud Certifications strongly preferred.
- At least 6 years of Data Engineering experience, with 2+ years specifically architecting and leading SQL Server migrations to cloud platforms.
- Deep expertise in SQL Server 2012 architecture, performance tuning, schema design, and complex migration scenarios (dependency analysis, ETL refactoring, compatibility resolution).
- Proven experience architecting Lakehouse solutions and migrating normalized SQL schemas into lakehouse data models.
- Strong proficiency in Azure Data Factory, Synapse Analytics, Azure SQL, Data Lake Storage, and Microsoft Fabric.
- Demonstrated expertise designing and building APIs/data services on top of Lakehouse/Delta Lake.
- Advanced SQL and T-SQL optimization skills; proficiency in Python for data transformation and automation.
- Expertise in dbt for dimensional modeling, data lineage, and transformation documentation.
- Experience architecting Azure Infrastructure-as-Code solutions (Bicep, ARM templates, Terraform).
- Strong experience building and optimizing CI/CD pipelines for data infrastructure and data applications.
- Deep knowledge of data governance frameworks, data quality patterns, security (RBAC, row-level security), and metadata management.
- Experience with agile delivery on complex, multi-team external client projects.
- Strong communication and presentation skills; ability to influence technical and business stakeholders.
Benefits
- Medical, Dental, and Vision Insurance.
- Life, Short-Term Disability, and Long-Term Disability Insurance.
- Accrued time off (25 days/year).
- Paid holidays.
- Annual target bonus.
- Company-sponsored 401(k) plan.
- Monthly wellness/tech stipends.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
SQL Server 2012, ETL, ELT, Azure Data Factory, Databricks, Microsoft Fabric, APIs, Python, dbt, Infrastructure-as-Code
Soft skills
mentoring, communication, presentation, influencing stakeholders
Certifications
Bachelor's Degree in Computer Science, Azure Cloud Certifications