Auxo Commercial

SQL Migration Data Engineer

Full-time

Location Type: Hybrid

Location: New York City, New York, United States

Salary

$97,000 - $125,000 per year

About the role

  • Collaborate with the delivery team on a SQL Server 2012 to Azure SQL/Fabric Lakehouse migration, including assessment, planning, and execution.
  • Develop and optimize ETL/ELT processes to migrate legacy SQL Server 2012 databases to modern cloud data platforms, minimizing downtime and preventing data loss.
  • Design and build data pipelines using Azure Data Factory, Databricks, and Microsoft Fabric Lakehouse to transform monolithic databases into distributed Lakehouse architectures.
  • Develop APIs and data services on top of Microsoft Fabric Lakehouse to expose migrated data for downstream applications and stakeholders.
  • Collaborate with infrastructure and application teams to assess legacy SQL 2012 environments, identify technical debt, and plan phased migration approaches.
  • Develop infrastructure and automation required for optimal extraction, transformation, and loading of data from SQL Server 2012 and other legacy sources using SQL, dbt, Python, and Fabric technologies.
  • Define and document cloud solution architectures, migration roadmaps, and technical designs for data modernization initiatives.
  • Generate and document unit tests, performance benchmarks, and migration validation scripts (a minimal validation sketch follows this list).
  • Establish data quality frameworks and governance practices for migrated data assets in Lakehouse environments.
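
For illustration only: a minimal Python sketch of the kind of migration validation script this role describes, comparing row counts for a fixed table list between a SQL Server 2012 source and an Azure SQL target. The connection strings, database, and table names are hypothetical placeholders, not details from this posting.

```python
"""Hypothetical migration validation sketch: row-count comparison
between a legacy SQL Server 2012 source and an Azure SQL target."""
import pyodbc

# Placeholder connection strings; real values would come from config/secret storage.
SOURCE_DSN = ("DRIVER={ODBC Driver 17 for SQL Server};"
              "SERVER=legacy-sql2012;DATABASE=SalesDB;Trusted_Connection=yes")
TARGET_DSN = ("DRIVER={ODBC Driver 17 for SQL Server};"
              "SERVER=example.database.windows.net;DATABASE=SalesDB;"
              "UID=migrator;PWD=<secret>")

# Illustrative allowlist of tables to check (never interpolate untrusted input).
TABLES = ["dbo.Customers", "dbo.Orders", "dbo.OrderLines"]


def row_count(conn: pyodbc.Connection, table: str) -> int:
    """Return COUNT(*) for one table on the given connection."""
    cursor = conn.cursor()
    try:
        cursor.execute(f"SELECT COUNT(*) FROM {table}")
        return cursor.fetchone()[0]
    finally:
        cursor.close()


def main() -> None:
    source = pyodbc.connect(SOURCE_DSN)
    target = pyodbc.connect(TARGET_DSN)
    try:
        for table in TABLES:
            src_n = row_count(source, table)
            tgt_n = row_count(target, table)
            status = "OK" if src_n == tgt_n else "MISMATCH"
            print(f"{table}: source={src_n} target={tgt_n} [{status}]")
    finally:
        source.close()
        target.close()


if __name__ == "__main__":
    main()
```

In practice a check like this would typically be extended with checksums or sampled value comparisons; row counts are only a common first gate.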

Requirements

  • Bachelor's Degree in Computer Science or related field.
  • Azure Cloud Certifications strongly preferred.
  • At least 3 years of Data Engineering experience, with 1+ years specifically in SQL Server migrations to cloud platforms.
  • Hands-on experience with SQL Server 2012 architecture, T-SQL optimization, and migration patterns (compatibility issues, index strategies, etc.).
  • Proficiency in Azure Data Factory, Synapse Analytics, Azure SQL, Data Lake Storage, and Microsoft Fabric (especially Lakehouse), including data modeling, partitioning, and optimization for analytical workloads.
  • Demonstrated experience building APIs or data services on top of Lakehouse/Delta Lake architectures.
  • Proficiency with dbt for transformation logic and data lineage documentation.
  • Strong command of Python, SQL, T-SQL, and scripting for automation and data validation (see the example sketch after this list).
  • Experience with Azure Infrastructure-as-Code (Bicep, ARM templates, Terraform).
  • Experience building CI/CD pipelines for data infrastructure.
  • Knowledge of data governance, metadata management, and data quality frameworks.
  • Ability to work independently in Agile environments with minimal supervision on external client projects.
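
Again for illustration only: a small pytest-style example of the automation and data-validation scripting referenced above. The legacy-to-cloud type mapping and helper function are hypothetical, not part of the role description.

```python
"""Hypothetical unit test for a legacy-type mapping helper used during migration."""
import pytest

# Hypothetical mapping from legacy SQL Server 2012 column types to Fabric/Delta types.
TYPE_MAP = {
    "datetime": "timestamp",
    "money": "decimal(19,4)",
    "nvarchar": "string",
    "bit": "boolean",
}


def map_legacy_type(sql_type: str) -> str:
    """Translate a legacy column type, failing loudly on unknown types."""
    key = sql_type.strip().lower()
    if key not in TYPE_MAP:
        raise ValueError(f"No mapping defined for legacy type: {sql_type}")
    return TYPE_MAP[key]


@pytest.mark.parametrize(
    "legacy,expected",
    [("DATETIME", "timestamp"), ("money", "decimal(19,4)"), ("nvarchar", "string")],
)
def test_known_types_are_mapped(legacy, expected):
    assert map_legacy_type(legacy) == expected


def test_unknown_type_raises():
    with pytest.raises(ValueError):
        map_legacy_type("geography")
```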

Benefits

  • Medical, Dental, and Vision Insurance.
  • Life, Short Term Disability, and Long Term Disability Insurance.
  • Accrued time off (25 days/year).
  • Paid holidays.
  • Annual target bonus.
  • Company sponsored 401(k) plan.
  • Monthly wellness/tech stipends.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
ETL, ELT, data pipelines, APIs, data services, SQL, dbt, Python, T-SQL, data modeling
Soft Skills
collaboration, independence, problem-solving, documentation, communication, organizational skills, attention to detail, adaptability, critical thinking, time management
Certifications
Azure Cloud Certifications