Invictus Capital Partners

Data Engineer – Azure, Databricks

Full-time

Location Type: Hybrid

Location: Bloomington, Minnesota, United States

Salary

$130,000 - $150,000 per year

Job Level

Mid-Level, Senior

Tech Stack

Azure, PySpark, Spark, SQL, Unity Catalog

About the role

  • Design, develop, and optimize data pipelines in Azure Databricks using PySpark and SQL, applying Delta Lake and Unity Catalog best practices.
  • Build modular, reusable libraries and utilities within Databricks to accelerate development and standardize workflows.
  • Implement Medallion architecture (Bronze, Silver, Gold layers) for scalable, governed data zones; a brief PySpark sketch follows this list.
  • Integrate external data sources via REST APIs, SFTP file delivery, and SQL Server Managed Instance, implementing validation, logging, and schema enforcement.
  • Utilize parameter-driven jobs and manage compute using Spark clusters and Databricks serverless.
  • Collaborate with data analytics teams and business stakeholders to understand requirements and deliver analytics-ready datasets.
  • Monitor and troubleshoot Azure Data Factory (ADF) pipelines (jobs, triggers, activities, data flows) to identify and resolve job failures and data issues.
  • Automate deployments and manage code using Azure DevOps for CI/CD, version control, and environment management.
  • Contribute to documentation, architectural design, and continuous improvement of data engineering best practices.
  • Support the design and readiness of the data platform for AI and machine learning initiatives.
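
A brief illustration of the Medallion and Delta Lake responsibilities above (see the Bronze/Silver/Gold bullet): the PySpark sketch below promotes a Bronze table to Silver with light validation and type standardization before appending to a governed Delta table. It is a minimal sketch, not Invictus Capital Partners' code; the Unity Catalog names (`lakehouse.bronze.trades`, `lakehouse.silver.trades`) and columns (`trade_id`, `trade_date`, `notional`) are assumed placeholders, and on Databricks the `spark` session is already provided.

```python
# Minimal Bronze -> Silver promotion sketch; all table and column names
# are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

# On Databricks `spark` already exists; getOrCreate() simply reuses it.
spark = SparkSession.builder.getOrCreate()

BRONZE_TABLE = "lakehouse.bronze.trades"   # assumed Unity Catalog three-level name
SILVER_TABLE = "lakehouse.silver.trades"   # assumed Unity Catalog three-level name


def promote_bronze_to_silver(bronze_table: str = BRONZE_TABLE,
                             silver_table: str = SILVER_TABLE) -> None:
    """Read raw Bronze rows, apply light validation and standardization,
    and append the result to a governed Silver Delta table."""
    bronze = spark.read.table(bronze_table)

    silver = (
        bronze
        # basic quality enforcement: drop rows missing the business key
        .filter(F.col("trade_id").isNotNull())
        # standardize types for downstream consumers
        .withColumn("trade_date", F.to_date("trade_date"))
        .withColumn("notional", F.col("notional").cast("decimal(18,2)"))
        .dropDuplicates(["trade_id"])
        # simple audit column
        .withColumn("_ingested_at", F.current_timestamp())
    )

    # Delta is the default table format on Databricks; Unity Catalog governs access.
    silver.write.mode("append").saveAsTable(silver_table)


promote_bronze_to_silver()
```

A Gold step would typically follow the same pattern, aggregating the Silver table into analytics-ready datasets registered in Unity Catalog.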

Requirements

  • Bachelor’s degree in Computer Science, Data Engineering, or Information Systems.
  • 5+ years of hands-on data-engineering experience in Azure-centric environments.
  • Expertise with Azure Databricks, PySpark, Delta Lake, and Unity Catalog.
  • Strong SQL skills with experience in Azure SQL Database or SQL Server Managed Instance.
  • Proficiency in Azure Data Factory for troubleshooting and operational support.
  • Experience integrating external data using REST APIs and SFTP.
  • Working knowledge of Azure DevOps for CI/CD, version control, and parameterized deployments.
  • Ability to build and maintain reusable Databricks libraries, utility notebooks, and parameterized jobs (a brief sketch follows this list).
  • Proven track record partnering with data analytics teams and business stakeholders.
  • Excellent communication, problem-solving, and collaboration skills.
  • Interest or experience in AI and machine learning data preparation.
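
As noted in the parameterized-jobs requirement above, the sketch below shows a parameter-driven Databricks notebook task calling a small reusable utility. It assumes execution inside a Databricks notebook, where `spark` and `dbutils` are provided; the widget names, catalog, schemas, and tables (`lakehouse_dev`, `silver.trades`, `gold.daily_trades`) are hypothetical placeholders rather than the team's actual tooling.

```python
# Sketch of a parameter-driven Databricks notebook task calling a reusable
# utility; widget names, catalog, and tables are hypothetical placeholders.
from pyspark.sql import functions as F

# Declare widgets so the notebook runs interactively and as a scheduled job;
# a Databricks job definition would pass these values per environment.
dbutils.widgets.text("catalog", "lakehouse_dev")
dbutils.widgets.text("process_date", "2024-01-01")

catalog = dbutils.widgets.get("catalog")
process_date = dbutils.widgets.get("process_date")


def filter_to_process_date(df, date_col: str, run_date: str):
    """Reusable utility: restrict a DataFrame to a single processing date.
    Packaging helpers like this in a shared library keeps job notebooks thin."""
    return df.filter(F.col(date_col) == F.to_date(F.lit(run_date)))


daily = filter_to_process_date(
    spark.read.table(f"{catalog}.silver.trades"),
    date_col="trade_date",
    run_date=process_date,
)
daily.write.mode("overwrite").saveAsTable(f"{catalog}.gold.daily_trades")
```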

Benefits

  • Great compensation package
  • Attractive benefits plans and paid time off
  • 401(k) w/ company matching
  • Professional learning and development opportunities
  • Tuition Reimbursement
  • And much more!

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
PySpark, SQL, Delta Lake, Unity Catalog, Azure Databricks, Azure Data Factory, REST APIs, SFTP, CI/CD, data engineering
Soft skills
communication, problem-solving, collaboration
Certifications
Bachelor’s degree in Computer Science, Bachelor’s degree in Data Engineering, Bachelor’s degree in Information Systems