Verinext

Data Consultant, Databricks

Contract

Location Type: Remote

Location: United States

About the role

  • Lead the development of scalable data pipelines within the Databricks ecosystem
  • Architect robust ETL/ELT processes using a "configuration-as-code" approach
  • Migrate data ingestion and transformation workloads to Databricks using Lakeflow declarative pipelines

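The "configuration-as-code" approach mentioned above can be sketched in plain Python. This is an illustrative pattern only, not the team's actual framework: pipeline steps are declared as data, and a small runner interprets that declaration, so pipeline changes become reviewable config edits rather than ad-hoc scripts. All names here (`TRANSFORMS`, `PIPELINE_CONFIG`, `run_pipeline`) are hypothetical.

```python
# Registry of reusable transforms; in a real Databricks pipeline these would
# typically be PySpark DataFrame operations rather than list comprehensions.
TRANSFORMS = {
    "drop_null": lambda rows, col: [r for r in rows if r.get(col) is not None],
    "uppercase": lambda rows, col: [{**r, col: r[col].upper()} for r in rows],
}

# Hypothetical declarative pipeline definition. On Databricks, the same idea
# is expressed with Lakeflow declarative pipelines or Databricks Asset Bundles.
PIPELINE_CONFIG = [
    {"transform": "drop_null", "column": "name"},
    {"transform": "uppercase", "column": "name"},
]

def run_pipeline(rows, config):
    """Apply each configured transform to the rows, in declaration order."""
    for step in config:
        fn = TRANSFORMS[step["transform"]]
        rows = fn(rows, step["column"])
    return rows

data = [{"name": "alice"}, {"name": None}, {"name": "bob"}]
print(run_pipeline(data, PIPELINE_CONFIG))
# → [{'name': 'ALICE'}, {'name': 'BOB'}]
```

The design point is that the runner never changes when a new pipeline is defined; only the declaration does.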
Requirements

  • Strong hands-on experience with Databricks in production environments
  • Deep expertise in PySpark and advanced SQL
  • Experience with:
    - Delta Lake
    - ingestion pipelines (batch + streaming)
    - data transformation frameworks/patterns
  • Proven experience implementing:
    - CI/CD in Databricks
    - Databricks Asset Bundles (DABs)
    - declarative pipelines (Lakeflow)
  • Strong AWS infrastructure familiarity (S3, IAM, compute patterns)
  • Terraform experience specifically with Databricks + AWS resources
  • PowerShell scripting experience (an asset)

Benefits

  • Retirement Plan (401k, IRA)
  • Work From Home
  • Health Care Plan

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Databricks, PySpark, SQL, Delta Lake, ETL, ELT, CI/CD, Terraform, PowerShell, data transformation frameworks