qode.world

Senior Databricks Architect

Employment Type: Full-time

Location Type: Hybrid

Location: Texas, United States

About the role

  • Architect and implement end-to-end data solutions using Databricks Lakehouse Platform
  • Design and develop scalable data pipelines using PySpark, Spark, and SQL
  • Lead cloud data platform implementations on AWS/Azure/GCP
  • Drive modernization of legacy data systems to cloud-native architectures
  • Implement Medallion Architecture (Bronze, Silver, Gold layers)
  • Ensure data governance, security, and compliance (especially for Financial Services)
  • Collaborate with business stakeholders, data scientists, and engineering teams
  • Optimize performance, cost, and reliability of data pipelines
  • Mentor and guide data engineering teams and provide architectural leadership
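The Medallion Architecture mentioned above can be sketched in miniature. This is a toy illustration in plain Python (a real Databricks pipeline would use PySpark DataFrames and Delta tables); the record fields and layer functions are illustrative assumptions, not part of the role description.

```python
# Toy sketch of Medallion Architecture layers (Bronze -> Silver -> Gold).
# Field names and values are hypothetical examples.

raw_events = [  # Bronze: raw ingested records, kept as-is (including bad rows)
    {"account": "A-1", "amount": "120.50", "currency": "USD"},
    {"account": "A-2", "amount": "oops",   "currency": "USD"},
    {"account": "A-1", "amount": "79.50",  "currency": "USD"},
]

def to_silver(rows):
    """Silver: validate and conform types, dropping unparseable rows."""
    silver = []
    for r in rows:
        try:
            silver.append({"account": r["account"], "amount": float(r["amount"])})
        except ValueError:
            continue  # quarantine / bad-record handling omitted for brevity
    return silver

def to_gold(rows):
    """Gold: business-level aggregate, e.g. total amount per account."""
    totals = {}
    for r in rows:
        totals[r["account"]] = totals.get(r["account"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(raw_events))
print(gold)  # {'A-1': 200.0}  (the malformed A-2 row is dropped at Silver)
```

In a production Lakehouse each layer would be a Delta table, with the Silver step handled by schema enforcement and expectations rather than hand-rolled parsing.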

Requirements

  • 12+ years of experience in Data Engineering / Data Architecture
  • Strong expertise in Databricks, PySpark, and Spark ecosystem
  • Hands-on experience with cloud platforms (Azure, AWS, or GCP)
  • Experience with data orchestration tools (Airflow, ADF, etc.)
  • Solid understanding of data modeling, ETL/ELT, and distributed systems
  • Experience with Delta Lake and Unity Catalog is a plus
  • Strong knowledge of Financial Services domain (Banking, Insurance, Capital Markets)
  • Excellent problem-solving and stakeholder management skills

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Databricks, PySpark, Spark, SQL, data pipelines, data modeling, ETL, ELT, Delta Lake, Unity Catalog
Soft Skills
problem-solving, stakeholder management, mentoring, leadership