
Databricks Platform Engineer
Precision Solutions
full-time
Location Type: Hybrid
Location: Atlanta • United States
About the role
- Drive the next generation of our analytics products by architecting and delivering end-to-end cloud analytics solutions
- Partner with product managers, data scientists, software engineers, ML engineers and business analysts to enable and support a platform that powers personalized experiences and operational efficiency
- Leverage automated model CI/CD, containerized deployments (Docker/Kubernetes), and real-time monitoring to ensure our ML systems are reliable, secure, and scalable
- Design, implement, and maintain Databricks workspaces and their associated Azure infrastructure
- Manage and optimize data and machine learning workloads, ensuring a reliable, secure, and scalable platform for business analysts, data engineering and data science teams
- Lead platform administration and governance, maintaining the security posture of the enterprise and its key stakeholders
- Design and implement automated solutions to streamline account and workspace administration
- Collaborate with cross-functional teams to understand business needs and translate them into ML solutions
- Mentor junior engineers and provide technical leadership in Databricks best practices and techniques
- Conduct cost and performance analysis of the compute resources and data assets within the platform
- Stay current with the latest advancements in Databricks SQL, Machine Learning and AI, and integrate new technologies and methods as applicable
- Engage in knowledge sharing and contribute to building a strong, collaborative team environment
Requirements
- 5+ years of experience with Databricks Account and Workspace Administration, including governance, configuration, and security management
- Experience with optimizing Databricks environments for cost efficiency, performance, and reliability
- Experience with developing and maintaining Databricks data pipelines using Autoloader, Delta Live Tables, and Databricks Workflows
- Experience with implementing MLOps solutions using MLflow 2.0/3.0, Serving Endpoints, and Model Registry in production environments
- Experience with Databricks SQL tools such as Queries and AI/BI Dashboards
- Experience with large datasets and distributed computing across cloud platforms (AWS, GCP, Azure), including Azure services like Entra ID, Data Factory, Key Vault, and Purview
- Knowledge of Unity Catalog Governance Architecture and enterprise data security best practices
- Strong communication and documentation skills, with the ability to clearly convey complex technical concepts to both technical and non-technical audiences
- Ability to obtain and maintain a Public Trust or Suitability/Fitness determination based on client requirements
- Ability to work on-site in Atlanta, Georgia
- Master’s degree
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Databricks Account Administration, Databricks Workspace Administration, MLOps, MLflow 2.0, Databricks SQL, Data Pipelines, Autoloader, Delta Live Tables, Containerization, CI/CD
Soft Skills
communication skills, documentation skills, technical leadership, mentoring, collaboration, problem-solving, analytical skills, cost analysis, performance analysis, knowledge sharing
Certifications
Master’s degree, Public Trust