
Data Platform Engineer
White Cap
Full-time
Location Type: Remote
Location: Tennessee • United States
About the role
- Designing and implementing high-performance data pipelines, APIs, and integrations
- Building and maintaining batch and streaming data pipelines using Databricks (a minimal sketch follows this list)
- Developing and managing inbound/outbound data feeds via APIs, SFTP, pub/sub, or middleware platforms
- Building and optimizing data models in Postgres and synchronizing with analytical layers
- Collaborating with product, architecture, and InfoSec teams to ensure secure data movement
- Implementing data quality, observability, and governance standards
- Automating deployment and testing with CI/CD tools
- Refactoring existing data pipelines to modern, scalable approaches
- Creating build-vs-buy proposals
- Implementing “greenfield” solutions or integrating third-party applications
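
To give a flavor of the pipeline work described above, here is a minimal sketch of a streaming ingestion job in PySpark, as it might run on Databricks. The source path, schema, checkpoint location, and target table are hypothetical placeholders for illustration, not details from this posting.

```python
# Minimal sketch of a Databricks-style streaming ingestion job in PySpark.
# All paths, schemas, and table names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Read newline-delimited JSON files as they land (hypothetical source path).
raw = (
    spark.readStream
    .format("json")
    .schema("order_id STRING, amount DOUBLE, ts TIMESTAMP")
    .load("/mnt/landing/orders/")
)

# Light cleanup: drop rows missing a key and stamp the ingestion time.
clean = (
    raw.dropna(subset=["order_id"])
    .withColumn("ingested_at", F.current_timestamp())
)

# Append to a Delta table, with checkpointing so the stream can recover.
query = (
    clean.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders/")
    .outputMode("append")
    .toTable("analytics.orders_bronze")
)
query.awaitTermination()
```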
Requirements
- BS/BA in a related discipline
- 2-5 years of experience in a related field OR MS/MA with 2-4 years of experience
- Proficiency in Python or Scala
- Strong SQL skills
- Experience with Databricks or Spark-based data engineering
- Experience integrating APIs and building middleware connectors (see the connector sketch after this list)
- Solid understanding of Postgres or similar OLTP databases
- Familiarity with cloud environments (Azure preferred) and containerization (Docker/Kubernetes)
- Relevant certifications (e.g., Databricks Certified Data Engineer, Azure Data Engineer Associate)
- Experience working in Agile/Scrum environments
- Strong documentation and technical writing skills
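
As an illustration of the API-integration and Postgres requirements above, here is a hedged sketch of a small feed connector that pulls records from a REST endpoint and upserts them into Postgres. The endpoint URL, connection string, table, and column names are invented for the example.

```python
# Hypothetical sketch: pull records from a REST API and upsert into Postgres.
# The endpoint, DSN, and table below are placeholders, not posting details.
import requests
import psycopg2
from psycopg2.extras import execute_values

API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint
DSN = "dbname=app user=etl host=localhost"     # hypothetical connection string

def fetch_orders():
    """Fetch one page of orders; a real feed would add paging and retries."""
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()

def upsert_orders(rows):
    """Idempotent load: insert new rows, update existing ones by primary key."""
    sql = """
        INSERT INTO orders (order_id, amount, updated_at)
        VALUES %s
        ON CONFLICT (order_id) DO UPDATE
        SET amount = EXCLUDED.amount, updated_at = EXCLUDED.updated_at
    """
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        execute_values(cur, sql, [
            (r["order_id"], r["amount"], r["updated_at"]) for r in rows
        ])

if __name__ == "__main__":
    upsert_orders(fetch_orders())
```

The `ON CONFLICT ... DO UPDATE` upsert keeps the load idempotent, which matters for the data-quality and observability standards the role calls for: rerunning a feed should never duplicate rows.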
Benefits
- Health insurance
- 401(k) matching
- Paid time off
- Professional development opportunities
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Python, Scala, SQL, Databricks, Spark, Postgres, APIs, SFTP, CI/CD, data modeling
Soft Skills
collaboration, documentation, technical writing
Certifications
Databricks Certified Data Engineer, Azure Data Engineer Associate