
Data Engineer
Cybermedia Technologies, LLC (CTEC)
Full-time
Location Type: Remote
Location: United States
About the role
- Design, develop, and maintain scalable ETL pipelines and data workflows.
- Build, optimize, and maintain data processing solutions using Azure Databricks.
- Support phased data migration from legacy databases and ETL tools.
- Implement layered lakehouse data architectures in Databricks.
- Develop data processing notebooks, workflows, and distributed data transformations using Python and PySpark (a minimal sketch of this pattern follows this list).
- Develop data validation, reconciliation, and testing processes.
- Integrate Databricks data platforms with analytics and reporting tools.
- Support data governance initiatives including metadata management.
- Maintain source control and CI/CD pipelines for Databricks workflows.
- Provide ongoing support for Databricks workflows and troubleshoot complex issues.
- Provide guidance to junior data engineers and contribute to documentation and team enablement.
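For illustration, here is a minimal PySpark sketch of the bronze-to-silver lakehouse pattern this role describes, assuming a Databricks environment where Delta Lake is available; the paths, table names, and columns are hypothetical, not CTEC's actual platform:

```python
# Hypothetical bronze -> silver transformation in a Databricks notebook.
# Paths, table names, and columns are illustrative assumptions; the
# "delta" format presumes Delta Lake is available (as it is on Databricks).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Read raw ingested records from the bronze layer.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/orders")

# Conform the data: deduplicate, enforce types, drop records missing a key.
silver = (
    bronze.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
)

# Publish the conformed table to the silver layer for downstream gold models.
silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/orders")
```

In practice, a gold layer would then aggregate such silver tables into business-level models consumed by the analytics and reporting tools mentioned above.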
Requirements
- Seven or more (7–9+) years of experience in data engineering, ETL development, or large-scale data integration environments.
- Strong experience designing and developing ETL pipelines and data transformations in Azure Databricks environments.
- Strong proficiency in SQL and Python, with hands-on experience using PySpark for distributed data processing.
- Experience working with cloud-based data platforms, data lakes, and lakehouse environments, preferably on Microsoft Azure.
- Experience implementing layered lakehouse data architectures (bronze, silver, gold) for enterprise analytics.
- Familiarity with Spark-based big data processing frameworks.
- Experience supporting data migration from legacy databases and ETL tools to Databricks-based platforms (see the reconciliation sketch after this list).
- Experience integrating Databricks platforms with business intelligence and reporting tools such as Power BI.
- Familiarity with data governance, metadata management, and data security best practices.
- Experience with source control and CI/CD pipelines for data engineering and Databricks workflows.
- Working knowledge of SDLC and Agile delivery methodologies.
- Excellent organizational, communication, and collaboration skills.
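Likewise, a hedged sketch of the kind of validation and reconciliation check used when migrating a legacy table to Databricks; the table paths, column names, and tolerance are illustrative assumptions only:

```python
# Hypothetical migration reconciliation: compare a legacy extract against
# its migrated Databricks table. All names and the tolerance are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("migration_reconciliation").getOrCreate()

legacy = spark.read.format("delta").load("/mnt/lake/bronze/legacy_orders")
migrated = spark.read.format("delta").load("/mnt/lake/silver/orders")

# Row-count parity is the first sanity check after a phased migration cut-over.
legacy_rows, migrated_rows = legacy.count(), migrated.count()
assert legacy_rows == migrated_rows, (
    f"Row count mismatch: legacy={legacy_rows}, migrated={migrated_rows}"
)

# Column-level reconciliation: totals of a numeric measure should agree
# within a small tolerance (float cast guards against Decimal/None results).
legacy_total = float(legacy.agg(F.sum("amount")).first()[0] or 0)
migrated_total = float(migrated.agg(F.sum("amount")).first()[0] or 0)
assert abs(legacy_total - migrated_total) < 0.01, "Amount totals diverge"
```

Checks like these would typically run as part of the source-controlled CI/CD pipelines for Databricks workflows that the role maintains.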
Benefits
- Paid vacation & sick leave
- Health insurance coverage
- Career training
- Performance bonus programs
- 401(k) contribution & employer match
- 11 Federal Holidays
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
ETL pipelines, data workflows, Azure Databricks, Python, PySpark, SQL, data validation, data reconciliation, CI/CD pipelines, data transformations
Soft Skills
organizational skills, communication skills, collaboration skills