Tech Stack
Airflow, Apache, AWS, Azure, Google Cloud Platform, Python, Spark, SQL
About the role
- Develop reliable batch and streaming data pipelines using Spark, Delta Lake, and modern frameworks.
- Implement ingestion, transformation, and curation workflows across Databricks, Fabric, Snowflake, AWS, or GCP.
- Design logical and physical models that support analytics, AI, and business processes.
- Use orchestration tools like Lakeflow, Airflow, dbt, or equivalent to manage end-to-end data processes.
- Apply best practices for data governance, lineage, and secure access.
- Partner with data scientists, analysts, and business stakeholders to deliver data solutions that meet real-world needs.
- Monitor and tune pipelines and infrastructure for scalability, reliability, and cost efficiency.
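The ingestion, transformation, and curation workflow described above is commonly organized as a medallion-style (bronze/silver/gold) pipeline. In production this would run as Spark jobs writing Delta tables; the sketch below uses plain Python only to show the shape of the workflow, and all record fields and function names are illustrative assumptions, not part of the role description:

```python
# Medallion-style pipeline sketch: bronze (raw) -> silver (cleaned) -> gold (curated).
# Plain Python stands in for Spark/Delta here; names and fields are illustrative.

RAW_EVENTS = [  # bronze: raw ingested records, possibly malformed
    {"user_id": "u1", "amount": "19.99", "country": "UA"},
    {"user_id": "u2", "amount": "bad", "country": "PL"},
    {"user_id": "u1", "amount": "5.01", "country": "UA"},
]

def to_silver(raw):
    """Clean and type-cast records, skipping rows that fail validation."""
    silver = []
    for rec in raw:
        try:
            silver.append({
                "user_id": rec["user_id"],
                "amount": float(rec["amount"]),
                "country": rec["country"],
            })
        except (KeyError, ValueError):
            continue  # in a real pipeline, malformed rows would be quarantined
    return silver

def to_gold(silver):
    """Curate: aggregate spend per user for analytics consumers."""
    totals = {}
    for rec in silver:
        totals[rec["user_id"]] = totals.get(rec["user_id"], 0.0) + rec["amount"]
    return totals

if __name__ == "__main__":
    print(to_gold(to_silver(RAW_EVENTS)))
```

Each stage reads only the output of the previous one, which is what makes such pipelines easy to orchestrate as separate tasks in Airflow, Lakeflow, or dbt.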
Requirements
- Strong hands-on experience with Apache Spark, Delta Lake, and streaming technologies
- Solid programming skills in Python and SQL
- Good understanding of data modeling, governance, and security best practices
- Experience in one or more of Databricks, Fabric, Snowflake, AWS, or GCP
- Knowledge of Azure data services is considered transferable to Fabric
- Problem-solving approach with a focus on scalability and efficiency
- Strong communication and collaboration skills in cross-functional teams
- Experience with Agile methodologies and consulting (a plus)
Benefits
- Medical insurance
- Sports reimbursement budget
- Home office support
- A number of free psychological and legal consultations
- Maternity and paternity leave support
- Internal workshops and learning initiatives
- Compensation for English language classes
- Reimbursement for professional certifications
- Participation in local and global professional communities
- A Growth Framework that sets expectations and defines the steps toward your chosen career path
- A mentoring program in which you can become a mentor or a mentee and grow into a more senior position
- Valtech Ukraine has a system of progressive benefits packages: the longer you stay with the company, the more benefits you receive.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Apache Spark, Delta Lake, Python, SQL, data modeling, data governance, data security, streaming technologies, Agile methodologies, data ingestion
Soft skills
problem-solving, communication, collaboration, cross-functional teamwork, focus on scalability, focus on efficiency