Tech Stack
Airflow, Apache, AWS, Azure, Cloud, Google Cloud Platform, Python, Spark, SQL, Terraform, Unity
About the role
- Lead the technical direction of data engineering projects, including architectural design, technology selection, and standards definition
- Mentor and guide engineers, fostering a high-performance, collaborative culture
- Ensure adherence to best practices in data engineering, governance, and security
- Design and build streaming and batch data pipelines using Spark, Delta Lake, and modern frameworks
- Develop and optimize workflows for data ingestion, transformation, and curation across Databricks, Fabric, Snowflake, AWS, or GCP
- Leverage Microsoft Fabric capabilities such as OneLake, Lakehouses, Power BI integration, and governance tooling to build end-to-end data solutions
- Define and manage logical and physical data models to support analytics, AI, and business processes
- Oversee orchestration and automation using tools like Lakeflow, Airflow, dbt, or equivalent
- Establish and enforce data governance, lineage, and security best practices across cloud environments
- Collaborate with analysts, data scientists, and business stakeholders to ensure alignment of solutions with business needs
- Continuously improve systems for performance, reliability, and cost efficiency
Requirements
- Advanced knowledge of Apache Spark, Delta Lake, Unity Catalog, and streaming technologies
- Strong programming skills in Python and SQL
- Solid background in data modeling, governance, and security
- Proven ability to lead engineering teams and make architectural decisions at scale
- Experience in solution design, platform modernization, and cloud architecture
- Hands-on experience with one or more of Databricks, Fabric, Snowflake, AWS, or GCP
- Experience with Azure data services (Data Lake, Synapse, Data Factory, etc.) is considered highly relevant and transferable to Fabric
- Strong leadership, communication, and stakeholder management skills
- Ability to mentor, coach, and inspire team members
- Strategic mindset with focus on scalability, innovation, and efficiency
- Experience with data architecture frameworks and enterprise design patterns (nice to have)
- Familiarity with CI/CD pipelines and infrastructure-as-code (Terraform, ARM, CloudFormation) (nice to have)
- Exposure to machine learning pipelines and AI-driven solutions (nice to have)
Benefits
- Medical insurance
- Sports reimbursement budget
- Home office support
- A set number of free psychological and legal consultations
- Maternity and paternity leave support
- Internal workshops and learning initiatives
- English language classes compensation
- Professional certifications reimbursement
- Participation in professional local and global communities
- Growth Framework to manage expectations and define the steps toward your chosen career path
- Mentoring program in which you can become a mentor or a mentee to grow into a more senior position
- Valtech Ukraine has a system of progressive benefits packages: the longer you stay with the company, the more benefits you receive.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Apache Spark, Delta Lake, Python, SQL, data modeling, data governance, data security, cloud architecture, data ingestion, data transformation
Soft skills
leadership, communication, stakeholder management, mentoring, coaching, collaboration, strategic mindset, innovation, efficiency, high-performance culture