Tech Stack
Akka, Amazon Redshift, Android, Angular, ASP.NET, AWS, Azure, Cloud, ETL, GWT, Hibernate, iOS, Java, JavaScript, jQuery, Kafka, MongoDB, MySQL, .NET, Oracle, Python, Spring, SQL, SSIS, Terraform
About the role
- Design, develop, and maintain robust ETL/ELT processes that extract, transform, clean, pre-process, aggregate, and load data from diverse sources (an illustrative pipeline sketch follows this list)
- Develop, enhance, and maintain ETL solutions using SSIS and cloud-based data platforms
- Design, develop, and maintain scalable data solutions across legacy and modern cloud platforms
- Deliver across the full agile lifecycle: design, coding, testing, and deployment of iterative data solutions
- Collaborate with cross-functional teams to ensure data quality, integrity, availability, and auditability
- Perform Snowflake performance tuning and cloud data warehousing tasks
- Use Terraform (or similar) to provision and manage cloud infrastructure
- Support event-driven data flows using messaging/queuing systems and streaming platforms (see the consumer sketch after this list)
- Maintain documentation and communicate complex technical concepts to stakeholders
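As an illustration of the ETL/ELT work described above, the sketch below extracts a CSV file from S3, applies a simple cleaning step, and loads the rows into Snowflake. It is a minimal sketch, not the team's actual pipeline: the bucket, key, table, and credentials are placeholders, and the use of boto3 and snowflake-connector-python is an assumption based on the stack listed above (for large volumes, staging the file and using COPY INTO would be the more typical Snowflake pattern).

```python
import csv
import io

import boto3                 # AWS SDK for Python (assumed)
import snowflake.connector   # snowflake-connector-python (assumed)

# All names below are hypothetical placeholders.
BUCKET = "example-raw-bucket"
KEY = "exports/orders.csv"
TABLE = "ANALYTICS.PUBLIC.ORDERS"


def extract(bucket: str, key: str) -> list[dict]:
    """Extract: read a CSV object from S3 into a list of row dicts."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(body)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: trim whitespace and drop rows missing their primary key."""
    cleaned = []
    for row in rows:
        row = {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        if row.get("order_id"):
            cleaned.append(row)
    return cleaned


def load(rows: list[dict]) -> None:
    """Load: batch-insert the cleaned rows into a Snowflake table."""
    conn = snowflake.connector.connect(
        account="example_account",   # placeholder credentials
        user="example_user",
        password="example_password",
    )
    cur = conn.cursor()
    try:
        cur.executemany(
            f"INSERT INTO {TABLE} (order_id, amount) "
            "VALUES (%(order_id)s, %(amount)s)",
            rows,
        )
    finally:
        cur.close()
        conn.close()


if __name__ == "__main__":
    load(transform(extract(BUCKET, KEY)))
```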
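In the same spirit, the event-driven responsibility above might involve a long-polling SQS consumer like the one sketched here; the queue URL is a placeholder, `handle` is a hypothetical downstream function, and boto3 is again an assumption.

```python
import json

import boto3

# Placeholder queue URL for illustration only.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-events"


def handle(event: dict) -> None:
    """Hypothetical handler: route the event into the appropriate pipeline."""
    print(event)


def consume_forever() -> None:
    """Long-poll an SQS queue and delete messages only after processing."""
    sqs = boto3.client("sqs")
    while True:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=10,  # receive up to 10 messages per poll
            WaitTimeSeconds=20,      # long polling cuts down empty responses
        )
        for msg in resp.get("Messages", []):
            handle(json.loads(msg["Body"]))
            # Deleting after successful handling gives at-least-once semantics.
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])


if __name__ == "__main__":
    consume_forever()
```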
Requirements
- Minimum 10 years’ experience in data engineering
- Deep expertise in Snowflake, including cloud-based data warehousing, ELT/ETL pipeline development, and performance tuning
- Demonstrated experience building and maintaining ETL processes, ideally using SSIS and Snowflake
- Strong proficiency in Python for scripting, automation, and data pipeline development
- Hands-on experience with AWS (data lakes, Glue, S3, Redshift)
- Exposure to Azure data services
- Practical experience using Terraform or similar IaC tools
- Proficiency with JSON, XML, and other semi-structured data formats (a flattening example follows this list)
- Experience with messaging and queuing systems (SQS, SNS)
- Familiarity with streaming platforms (Kafka, Kinesis) is advantageous (a minimal producer sketch follows this list)
- Strong background in Git-based workflows and CI/CD pipelines
- Excellent verbal and written English communication skills
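On the semi-structured data requirement, a routine task is flattening nested JSON into tabular rows before loading. The sketch below uses only the Python standard library; the record shape is invented for illustration.

```python
import json


def flatten(obj: dict, prefix: str = "") -> dict:
    """Recursively flatten nested dicts into dotted column names."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{name}."))
        else:
            flat[name] = value
    return flat


raw = '{"order_id": 42, "customer": {"id": 7, "region": "EU"}, "amount": 19.99}'
print(flatten(json.loads(raw)))
# {'order_id': 42, 'customer.id': 7, 'customer.region': 'EU', 'amount': 19.99}
```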
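And for the streaming platforms noted as advantageous, publishing an event to Kafka can be as small as the sketch below; it assumes the kafka-python client, and the broker address and topic name are placeholders.

```python
import json

from kafka import KafkaProducer  # kafka-python client (assumed)

# Broker address and topic name are placeholders.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("orders", {"order_id": 42, "amount": 19.99})
producer.flush()  # block until buffered records are actually delivered
```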