Tech Stack
Airflow, AWS, Cloud, Java, Kotlin, PySpark, Python, SQL, Terraform
About the role
- Develop a scalable, cloud-based data backbone that drives the company, using up-to-date data technologies
- Shape an AWS-based batch data processing and data streaming solution that ingests data from internal backend services and third parties
- Create a financial data warehouse that combines modern technologies with the features required for regulatory compliance
- Prepare and clean structured and unstructured data and develop high-quality data models for reporting, advanced analytics, and AI use cases
- Collaborate closely with data scientists, product and development colleagues to release product features
- Work within growing data teams and share expert knowledge about data best practices across the company
Requirements
- Excellent university degree in computer science, mathematics, natural sciences, or a similar field, plus relevant work experience
- Experience designing and operating data pipelines in AWS
- Excellent SQL skills, including advanced concepts such as window functions
- Experience with dbt is strongly desired
- Very good programming skills in Python, ideally including Airflow and data processing frameworks (e.g. PySpark)
- Knowledge of Java and Kotlin is a plus
- Experience with AWS services like S3, Athena, DMS, and Glue
- Experience using infrastructure-as-code tools such as Terraform
- Passion for everything-as-code and for writing well-architected, testable, documented code
- Data-driven and good with numbers, able to explain complex concepts simply
- Experience using agile frameworks like Scrum
- Understanding of data governance requirements for a regulated industry
- Interest in financial services and markets
- Fluent English communication and presentation skills
- Sense of humour and positive outlook on life