Salary
💰 €25 - €35 per hour
Tech Stack
Amazon Redshift, AWS, Azure, Cloud, ETL, Google Cloud Platform, Kafka, PySpark, Python, SQL, Unity
About the role
- Design and develop Enterprise Data Warehouse solutions.
- Develop and maintain scalable data pipelines, and build out new API integrations to support continuing increases in data volume and complexity.
- Develop transformation processes using PySpark or SQL per the business data model requirements (a minimal sketch follows this list).
- Build ELT/ETL pipelines using Azure Data Factory, Databricks Workflows, and similar tools.
- Analyze and research solutions, then develop and implement recommendations accordingly.
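To give a concrete flavor of the transformation work described above, here is a minimal PySpark sketch. The table names, columns, and aggregation logic are hypothetical illustrations, not part of the posting.

```python
# Minimal sketch of a warehouse transformation in PySpark.
# Source/target table names and columns are assumed for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("edw-transform").getOrCreate()

# Read a raw source table (name assumed).
orders = spark.read.table("raw.orders")

# Clean types, derive a business metric, and aggregate to the
# grain the warehouse data model expects.
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
    .groupBy("order_date", "region")
    .agg(F.sum("net_amount").alias("daily_net_revenue"))
)

# Write to the curated warehouse layer (target name assumed).
daily_revenue.write.mode("overwrite").saveAsTable("curated.daily_revenue")
```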
Requirements
- At least 3 years of data engineering experience.
- Strong analytical and problem-solving skills.
- Data Warehouse principles and data modeling experience.
- Advanced SQL expertise.
- Scripting language (e.g., Python, PowerShell) experience.
- Working experience with any major cloud platform (Azure, AWS, or GCP).
- Good written and spoken English, for communicating with international clients and colleagues.
Considered an advantage
- Azure Cloud platform experience.
- Azure Databricks experience.
- Experience with MPP Data Warehouse solutions such as Snowflake, Redshift, or Databricks Unity Catalog.
- Capital markets industry knowledge or experience.
- Understanding of and experience with implementing a medallion architecture to ingest and transform data.
- Real-time data ingestion using Kafka (a streaming sketch follows this list).
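The last two advantage items fit together naturally; below is a minimal sketch of medallion-style ingestion from Kafka using Spark Structured Streaming. The broker address, topic name, schema, and storage paths are hypothetical, and it assumes an environment (such as Databricks) where Delta Lake is available.

```python
# Sketch: Kafka -> bronze (raw) -> silver (typed) medallion ingestion.
# Broker, topic, schema, and paths are assumed for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("medallion-ingest").getOrCreate()

# Bronze: land raw Kafka messages as-is, preserving the original payload.
bronze = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
    .option("subscribe", "trades")                     # assumed topic
    .load()
)
bronze_query = (
    bronze.writeStream
    .format("delta")
    .option("checkpointLocation", "/chk/bronze_trades")
    .start("/lake/bronze/trades")
)

# Silver: parse the JSON payload into typed columns and drop malformed rows.
# (In practice the bronze table must exist before this stream starts.)
trade_schema = StructType([
    StructField("symbol", StringType()),
    StructField("price", DoubleType()),
    StructField("executed_at", TimestampType()),
])
silver = (
    spark.readStream.format("delta").load("/lake/bronze/trades")
    .select(F.from_json(F.col("value").cast("string"), trade_schema).alias("t"))
    .select("t.*")
    .where(F.col("symbol").isNotNull())
)
silver_query = (
    silver.writeStream
    .format("delta")
    .option("checkpointLocation", "/chk/silver_trades")
    .start("/lake/silver/trades")
)
```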
Benefits
- Work equipment: laptop (Windows only), headset, keyboard, mouse, and backpack.
- Loyalty gifts.
- Annual reviews and growth opportunities.
- Corporate event travel budget.
- 20 unpaid time-off days per working year.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data engineering, data warehouse, data modeling, SQL, PySpark, ELT, ETL, scripting, real-time data ingestion, medallion architecture
Soft skills
analytical skills, problem-solving skills, communication skills