Tech Stack
AWS, Cloud, ETL, NoSQL, Python, ServiceNow, SQL, Vault
About the role
- Work cross-functionally to ingest, process, transform, and store large volumes of data from diverse sources
- Build and maintain the core data infrastructure that powers the organisation's advanced analytics, machine learning applications, and data-driven decision-making
- Architect robust and scalable data pipelines on cloud platforms
- Build and maintain data warehouses, data lakes, and other large-scale data storage systems
- Design and implement ETL processes and data transformation techniques (a minimal sketch of this pattern follows this list)
- Communicate and collaborate with stakeholders at all levels
- Support machine learning pipelines and model deployment where required
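The day-to-day work described above follows the familiar extract-transform-load shape. Below is a minimal sketch, assuming an in-memory source and a local SQLite target; both are stand-ins for the real upstream systems and warehouse this role would work with.

```python
import sqlite3

# Inline sample records stand in for an upstream source (an API, queue,
# or file drop); in a real pipeline, extract() would pull from that system.
RAW_RECORDS = [
    {"id": "1", "name": "  Alice ", "created_at": "2024-01-01"},
    {"id": "2", "name": "Bob", "created_at": "2024-01-02"},
    {"name": "record without an id"},  # malformed: will be dropped
]

def extract() -> list[dict]:
    """Pull raw records from the source system."""
    return RAW_RECORDS

def transform(records: list[dict]) -> list[tuple]:
    """Normalise raw records into rows ready for loading."""
    return [
        (r["id"], r.get("name", "").strip().lower(), r.get("created_at"))
        for r in records
        if "id" in r  # drop malformed records
    ]

def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    """Idempotently upsert rows into the target store."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS events "
            "(id TEXT PRIMARY KEY, name TEXT, created_at TEXT)"
        )
        conn.executemany("INSERT OR REPLACE INTO events VALUES (?, ?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract()))
```

Production pipelines swap the in-memory source and SQLite target for the real systems (an S3 landing zone, Snowflake, and so on), but the shape stays the same: idempotent loads, plus transforms that normalise or drop bad records.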
Requirements
- 10+ years of experience building and maintaining data warehouses, data lakes, and other large-scale data storage systems
- Knowledge of ETL processes and data transformation techniques
- Ability to design and develop data aggregation and integration processes across different systems and environments
- Understanding of data architecture patterns such as the Lambda and Kappa architectures
- Hands-on experience with Snowflake, Data Vault modelling, AWS, and Python (a minimal Data Vault sketch follows this list)
- Knowledge of SQL and NoSQL databases
- Experience with data visualization and business intelligence tools
- Familiarity with machine learning pipelines and model deployment
- Experience with blockchain infrastructure and data integration (Nice to have)
- This role is for contractors
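The Data Vault item above refers to the modelling style in which hub tables are keyed by a deterministic hash of a business key. Below is a minimal sketch; the HUB_CUSTOMER table and its column names are hypothetical, and MD5 stands in for whichever digest the target warehouse standardises on.

```python
import hashlib
from datetime import datetime, timezone

def hub_key(business_key: str) -> str:
    """Deterministic hash key over a normalised business key, as used
    for hub primary keys in Data Vault 2.0 modelling."""
    return hashlib.md5(business_key.strip().upper().encode()).hexdigest()

def hub_customer_row(business_key: str, record_source: str) -> dict:
    """One row for a hypothetical HUB_CUSTOMER table."""
    return {
        "hub_customer_hk": hub_key(business_key),            # surrogate hash key
        "customer_bk": business_key,                          # original business key
        "load_dts": datetime.now(timezone.utc).isoformat(),   # load timestamp
        "record_source": record_source,                       # source-system lineage
    }

print(hub_customer_row("CUST-0042", "crm_export"))
```

Because the key is a pure function of the business key, the same customer hashes to the same hub row no matter which source system delivered it, which is what keeps loads from parallel sources idempotent.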