Tech Stack
Airflow, Apache, AWS, Azure, Cloud, Docker, ETL, Google Cloud Platform, NoSQL, Python, Redis, SQL
About the role
- Build shippable software following Engineering standards
- Build and maintain key engineering building blocks that other teams can rely on (APIs and big data implementations)
- Support the current stack and extend it with new features
- Work on ad-hoc R&D projects
- Work closely with client BI users, operations and development teams; encourage a data-driven, pragmatic approach
- Ensure deliveries are on time and of the required quality
- Maintain the company’s data assets at required quality levels
- Design and build solid, efficient, stable APIs
- Maintain a high standard of code and enforce best practices in code quality and process design
- Keep up to date with the latest technologies and methodologies
- Ensure a robust and highly scalable approach to development that supports global users and services
Requirements
- Python development skills
- Ability to implement ETL data pipelines in Python (a minimal sketch of such a pipeline follows this list)
- Ability to design and build REST APIs
- Advanced SQL scripting knowledge
- Experience with Google Cloud Platform, AWS or Azure
- 2+ years of experience in data or software development
- Knowledge of big data platforms
- Knowledge of relational databases
- Knowledge of Git, Docker, and Bash scripting
- Ability to propose, design, and implement simple ETL solutions, both batch and real-time
- Understanding of continuous delivery pipelines and the ability to design one
- Ability to pick the right technology for each task
- Experience with dbt (data build tool) to develop data pipelines (desirable)
- Experience with data streaming in Google Cloud Dataflow or Apache Beam (desirable)
- Experience using Airflow (desirable)
- Experience with NoSQL databases such as Redis or Elasticsearch (desirable)
- English used daily
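
To give a concrete sense of the day-to-day work described above, here is a minimal sketch of a daily batch ETL pipeline in Python orchestrated with Airflow. It assumes Airflow 2.4+ (TaskFlow API), pandas, and SQLAlchemy; the source file, warehouse connection string, and table name are hypothetical and purely illustrative.

```python
# Minimal sketch: daily batch ETL orchestrated with Airflow (TaskFlow API).
# Assumes Airflow 2.4+, pandas, and SQLAlchemy; paths and connection
# details below are hypothetical placeholders, not real infrastructure.
import io
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task
from sqlalchemy import create_engine

SOURCE_CSV = "/data/raw/orders.csv"           # hypothetical input batch file
TARGET_URI = "postgresql://user:pass@db/dwh"  # hypothetical warehouse connection


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_orders_etl():
    @task
    def extract() -> str:
        # Read the raw batch file and pass the records on as JSON.
        df = pd.read_csv(SOURCE_CSV)
        return df.to_json(orient="records")

    @task
    def transform(raw: str) -> str:
        # Basic cleaning: drop incomplete rows and normalise column names.
        df = pd.read_json(io.StringIO(raw), orient="records")
        df = df.dropna().rename(columns=str.lower)
        return df.to_json(orient="records")

    @task
    def load(clean: str) -> None:
        # Append the cleaned batch to a warehouse table via SQLAlchemy.
        df = pd.read_json(io.StringIO(clean), orient="records")
        engine = create_engine(TARGET_URI)
        df.to_sql("orders_clean", engine, if_exists="append", index=False)

    load(transform(extract()))


daily_orders_etl()
```

The same extract/transform/load structure carries over to the real-time side of the role, with the batch tasks replaced by a streaming framework such as Apache Beam or Google Cloud Dataflow.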