
Data Engineer
Sweed POS
Full-time
Location Type: Remote
Location: Anywhere in the World
About the role
- Design, build, and maintain data pipelines using Airflow and Trino to ingest and process data from multiple systems (databases, APIs, event streams, 3rd-party integrations) – see the sketch after this list.
- Improve data quality and reliability – implement monitoring, validation, and alerting for pipelines to ensure accuracy and consistency.
- Develop and maintain the platform – deploy, monitor, and maintain services in our AWS cloud.
- Collaborate with cross-functional teams – work closely with Product Analytics, Data Architecture, and Engineering teams to define data requirements and deliver them effectively.
- Optimize performance and cost – fine-tune queries, pipelines, and storage for speed and efficiency.
- Document data assets – maintain clear documentation for data sources, pipelines, and models.
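For a concrete flavor of the first two bullets, here is a minimal Airflow DAG sketch with an extract step followed by a data-quality gate. The DAG name, schedule, and payload shape are hypothetical placeholders for illustration, not Sweed's actual pipelines:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Stand-in for pulling a batch from a source system (database, API,
    # or event stream); the return value is passed to downstream tasks
    # via XCom.
    return [{"order_id": 1, "total": 42.0}]


def validate_orders(ti):
    # Simple data-quality gate: fail the task (triggering Airflow's
    # normal retry and alerting path) if the batch is empty or malformed.
    rows = ti.xcom_pull(task_ids="extract")
    if not rows or any(r.get("order_id") is None for r in rows):
        raise ValueError("orders batch failed validation")


with DAG(
    dag_id="orders_daily",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",      # `schedule_interval` on Airflow < 2.4
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    validate = PythonOperator(task_id="validate", python_callable=validate_orders)
    extract >> validate
```

In a real deployment, an `on_failure_callback` in `default_args` is a common place to hook in the alerting mentioned above.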
Requirements
- 3+ years of experience in data engineering or a related field.
- Strong proficiency in SQL and Python.
- Hands-on experience with:
- Airflow for workflow management
- OLAP databases, distributed query engines, and data processing frameworks (ClickHouse, Trino, Spark, etc.) for analytical workloads
- Solid understanding of data modeling and data engineering best practices
- Familiarity with event-based data architectures and streaming (Spark, Flink) – see the streaming sketch after this list.
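As a rough illustration of the streaming requirement, here is a minimal PySpark Structured Streaming sketch that reads events from Kafka and aggregates them in time windows. The broker address, topic, and event schema are hypothetical, and running it requires the spark-sql-kafka connector package on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Hypothetical event schema for a JSON-encoded order event.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("total", DoubleType()),
    StructField("ts", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")                     # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Windowed aggregation; a real job would write to an OLAP sink
# (e.g. ClickHouse) instead of the console.
query = (
    events.groupBy(F.window(F.to_timestamp("ts"), "5 minutes"))
    .agg(F.sum("total").alias("revenue"))
    .writeStream.outputMode("update")
    .format("console")
    .start()
)
query.awaitTermination()
```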
Benefits
- 100% remote – We’re a remote-first company, no offices needed!
- Flexible working hours – Core team time: 09:00-15:00 GMT (flexible per team)
- 20 paid vacation days per year
- 12 holidays per year
- 3 sick leave days
- Medical insurance after probation
- Equipment reimbursement (laptops, monitors, etc.)
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
SQL, Python, Airflow, Trino, ClickHouse, Spark, Flink, data modeling, data engineering best practices, data processing frameworks
Soft Skills
collaboration, communication, problem-solving, documentation