
Contract Data Engineer, Platform, AWS, Data Pipelines
SOUTHWORKS
Contract
Location Type: Remote
Location: Argentina
About the role
- Design, build and, above all, operate scalable streaming and batch data pipelines, with a strong focus on maintaining, monitoring, troubleshooting and continuously improving existing pipelines.
- Work with AWS services, including Redshift, EMR and ECS, to support data processing and analytics workloads.
- Develop and maintain data workflows using Python and SQL.
- Orchestrate and monitor pipelines using Apache Airflow (a brief illustrative sketch follows this list).
- Build and deploy containerized applications using Docker and Kubernetes.
- Break down high-level system designs into well-defined, deliverable tasks with realistic estimates.
- Collaborate with cross-functional teams in a fast-paced and distributed environment across the US and Europe.
- Drive automation, observability and monitoring to improve reliability, performance and operational efficiency.
- Support knowledge transfer and ownership handover as part of the planned transition to the consuming team.
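For orientation only, here is a minimal sketch of the kind of Airflow DAG this role would build and operate. It is not SOUTHWORKS code: the DAG name, task names and schedule are hypothetical, and it assumes Airflow 2.4+ (which uses the schedule argument).

    # Minimal sketch of a daily batch pipeline DAG; all names are hypothetical.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_events(**context):
        """Placeholder: pull the day's raw events from the upstream source."""


    def load_to_redshift(**context):
        """Placeholder: load transformed data into a Redshift staging table."""


    with DAG(
        dag_id="daily_events_batch",            # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                      # Airflow 2.4+ argument name
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
    ):
        extract = PythonOperator(task_id="extract_events",
                                 python_callable=extract_events)
        load = PythonOperator(task_id="load_to_redshift",
                              python_callable=load_to_redshift)

        extract >> load                         # simple linear dependency

In the actual pipelines the tasks would presumably target EMR, ECS or Redshift rather than plain Python callables, and the operational focus of the role sits in monitoring, retries and continuous improvement around DAGs like this.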
Requirements
- Strong professional experience with Python and SQL.
- Hands-on experience with AWS, specifically Redshift, EMR and ECS. AWS experience is mandatory (other cloud providers are not considered equivalent for this role).
- Proven experience building and operating both streaming and batch data pipelines.
- Professional experience with Apache Airflow, Docker and Kubernetes.
- Ability to translate high-level system designs into actionable technical tasks and realistic estimates.
- Comfortable working in dynamic and fast-paced environments and in distributed teams.
- Strong interest in automation and monitoring.
- Strong hands-on experience with Apache Spark.
- Senior-level profile with strong autonomy, communication skills and ability to work effectively in distributed teams.
- Proven ability to transfer knowledge and support ownership handovers.
- Fluent or professional working proficiency in English (both written and spoken).
Benefits
- Contractor (40 hours per week)
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Python, SQL, Apache Airflow, Docker, Kubernetes, Apache Spark, data pipelines, data workflows, automation, monitoring
Soft Skills
communication skills, autonomy, collaboration, ability to work in distributed teams, ability to translate designs into tasks, strong interest in automation, knowledge transfer, support for ownership handovers