Salary
💰 $100,000 - $150,000 per year
About the role
- Monitor and maintain ETL/ELT workflows for uptime and performance.
- Implement data validation, quality checks, and reconciliation to ensure accuracy.
- Reduce mean time to detect (MTTD) and mean time to resolve (MTTR) for data issues.
- Troubleshoot and resolve operational issues in data pipelines.
- Partner with development and data QA teams to ingest and normalize feeds from market data vendors.
- Optimize workflows for cost and performance across cloud platforms.
- Maintain documentation and runbooks for operational processes.
- Support incident management and root cause analysis related to data movement failures.
- Build visualizations for data flows and error tracking.
Requirements
- Strong SQL and scripting (Python, Bash).
- Experience with workflow orchestration.
- Familiarity with data warehouses like Snowflake.
- Experience with FTP and similar data distribution platforms.
- Experience with cloud-native data services such as AWS SQS, plus general AWS knowledge.
- Basic observability skills (monitoring, logging, alerting).
- Financial data knowledge preferred.