Salary
💰 $101,810 - $120,000 per year
Tech Stack
Cloud, ETL, Postgres, Python, Spark, SQL
About the role
- Design, build, and maintain pipelines and tools to ensure organizational data flows reliably and is ready for use in CRM, ESP, donation, and analytics systems.
- Build and maintain ETL pipelines using Python and SQL to move data between internal systems and external platforms (e.g., CRMs, ESPs).
- Ensure data reliability and integrity across systems and develop automated validation and alerting.
- Troubleshoot issues across platforms and contribute to root cause analysis and long-term solutions.
- Collaborate with team members and stakeholders to understand business requirements and translate them into scalable technical solutions.
- Participate in monitoring, performance tuning, and refactoring of existing data flows.
- Help maintain and extend internal tools that support data operations and stakeholder reporting needs.
- Document systems, processes, and logic to ensure knowledge sharing and continuity.
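The first three responsibilities above could be sketched as a small extract–validate–load flow in Python. This is an illustrative sketch only, not this organization's actual codebase: the `Donation` type, the field names, and the in-memory "load" step are all hypothetical stand-ins for a real CRM export and a Postgres or ESP destination.

```python
# Minimal ETL sketch with an automated validation step.
# All names (Donation, donor_email, amount_cents) are hypothetical.
from dataclasses import dataclass

@dataclass
class Donation:
    donor_email: str
    amount_cents: int

def extract(rows):
    """Parse raw records (e.g., from a CRM export) into typed objects."""
    return [
        Donation(r["email"].strip().lower(), int(r["amount_cents"]))
        for r in rows
    ]

def validate(donations):
    """Reject records that would corrupt downstream systems; in a real
    pipeline this is where alerting would fire."""
    bad = [d for d in donations
           if d.amount_cents <= 0 or "@" not in d.donor_email]
    if bad:
        raise ValueError(f"{len(bad)} invalid record(s); halt and alert")
    return donations

def load(donations):
    """Stand-in for an INSERT into Postgres or a POST to an ESP REST API."""
    return len(donations)

raw = [{"email": " Ada@example.org ", "amount_cents": "2500"}]
loaded = load(validate(extract(raw)))
print(loaded)  # → 1
```

A production version would read from the source system in batches, write through a database driver or API client, and log validation failures to a monitoring channel rather than raising directly.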
Requirements
- 3–5 years of experience in a data engineering, systems integration, or similar role.
- Proficient in Python (or equivalent) and SQL, with experience building maintainable ETL pipelines.
- Experience working with relational databases (e.g., SQL Server, Postgres).
- Comfortable navigating and integrating with REST APIs and working with cloud-based services.
- Experience working with CRM or ESP platforms (e.g., Salesforce, RevCRM, Marketing Cloud, EveryAction) is a plus.
- Strong problem-solving skills with the ability to debug across systems.
- Excellent communication and documentation habits.
- Comfortable navigating and troubleshooting legacy pipelines and code, with a focus on understanding historical context before making changes.