
BI ETL Data Developer II/III
Miami University College of Education, Health and Society. Develop, maintain, and enhance data pipelines using modern ELT tools (e.g., dbt, Fivetran, Airflow, or similar) in a cloud-based environment.
Posted: 4/21/2026 · Full-time · Oxford, Ohio, 🇺🇸 United States · Junior / Mid-Level · 💰 $65,000 - $87,000 per year · Website
Tech Stack
Tools & technologies: Airflow, Amazon Redshift, Azure, BigQuery, Cloud, Python, SQL
About the role
Key responsibilities & impact
Developer II:
- Develop, maintain, and enhance data pipelines using modern ELT tools (e.g., dbt, Fivetran, Airflow, or similar) in a cloud-based environment.
- Write, optimize, and maintain SQL and/or Python-based transformations that support scalable analytics solutions.
- Monitor, troubleshoot, and resolve issues in production data pipelines to ensure reliability, performance, and data integrity.
- Implement data validation, testing, and quality checks to improve the consistency and trustworthiness of data assets.
- Collaborate with stakeholders and technical partners to translate business needs into scalable data solutions.
- Support and improve existing data integrations and workflows with a focus on maintainability and performance optimization.
- Contribute to and follow best practices for version control, testing, and deployment (CI/CD).
- Create and maintain documentation for data pipelines, models, and system processes.
- Contribute to team practices that support consistent delivery and continuous improvement.
Developer III:
- Design, develop, and optimize scalable data pipelines using modern ELT tools (e.g., dbt, Fivetran, Airflow, or similar) in a cloud-based environment.
- Lead the development of advanced SQL and/or Python-based transformations supporting enterprise analytics and reporting.
- Own production data pipelines, including monitoring, performance tuning, troubleshooting, and ensuring reliability and data integrity.
- Design and implement robust data validation, testing, and quality frameworks to ensure trusted data at scale.
- Design scalable data models and transformation patterns to support enterprise reporting and analytics needs.
- Partner with stakeholders to translate complex business requirements into sustainable, high-impact data solutions.
- Drive improvements to existing data pipelines and processes to enhance performance, scalability, and maintainability.
- Lead adoption of best practices for version control, testing, deployment, and operational support (CI/CD).
- Develop and maintain comprehensive documentation for data pipelines, models, and architecture.
- Guide team practices that support consistent delivery, operational excellence, and continuous improvement.
- Mentor team members and contribute to the growth of technical standards and capabilities across the team.
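The validation and testing responsibilities above can be sketched in plain Python. This is a hypothetical illustration of a small transformation step with data-quality checks (in the spirit of dbt tests or Airflow task logic), not the university's actual pipeline; all field and function names are illustrative.

```python
# Illustrative sketch only: a tiny ELT-style transform plus quality gate.
# Field names (id, email, amount) are hypothetical, not from the posting.

def transform(rows):
    """Normalize raw string records into a clean, typed shape."""
    out = []
    for r in rows:
        out.append({
            "id": int(r["id"]),
            "email": r["email"].strip().lower(),
            "amount": round(float(r["amount"]), 2),
        })
    return out

def validate(rows):
    """Quality checks: unique ids, non-empty emails, non-negative amounts."""
    errors = []
    ids = [r["id"] for r in rows]
    if len(ids) != len(set(ids)):
        errors.append("duplicate ids")
    for r in rows:
        if not r["email"]:
            errors.append(f"row {r['id']}: empty email")
        if r["amount"] < 0:
            errors.append(f"row {r['id']}: negative amount")
    return errors

raw = [
    {"id": "1", "email": " Alice@Example.COM ", "amount": "19.999"},
    {"id": "2", "email": "bob@example.com", "amount": "5"},
]
clean = transform(raw)
assert validate(clean) == []  # the batch passes its quality gate
```

In a production pipeline each of these steps would typically be an Airflow task or a dbt model with attached tests, so failures block downstream loads rather than silently corrupting data.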
Requirements
What you’ll need
- Bachelor’s degree in computer science, information technology, or a relevant field, earned by date of hire, with two to four or more years of relevant experience; OR an Associate’s degree in computer science, information technology, or a relevant field, earned by date of hire, with four to six or more years of relevant experience.
- Ability to analyze complex data and develop practical, scalable solutions.
- Ability to troubleshoot and resolve data pipeline and data quality issues in a timely manner.
- Ability to translate business needs into effective technical data solutions.
- Ability to communicate technical concepts clearly to both technical and non-technical audiences.
- Ability to work collaboratively across teams and build effective working relationships.
- Ability to manage multiple priorities and adapt to changing requirements in a dynamic environment.
- Ability to document solutions and processes to support maintainability and knowledge sharing.
Developer II:
- Experience developing and maintaining data pipelines using modern ELT tools (e.g., dbt, Fivetran, Airflow, or similar).
- Experience working with cloud data platforms such as Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse.
- Experience using Python for data processing, automation, or integration.
- Experience implementing data validation, testing, or monitoring solutions.
- Experience using version control systems (e.g., Git) and contributing to CI/CD workflows.
- Experience building or supporting data models and analytics solutions.
- Experience working in Agile or iterative development environments.
- Experience supporting enterprise data systems in a higher education or similarly complex environment.
Developer III:
- Experience designing and optimizing scalable ELT pipelines using tools such as dbt, Fivetran, Airflow, or similar.
- Experience working with cloud data platforms such as Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse at scale.
- Strong experience using Python for data processing, automation, and system integration.
- Experience designing dimensional data models and large-scale data transformations.
- Experience implementing and managing data quality, testing, and monitoring frameworks.
- Experience leading or significantly contributing to CI/CD practices for data pipelines.
- Experience optimizing data pipelines for performance, scalability, and cost efficiency.
- Experience working in complex organizational environments (e.g., higher education, healthcare, or enterprise settings).
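The dimensional-modeling and SQL-transformation experience listed above can be illustrated with a minimal star schema. This is a hypothetical sketch using SQLite from the Python standard library; the table and column names (`dim_term`, `fact_enrollment`) are invented for illustration and do not come from the posting.

```python
import sqlite3

# Hypothetical star schema: one fact table joined to a date-like dimension.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_term (term_key INTEGER PRIMARY KEY, term_name TEXT);
    CREATE TABLE fact_enrollment (
        term_key INTEGER REFERENCES dim_term(term_key),
        students INTEGER
    );
    INSERT INTO dim_term VALUES (1, 'Fall'), (2, 'Spring');
    INSERT INTO fact_enrollment VALUES (1, 120), (1, 30), (2, 90);
""")

# A typical analytics transformation: aggregate facts by a dimension attribute.
rows = con.execute("""
    SELECT d.term_name, SUM(f.students) AS total
    FROM fact_enrollment f
    JOIN dim_term d USING (term_key)
    GROUP BY d.term_name
    ORDER BY d.term_name
""").fetchall()
```

On a platform like Snowflake, Redshift, or BigQuery, the same fact/dimension split lets many reports share one conformed dimension, which is the core maintainability argument for dimensional models at scale.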
Benefits
Comp & perks
- Benefit Eligible
ATS Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
SQL, Python, data pipelines, data validation, data quality, ELT tools, dimensional data models, CI/CD, data transformations, cloud data platforms
Soft Skills
analytical skills, troubleshooting, communication, collaboration, adaptability, documentation, mentoring, problem-solving, relationship building, prioritization