Work with GCP services (BigQuery, Pub/Sub, Cloud Storage, etc.) to support scalable and reliable data systems
Develop and optimize DAGs in Airflow to schedule and automate workflows (see the sketch after this list)
Write efficient Python and SQL code to process, transform, and analyze large datasets
Partner with Data Engineering and Business Intelligence teams to ensure data quality, consistency, and availability across the company
Support initiatives to improve the scalability, monitoring, and reliability of our data infrastructure
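For context, here is a minimal sketch of the kind of Airflow DAG this role involves, assuming Airflow 2.4+ and its TaskFlow API; the DAG name and the extract/transform/load steps are hypothetical placeholders, not this team's actual pipeline:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_daily_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling rows from a source system.
        return [{"id": 1, "value": 42}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder transformation step: double each value.
        return [{**row, "value": row["value"] * 2} for row in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for writing to a warehouse table, e.g. BigQuery.
        print(f"Loading {len(rows)} rows")

    # Task dependencies are inferred from the data flow below.
    load(transform(extract()))


example_daily_pipeline()
```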
Requirements
Currently pursuing a Bachelor’s degree in computer science, data engineering, or a related technical discipline, graduating in 2027 or later, with a minimum 3.0 GPA or equivalent
Exposure to Python and SQL for data processing and pipeline development
Familiarity with data engineering concepts such as batch and streaming data processing
Exposure to tools such as Kafka, Pub/Sub, Airflow, BigQuery, or other GCP services (see the streaming sketch after this list)
Understanding of software engineering best practices (version control, testing, CI/CD) is a plus
Ability to communicate clearly and work collaboratively with technical and non-technical teams
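For context, a minimal sketch of streaming ingestion with Pub/Sub, one of the tools listed above, using the official google-cloud-pubsub client; the project and subscription names are hypothetical placeholders:

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
# "example-project" and "events-subscription" are hypothetical names.
subscription_path = subscriber.subscription_path(
    "example-project", "events-subscription"
)


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # In a real pipeline this is where a record would be parsed,
    # transformed, and written downstream (e.g. to BigQuery).
    print(f"Received: {message.data!r}")
    message.ack()


streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        # Listen for 30 seconds, then shut the stream down cleanly.
        streaming_pull.result(timeout=30)
    except TimeoutError:
        streaming_pull.cancel()
        streaming_pull.result()
```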
Benefits
Incentive compensation
Equity grants
Paid time off
Group health insurance coverage
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, SQL, data processing, data pipeline development, DAGs, data transformation, data analysis, batch processing, streaming data processing, software engineering best practices