Salary
💰 $121,000 - $199,584 per year
Tech Stack
Airflow, AWS, Azure, Cloud, ETL, Google Cloud Platform, Kafka, MySQL, Postgres, Python, SQL, Tableau
About the role
- Design and maintain ETL/ELT pipelines and data workflows to enable reliable, timely, and scalable data flows
- Collaborate with analysts, data scientists, and business stakeholders to deliver curated datasets and internal data products
- Implement best practices in schema design, data modeling, and metadata management
- Own and evolve internal data infrastructure for quality, monitoring, and discoverability
- Partner with software engineering teams where application data intersects with internal pipelines to ensure business-critical data is clean, structured, and usable
- Deliver production-ready pipelines, datasets, or internal data products and take technical ownership of them
- Participate in onboarding and mentoring activities and help shape the team’s roadmap and long-term data foundations strategy
Requirements
- 2+ years of professional experience in data engineering, or at least 2 years of hands-on experience building, deploying, and maintaining production-grade data infrastructure and pipelines
- Demonstrated proficiency in SQL (e.g. MySQL, PostgreSQL) and data modeling
- Proficiency in Python
- Experience working with relational and non-relational databases
- Hands-on experience with ELT and transformation frameworks (e.g., dbt) and with orchestrators (e.g., Airflow, Dagster)
- Experience building internal data products (curated datasets, semantic layers, or reusable modeling frameworks)
- Proven experience applying standard software development practices to data engineering, including testing, version control, code reviews, incident management, CI/CD, documentation, and observability for data
- Preferred: experience managing semantic layers and BI dashboards (e.g. Tableau, Hex, Looker)
- Preferred: experience implementing data quality frameworks, testing methodologies, and monitoring practices
- Preferred: hands-on experience with event-driven or streaming frameworks (e.g., Kafka, NSQ, Kinesis, Pub/Sub)
- Preferred: direct experience with cloud infrastructure (AWS, GCP, or Azure) and implementing cost optimization strategies
- Proven ability to collaborate with cross-functional teams and communicate complex technical concepts to non-technical stakeholders