Salary
💰 $110,000 - $130,000 per year
Tech Stack
Airflow, Cloud, ETL, Google Cloud Platform, Kubernetes, Python, SQL, Terraform
About the role
- Design, develop, and maintain high-performance data pipelines using Python, SQL, dbt, and Snowflake on GCP (an illustrative sketch follows this list).
- Improve the code quality, test coverage, and maintainability of data pipelines.
- Champion and implement software engineering best practices, including code reviews, testing methodology, CI/CD, and documentation.
- Support the adoption of data quality tools and practices (e.g., data lineage, automated alerting).
- Research, evaluate, and recommend new technologies and tools to improve the data platform.
- Contribute to the data architecture and design of the data warehouse.
- Collaborate with software engineering teams to define data structures, streamline ingestion processes, and ensure data consistency.
- Work closely with stakeholders (data scientists, analysts, business users) to understand data needs and translate them into technical requirements.
- Troubleshoot and resolve complex data pipeline issues, ensuring data quality and reliability.
- Contribute to the development and maintenance of CI/CD pipelines for data infrastructure.
- Participate in on-call rotation to support critical data pipelines.
- Identify and address inefficiencies in data engineering processes.
Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field
- 2+ years of experience in data engineering
- Strong analytical SQL skills
- Strong Python skills
- Understanding of software engineering principles and best practices (e.g., version control, testing, CI/CD)
- Experience with data warehousing technologies, preferably Snowflake
- Experience with cloud platforms, preferably GCP (Google Cloud Platform), including services like Cloud Functions and GCS
- Experience designing and implementing reliable and resilient ETL/ELT pipelines
- Excellent communication, collaboration, and problem-solving skills
- Mission Lane does not sponsor employment authorization for new applicants (applicants must be authorized to work in the U.S.)
- Experience with dbt
- Experience with Monte Carlo
- Experience with Airflow
- Experience with data governance and data quality frameworks (a minimal data-quality check sketch follows this list)
- Knowledge of data modeling principles
- Experience with infrastructure-as-code tools (e.g., Terraform, Kubernetes Config Connector)