
Senior Data Engineer
Bestow
full-time
Location Type: Remote
Location: United States
Salary
💰 $134,500 - $158,500 per year
About the role
- Build robust solutions for transforming and transferring data between first- and third-party applications and our data warehouse
- Envision and design toward industry patterns for data exchange with our enterprise clients through a mix of traditional push delivery, cloud-based, and event-driven (e.g., API, gRPC) data-sharing methods.
- Support implementation and data integration as new partners roll onto the Bestow platform, and recommend improvements to the platform's configurability.
- Make decisions as a team. The things you build will be maintained and improved by others, so we share responsibility for defensible design decisions and close collaboration.
- Champion test-first design principles, proactively writing tests before code to maintain high coverage and pipeline reliability.
- Develop hardened, repeatable data models and pipelines (via CI/CD) to enable reporting, modeling, and machine learning.
- Ensure data quality through automated monitoring and alerting, and occasionally serve in an on-call rotation.
- Leverage Google Cloud Platform (GCP) tools (e.g., Cloud Run, Cloud Functions, Vertex AI, App Engine, Cloud Storage, IAM) and services (e.g., Astronomer-managed Apache Airflow) to bring data workloads to production.
- Drive and support MLOps to improve Data Science monitoring and governance.
- Enable and support Generative AI (e.g., LLM) pipelines that let internal teams prototype quickly, and support the architecture and rollout of GenAI products and features into the marketplace.
- Collaborate with product, engineering, and data teams and other stakeholders to deliver informed solutions to platform and client needs.
Requirements
- 6+ years in a data engineering role supporting inbound and outbound data products for internal and external customers.
- 4+ years of demonstrated expertise designing end-to-end data pipelines in cloud frameworks (such as GCP, AWS, or Azure) with requirements from multiple stakeholders.
- 4+ years of Python experience writing efficient, testable, and readable code.
- 2+ years of experience in building streaming data ingestion pipelines
- 1+ year of ML (Machine Learning) support and implementation, or MLOps experience.
- Advanced SQL expertise with columnar databases (BigQuery, Snowflake, Amazon Redshift) and performance tuning.
- Demonstrated experience with AI coding assistants – AI tools are deeply ingrained in Bestow's culture.
- Cloud Native: Deep experience with cloud services (GCP preferred: Cloud Run, Pub/Sub, BigQuery) and containerization (Docker/Kubernetes).
- Orchestration: Expert-level knowledge of Apache Airflow (DAG optimization, custom operators).
- Experience building CI/CD pipelines for data processing using tools such as Docker, CircleCI, dbt, and git.
- Infrastructure as Code: Proven experience managing infrastructure using Terraform or Pulumi.
- Experience creating alerts and monitoring pipelines that contribute to overall data governance.
- Familiarity with standard IT security practices such as identity and access management (IAM), data protection, encryption, and certificate and key management.
- Adaptability to learn new technologies and products as the job demands.
- Nice to have: familiarity with building tools that draw on Generative AI (GenAI) integrations (enterprise-grade, not simply vibe-coded).
- Nice to have: experience with data contracts, data lakes, and API development.
Benefits
- Competitive salary and equity based on role
- Policies and managers that support work/life balance, like our flexible paid time off and parental leave programs
- 100% paid-premium option for medical, dental, and vision insurance
- Lifestyle stipend to support your physical, emotional, and financial wellbeing
- Flexible work-from-home policy with fully remote options, as well as a beautiful, state-of-the-art office in Dallas’ Deep Ellum for those who prefer an office setting
- Employee-led diversity, equity, and inclusion initiatives
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data engineering, end-to-end data pipeline, Python, streaming data ingestion, Machine Learning, SQL, Apache Airflow, CI/CD pipelines, Infrastructure as Code, data governance
Soft Skills
collaboration, decision making, adaptability, test-first design, proactive communication