Salary
💰 $130,000 - $140,000 per year
Tech Stack
Airflow, Apache, BigQuery, Cloud, Docker, Go, Java, Kafka, Perl, PHP, Postgres, Python, SaltStack, SQL
About the role
- Design, build, and optimize robust, scalable data pipelines, leading the migration from legacy systems to a modern, AI-centric platform.
- Evolve data models and schemas to better support complex analytics, AI training, and fine-tuning workloads.
- Collaborate with AI/ML teams to productionize models, streamline training data delivery, and support development of agentic systems.
- Partner with BI developers and analysts to design highly efficient queries and unlock insights.
- Champion data governance and compliance to ensure secure and trustworthy data handling.
- Modernize historical data flows into AI-driven workflows and scale the data ecosystem to meet future demands.
Requirements
- Proven experience (4+ years) in a data engineering role, with a track record of building and managing complex data systems.
- Deep expertise in SQL and query optimization.
- Hands-on experience with cloud data warehouses and databases, specifically Google BigQuery and CloudSQL (PostgreSQL).
- Programming experience with Python or Java.
- A proactive, self-motivated, and self-managed mindset, well suited to a fully remote environment with a high degree of autonomy.
- Excellent communication and documentation skills; you can clearly articulate complex technical concepts to diverse audiences.
- The ability to work a flexible schedule and the readiness to respond to occasional off-hours emergencies.
- Bonus: Experience with Google's Vertex AI platform.
- Bonus: Proficiency in Go.
- Bonus: Familiarity with dbt and Airflow.
- Bonus: Familiarity with event-streaming platforms like Apache Kafka.
- Bonus: Real-time streaming analytics experience.
- Bonus: Experience with containerization (Docker) and serverless compute (Google Cloud Run).
- Bonus: Experience with Perl or PHP.
- Legal authorization to work in the United States (the application includes a question about work authorization).