
Data Engineer
Neko Health
Full-time
Location Type: Remote
Location: United States
About the role
- Build and own scalable data pipelines for ingestion, integration, and processing across batch and streaming systems.
- Architect and maintain robust data models across databases and lakehouse platforms supporting analytics and ML workloads.
- Develop and own core data platform components and infrastructure.
- Ensure data integrity and quality through monitoring, alerting, lineage, and traceability.
- Manage and optimize data infrastructure including clusters, storage, and compute resources.
- Implement metrics and observability across services using logging, tracing, and monitoring.
- Troubleshoot production issues, pipeline failures, and performance bottlenecks.
- Collaborate cross-functionally with data scientists, analysts, and backend engineers on modelling, governance, and integration.
Requirements
- Strong programming skills in Python and SQL.
- Hands-on experience with relational and NoSQL databases, data lakes, and lakehouse technologies (e.g. Delta Lake, Parquet).
- Experience working with cloud infrastructure (Azure, AWS, or GCP).
- Familiarity with CI/CD pipelines and automated testing practices.
- Strong data quality and observability experience including metrics, tracing, and alerting.
- Experience working with data governance practices including lineage, cataloguing, access control, data contracts, and regulatory compliance.
Benefits
- Neko Health supports a flexible workplace that prioritizes work-life balance.
- Remote-first company.
- Opportunities for collaboration and team connection through in-person meetups.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Python, SQL, data pipelines, data models, data integration, data processing, data quality, observability, CI/CD, automated testing
Soft Skills
collaboration, troubleshooting, problem-solving