
ML Platform Engineer – Feature Store
Salesforce
Full-time
Location Type: Hybrid
Location: New York City • Colorado, Illinois, New York, Washington • 🇺🇸 United States
Salary
💰 $167,300 - $253,000 per year
Job Level
Mid-Level • Senior
Tech Stack
Airflow • AWS • Cloud • Docker • Kafka • Kubernetes • Python • Spark • Terraform
About the role
- Architect, implement, and maintain a scalable feature store serving offline (batch), online (real-time), and streaming ML use cases.
- Build robust integrations between the feature store and ML ecosystem components such as data pipelines, model training workflows, model registry, and model serving infrastructure.
- Design and manage streaming pipelines using technologies like Kafka, Kinesis, or Flink to enable low-latency feature generation and real-time inference.
- Define and enforce governance standards for feature registration, metadata management, lineage tracking, and versioning to ensure data consistency and reusability.
- Partner with data scientists and ML engineers to streamline feature discovery, definition, and deployment workflows, ensuring reproducibility and efficient model iteration.
- Build and optimize ingestion and transformation pipelines that handle large-scale data while maintaining accuracy, reliability, and freshness.
- Implement CI/CD workflows and infrastructure-as-code to automate feature store provisioning and feature promotion across environments (Dev → QA → Prod).
- Collaborate with platform and DevOps teams to ensure secure, scalable, and cost-effective operation of feature store and streaming infrastructure in cloud environments.
- Develop monitoring and alerting frameworks to track feature data quality, latency, and freshness across offline, online, and streaming systems.
Requirements
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- 5+ years of experience in data engineering, platform engineering, or MLOps roles.
- Strong proficiency in Python and familiarity with distributed data frameworks such as Airflow, Spark, or Flink.
- Hands-on experience with feature store technologies (e.g., Feast, SageMaker Feature Store, Tecton, Databricks Feature Store, or custom implementations).
- Experience with cloud data warehouses (e.g., Snowflake) and transformation frameworks (e.g., dbt) for data modeling, transformation, and feature computation in batch environments.
- Expertise in streaming data platforms (e.g., Kafka, Kinesis, Flink) and real-time data processing architectures.
- Experience with cloud environments (AWS preferred) and infrastructure-as-code tools (Terraform, CloudFormation).
- Strong understanding of CI/CD automation, containerization (Docker, Kubernetes), and API-driven integration patterns.
- Knowledge of data governance, lineage tracking, and feature lifecycle management best practices.
- Excellent communication skills, a collaborative mindset, and a strong sense of ownership.
Benefits
- time off programs
- medical
- dental
- vision
- mental health support
- paid parental leave
- life and disability insurance
- 401(k)
- employee stock purchasing program
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python • Airflow • Spark • Flink • Feast • SageMaker Feature Store • Tecton • Databricks Feature Store • Snowflake • dbt
Soft skills
communication • collaboration • ownership
Certifications
Bachelor’s degree in Computer Science • Master’s degree in Computer Science • Bachelor’s degree in Data Engineering • Master’s degree in Data Engineering