Data Science Engineer

Mindex

full-time

Location Type: Remote

Location: Remote • New York • 🇺🇸 United States

Salary

💰 $90,000 - $120,000 per year

Job Level

Mid-Level • Senior

Tech Stack

Airflow • AWS • Azure • Cloud • Docker • Flask • Google Cloud Platform • Python • Spark • SQL

About the role

  • Design and implement scalable data pipelines to ingest, process, and transform large datasets (structured & unstructured).
  • Develop, validate, and optimize supervised and unsupervised machine learning models leveraging Python, SQL, and modern libraries.
  • Conduct feature engineering, model selection, and statistical modeling to deliver high-impact solutions.
  • Build and expose model APIs or containerized workflows for seamless integration and deployment in production environments (a minimal serving sketch follows this list).
  • Apply MLOps best practices to model versioning, testing, monitoring, and deployment.
  • Work with Big Data technologies such as Databricks and Snowflake to unlock analytics at scale.
  • Orchestrate complex workflows using tools like Airflow or Dagster for automation and reliability.
  • Collaborate with AI teams to refine prompt engineering and leverage AI tooling for model fine-tuning and augmentation.
  • Maintain familiarity with leading cloud platforms (AWS, Azure, GCP) for model training, deployment, and infrastructure management.
  • Partner with product, engineering, and business teams to translate requirements into technical solutions.
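
To make the model-serving responsibility above concrete, here is a minimal, hypothetical sketch using Flask (one of the listed stack tools). It assumes a scikit-learn estimator already trained and saved to model.joblib; the endpoint name and payload shape are illustrative, not specified by the posting.

```python
# Hypothetical sketch: expose a pre-trained model behind a Flask prediction endpoint.
# Assumes a scikit-learn estimator serialized to "model.joblib"; all names are illustrative.
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model.joblib")  # assumed model artifact

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    features = payload["features"]                 # e.g. [[5.1, 3.5, 1.4, 0.2]]
    prediction = model.predict(features).tolist()  # run inference on the loaded model
    return jsonify({"prediction": prediction})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```

In a production deployment this would typically run under a WSGI server (e.g. gunicorn) inside a Docker image rather than Flask's built-in development server.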

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, Statistics, or a related field.
  • 3+ years of experience in data science engineering or related roles.
  • Proficiency in Python and SQL for data extraction, analysis, and modeling.
  • Strong background in statistical modeling and machine learning algorithms (supervised and unsupervised).
  • Experience with feature engineering and end-to-end model development.
  • Hands-on experience with MLOps foundations (CI/CD, model monitoring, automated retraining).
  • Familiarity with Big Data tools (Databricks, Snowflake, Spark).
  • Experience with workflow orchestration platforms such as Airflow or Dagster (see the DAG sketch after this list).
  • Understanding of cloud architecture and deployment (AWS, Azure, GCP).
  • Experience deploying models as APIs or containers (Docker, FastAPI, Flask).
  • Familiarity with prompt engineering techniques and AI tooling for cutting-edge model development.
  • Excellent problem-solving and communication skills.
  • Experience with advanced AI tools (e.g., LLMs, vector databases).
  • Exposure to data visualization tools and dashboarding.
  • Knowledge of security, privacy, and compliance in ML workflows.
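
As a brief illustration of the orchestration requirement above, here is a hypothetical daily ingest → transform → train pipeline, assuming the Airflow 2.4+ TaskFlow API. The DAG id, task names, and data locations are placeholders, not anything specified by the posting.

```python
# Hypothetical sketch: a daily ingest -> transform -> train pipeline using Airflow TaskFlow.
# DAG id, task names, and storage locations are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_training_pipeline():
    @task
    def ingest() -> str:
        # Pull raw data from a source system and return its location.
        return "s3://example-bucket/raw/"  # assumed location

    @task
    def transform(raw_location: str) -> str:
        # Clean the raw data and compute model features.
        return "s3://example-bucket/features/"  # assumed location

    @task
    def train(features_location: str) -> None:
        # Fit a model on the prepared features and persist the artifact.
        ...

    train(transform(ingest()))

example_training_pipeline()
```

Dagster expresses the same idea with assets or ops and jobs; the underlying pattern of small, dependency-ordered tasks on a schedule is the same.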

Benefits

  • Health insurance
  • Paid holidays
  • Flexible time off
  • 401k retirement savings plan and company match with pre-tax and Roth options
  • Dental insurance
  • Vision insurance
  • Employer paid disability insurance
  • Life insurance and AD&D insurance
  • Employee assistance program
  • Flexible spending accounts
  • Health savings account with employer contributions
  • Accident, critical illness, hospital indemnity, and legal assistance
  • Adoption assistance
  • Domestic partner coverage
  • Tickets to local sporting events
  • Teambuilding events
  • Holiday and celebration parties
  • Leadership training
  • Licenses for Udemy online training courses
  • Growth opportunities

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Python • SQL • machine learning • statistical modeling • feature engineering • MLOps • model development • data extraction • data analysis • model monitoring
Soft skills
problem-solving • communication