Everspring

Data Engineer II

Full-time

Location: 🇺🇸 United States • Illinois

Salary

💰 $90,000 - $125,000 per year

Job Level

Mid-Level • Senior

Tech Stack

Airflow • Amazon Redshift • AWS • Azure • BigQuery • Cloud • Docker • ETL • Google Cloud Platform • Python • SQL • Terraform

About the role

  • Design and implement scalable, maintainable ETL/ELT pipelines for a variety of use cases (analytics, operations, product enablement)
  • Build and optimize integrations with cloud services, databases, APIs, and third-party platforms
  • Own production data workflows end-to-end, including testing, deployment, monitoring, and troubleshooting
  • Collaborate with cross-functional stakeholders to understand business needs and translate them into technical data solutions
  • Lead technical discussions and participate in architecture reviews to shape our evolving data platform
  • Write clean, well-documented, production-grade code in Python and SQL
  • Improve data model design and data warehouse performance (e.g., partitioning, indexing, denormalization strategies)
  • Champion best practices around testing, observability, CI/CD, and data governance
  • Mentor junior team members and contribute to peer code reviews
  • Reports to Executive Director, Technical Strategy and Operations

Requirements

  • 3+ years of experience in a data engineering or software engineering role, with a strong track record of delivering robust data solutions
  • Proficiency in Python and advanced SQL for complex data transformations and performance tuning
  • Experience building and maintaining production pipelines using tools like Airflow, dbt, or similar workflow/orchestration tools
  • Strong understanding of cloud-based data infrastructure (e.g., AWS, GCP, or Azure)
  • Knowledge of data modeling techniques and data warehouse design (e.g., star/snowflake schemas)
  • Experience working with structured and semi-structured data from APIs, SaaS tools, and databases
  • Familiarity with version control (Git), CI/CD, and Agile development methodologies
  • Strong communication and collaboration skills
  • Preferred: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or related technical field
  • Preferred: Experience with modern data warehouses like Redshift, BigQuery, or Snowflake
  • Preferred: Exposure to modern DevOps/DataOps practices (e.g., Terraform, Docker, dbt Cloud)
  • Preferred: Experience integrating with Salesforce or other CRM/marketing platforms
  • Preferred: Knowledge of data privacy and compliance considerations (e.g., FERPA, GDPR)