Data Engineer

The National Society of Leadership and Success (NSLS)

Employment Type: Full-time

Location Type: Remote

Location: United States

About the role

  • Contribute to the Snowflake migration: Work with the broader data team to deprecate our legacy Redshift and Apache Hop infrastructure by completing the migration to Snowflake and dbt Cloud
  • Build production-grade dbt pipelines: Develop and maintain SQL transformations in dbt that power analytics for business stakeholders across the organization
  • Establish data architecture: Refine and maintain our medallion architecture (bronze, silver, and gold layers) to create a clear separation of concerns and a single source of truth (a sketch of a typical silver-layer model follows this list)
  • Collaborate with the data team: Partner with our AWS Engineer, Analytics Engineer, and Business Analyst in a sprint-based workflow with code reviews and regular standups
  • Maintain dbt transformations (60-70% of role): Own approximately 15-20 core models and summary views that drive business reporting, ensuring they're performant, well-documented, and easy to maintain
  • Build new data pipelines: Ingest data from APIs and new sources as business needs evolve
  • Enable reverse ETL: Develop and manage batch processes that send transformed data to downstream services such as HubSpot, using tools like Hightouch
  • Monitor data quality: Implement testing and alerting to catch issues before they impact stakeholders
  • Contribute to technical standards: Help establish and maintain best practices for code quality, documentation, and data modeling
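
A hypothetical sketch of what a silver-layer dbt model in this kind of medallion setup can look like (model, source, and column names are invented for illustration, not taken from the posting):

    -- models/silver/stg_memberships.sql (hypothetical names throughout)
    -- Silver layer: clean and deduplicate raw records landed in the bronze layer.
    with source as (

        -- bronze layer: raw data loaded as-is by the ingestion pipeline
        select * from {{ source('bronze', 'raw_memberships') }}

    ),

    deduplicated as (

        select
            membership_id,
            member_email,
            chapter_id,
            status,
            updated_at,
            -- keep only the most recent record per membership
            row_number() over (
                partition by membership_id
                order by updated_at desc
            ) as row_num
        from source

    )

    select
        membership_id,
        lower(trim(member_email)) as member_email,
        chapter_id,
        status,
        updated_at
    from deduplicated
    where row_num = 1

Gold-layer models would then aggregate cleaned tables like this one into the summary views that business reporting consumes.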

Requirements

  • 2-5 years of experience as a Data Engineer, Analytics Engineer, or similar role
  • Expert SQL skills: You write advanced, readable SQL, including CTEs and window functions, and you know how to optimize queries (see the example after this list)
  • dbt proficiency: You've built and maintained production dbt projects and understand modeling best practices (this is critical for the role)
  • Python fundamentals: Comfortable working with dataframes, querying APIs, and writing scripts for data processing
  • Modern data stack familiarity: Experience with Snowflake, AWS, and orchestration tools like dbt Cloud or Airflow
  • Code craftsmanship: You write clean, modular, well-documented code that others can easily understand and maintain
  • AI-assisted development: Proficient with AI coding tools (Claude Code, GitHub Copilot, Cursor, or similar) to accelerate development, debug efficiently, and learn new technologies quickly
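
As a rough illustration of the SQL level described above (table and column names are hypothetical), here is a CTE feeding a window function that computes each chapter's share of monthly signups:

    -- CTE: aggregate raw signups to one row per chapter per month
    with monthly_signups as (

        select
            chapter_id,
            date_trunc('month', signup_date) as signup_month,
            count(*) as signups
        from memberships
        group by 1, 2

    )

    -- window function: each chapter's share of that month's total signups
    select
        chapter_id,
        signup_month,
        signups,
        signups / sum(signups) over (partition by signup_month) as share_of_month
    from monthly_signups
    order by signup_month, chapter_id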

Benefits

  • Health insurance
  • Professional development opportunities
  • Work-life balance: Standard business hours (9-5 or similar), no on-call or off-hours expectations

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
SQL, dbt, Python, data modeling, query optimization, data transformation, ETL, reverse ETL, data architecture, data quality monitoring
Soft Skills
collaboration, code craftsmanship, documentation, problem-solving, communication