Blue Nile

Data Engineer III

Full-time

Location Type: Remote

Location: New York, United States

Salary

💰 $147,000 - $172,000 per year

Job Level

About the role

  • Design, implement, and maintain complex data pipelines, ensuring scalability and reliability using Airflow, dbt, Rivery, Python, and SQL, enabling robust ingestion and transformation of structured and semi-structured data.
  • Serve as a strategic partner to business teams, working closely with stakeholders to translate high-level goals into data solutions that support forecasting, performance tracking, and optimization.
  • Develop and maintain clean, well-documented data models in Snowflake and BigQuery that support analytics, reporting, and operational workflows, and contribute to architecture decisions.
  • Integrate data from a variety of internal and external sources, including Google Analytics and third-party APIs, to support full-funnel visibility across departments.
  • Enable self-service analytics by ensuring data assets are discoverable and usable via tools such as Tableau, including thoughtful semantic layer design and performance tuning.
  • Contribute to the development of robust monitoring and observability practices for data quality and pipeline health.
  • Collaborate on architecture and design decisions, including cloud infrastructure and containerization using AWS, Pulumi, and Docker.
  • Maintain strong documentation and promote engineering standards that ensure transparency, maintainability, and reusability of data systems.

Requirements

  • 7+ years of professional experience in data engineering, analytics engineering, or related roles.
  • Advanced proficiency in SQL and Python, with expertise in efficient query writing, data structures, and software engineering principles.
  • Hands-on experience with Snowflake and/or BigQuery, including data modeling and performance optimization.
  • Proficiency with orchestration tools (e.g., Airflow) and data integration tools like dbt.
  • Experience working with cloud platforms, especially AWS, for data storage, compute, and infrastructure management, including services such as AWS Batch, ECR, and Lambda.
  • Familiarity with data analytics and visualization tools, particularly Tableau, and ability to support data consumers in building actionable dashboards.
  • Experience with marketing and product data sources, including Google Analytics and similar platforms.
  • Strong knowledge of ETL/ELT design and data warehousing solutions.
  • Familiarity with CI/CD pipelines and DevOps practices for data engineering.
  • Strong skills in the Microsoft Office suite for documentation and collaboration.
  • Robust experience with API design and integration.
  • Familiarity with Scrum development methodologies and tools like Jira.

Benefits

  • Paid Time Off
  • Medical, Dental, Vision and Prescription Insurance
  • 401(k) Retirement Plan with Company Match
  • Flexible Spending Account | Health Savings Account
  • Tuition Reimbursement
  • Employee Discount
  • Parental Leave
  • Life Insurance

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools

SQL, Python, Snowflake, BigQuery, Airflow, dbt, AWS, ETL, API design, data modeling

Soft Skills

collaboration, communication, documentation, problem-solving, stakeholder engagement, performance optimization, data visualization support, engineering standards promotion, transparency, maintainability