Renew Home

Senior Engineer, Data

Full-time

Location Type: Remote

Location: United States

Salary

💰 $160,000 - $200,000 per year

About the role

  • Architect and deploy secure, scalable, and highly available batch and real-time data pipelines. Implement and optimize data lake architectures for structured and unstructured data from millions of connected devices.
  • Work closely with development teams to integrate data engineering services into the broader system architecture. Collaborate with cross-functional teams consisting of engineers, data scientists, and analysts to deliver clean, reliable data.
  • Analyze and enhance the performance of Aurora PostgreSQL and Redshift databases through query tuning, indexing and partitioning strategies, and efficient resource allocation.
  • Maintain system performance, data integrity, and uptime. Manage and participate in on-call rotations and ensure strong operational standards.
  • Contribute to the design and evolution of our data architecture to support growing business needs.
  • Work with tools and platforms such as Python, Redshift, Postgres, AWS/GCP, AWS Lambda, Kinesis, Prefect (or Airflow), Redis, Git, and Terraform.
  • Participate in our agile development process, including regular team updates, stand-up meetings, and one-on-ones.

Requirements

  • 5-10+ years of industry experience.
  • Bachelor's or Master's degree in computer science or equivalent experience in the software industry.
  • Self-starter who takes initiative to identify improvement areas, rigorously tests potential solutions, and proposes actionable enhancements to drive operational success.
  • Proficiency in Python and SQL, plus solid software engineering fundamentals.
  • Hands-on experience building scalable batch and real-time data pipelines over structured and unstructured data, and experience with orchestration tools such as Prefect, Airflow, or Dagster.
  • Experience with streaming technologies like Apache Kafka, AWS Kinesis, Apache Flink, or GCP Pub/Sub.
  • Strong knowledge of data lake architectures and technologies (e.g., AWS S3, Iceberg, AWS Glue, Delta Lake, or similar).
  • Proven ability to analyze and optimize database performance, including query tuning, partitioning, indexing strategies, and resource allocation, with extensive hands-on experience in Redshift and Postgres.
  • Proficiency in using CDK and Terraform for automating infrastructure deployment and management.
  • Ability to work collaboratively with development teams, providing guidance and mentorship on data infrastructure-related issues and best practices.
  • Commitment to staying up-to-date with the latest advancements in cloud infrastructure and database technologies, and continuously improving processes and systems.
  • Nice to have:
  • Extensive experience in data warehousing best practices and familiarity with advanced Redshift features (e.g., Spectrum, workload management).
  • Exposure to machine learning pipelines or big data frameworks like Apache Spark or Hadoop.
  • Contributions to open-source data projects or relevant certifications (e.g., AWS Certified Data Analytics, GCP Professional Data Engineer).

Benefits

  • A full-time position with a competitive salary based on experience. The base salary range for this role is $160k - $200k. We use market data and consider your job family, background, skills, experience, and U.S. work location to determine compensation within our established pay range.
  • Fully remote work environment with home office set-up allowance.
  • Real and lived work-life balance - Company perks include no pre-set vacation limits (with a top-down culture of taking meaningful PTO every year!), parental leave benefits, and a corporate value of working sustainably and putting families first.
  • Competitive benefits package that includes numerous health and wellness benefits.
  • 401(k) plan with employer contributions.
  • Opportunity to work with amazing people who are passionate about their mission, thriving in a fully-remote work environment, and learning and growing every day.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Python, SQL, data pipeline architecture, query tuning, indexing, partitioning, data lake architecture, streaming technologies, infrastructure automation, data warehousing best practices
Soft Skills
self-starter, initiative, collaboration, mentorship, commitment to improvement, analytical skills, problem-solving, communication, teamwork, leadership
Certifications
AWS Certified Data Analytics, GCP Professional Data Engineer