Lead Engineer, Data Platform

Knowledge Anywhere

Full-time

🇺🇸 United States

Salary

💰 $150,000 - $190,000 per year

Job Level

Senior

Tech Stack

Apache, AWS, Cloud, ETL, Java, Kafka, MySQL, Oracle, Python, SQL

About the role

  • Anywhere is at the forefront of digital transformation, building best-in-class products that help our agents and brokers sell more homes, make more money, and work more efficiently.
  • We're seeking a talented, creative, and motivated Lead Engineer who enjoys building data platform tools and is eager to collaborate with a team that shares that passion.
  • You’ll work with other Data Engineers on the build-out of the Next Generation Data Platform.
  • You’ll design and develop a Data Ingestion Service for real-time streaming of data from SQL Server, MySQL, and Oracle using CDC-based technologies (a brief illustrative sketch follows this list).
  • You’ll design and develop a Data Ingestion Service for real-time streaming of data from third-party APIs, internal micro-services, and files stored in S3/SFTP servers.
  • You’ll work with the team to design and develop a Data Platform Storage Optimization & Delta Detection Service using Apache Iceberg.
  • You’ll work with the team to design and develop a Data Catalog Service using Snowflake Horizon and Polaris.
  • You’ll work with the team to design and develop Data Observability using DataDog and Data Recon to detect data anomalies.
  • You’ll design and develop a CI/CD process for continuous delivery in AWS Cloud and Snowflake.
  • You’ll design, develop, and test robust, scalable data platform components.
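
For a flavor of the CDC-based ingestion work described above, the sketch below shows one minimal, purely illustrative way to register a Debezium MySQL source connector with a Kafka Connect worker from Python. Every hostname, credential, database name, and topic prefix is a placeholder, not part of this posting.

    import json
    import urllib.request

    # Illustrative Debezium MySQL source connector for CDC-based ingestion.
    # All connection details below are placeholders.
    connector = {
        "name": "mysql-cdc-source",
        "config": {
            "connector.class": "io.debezium.connector.mysql.MySqlConnector",
            "database.hostname": "mysql.example.internal",    # placeholder host
            "database.port": "3306",
            "database.user": "cdc_user",                       # placeholder credentials
            "database.password": "********",
            "database.server.id": "184054",                    # must be unique per connector
            "topic.prefix": "ingest",                          # prefix for change-event topics
            "database.include.list": "app_db",                 # placeholder database to capture
            "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
            "schema.history.internal.kafka.topic": "schema-changes.app_db",
        },
    }

    # Register the connector via the Kafka Connect REST API (POST /connectors).
    req = urllib.request.Request(
        "http://connect.example.internal:8083/connectors",    # placeholder Connect worker URL
        data=json.dumps(connector).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode("utf-8"))

The Oracle and SQL Server sources follow the same pattern with their respective Debezium connector classes; the resulting change-event topics then feed the downstream storage, delta detection, and catalog services named above.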

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or a related technical discipline, or an equivalent combination of training and experience.
  • 10+ years of programming experience building application frameworks and back-end systems for high-volume pipelines using Java/Python.
  • 10+ years of experience building data frameworks and platforms, scaling them to handle large volumes of data.
  • 5+ years of experience building streaming platforms using Apache Kafka, Confluent Kafka, and Amazon Managed Streaming for Apache Kafka (MSK).
  • 5+ years of experience ingesting data from SQL Server/MySQL/Oracle using Change Data Capture, Debezium, and Kafka Connect.
  • 5 years of experience using AWS Data Services: DMS, EMR, Glue, Athena, S3, and Lambda.
  • 2 years of experience building Platform Monitoring using DataDog.
  • 2 years of experience building Data Observability using Monte Carlo.
  • 2 years of experience building data solutions using Apache Iceberg and Apache Hudi.
  • 1 year of experience with data architecture, ETL, and processing of structured and unstructured data.
  • 5 years of experience with DevOps tools (any combination of GitLab, Bitbucket) and methodologies (Lean, Agile, Scrum, Test-Driven Development).
  • Excellent communication skills to convey information across various levels of seniority.
  • A team player who enjoys being part of a cross-functional setup.
  • Ability to perform well on time-critical work, often on multiple fronts at the same time.
  • Strong dedication to quality and a client-first mindset.
  • A passion for learning and continuous improvement.