Lean Tech

Senior Data Engineer

Full-time

Location: Remote • 🇨🇴 Colombia

Job Level

Senior

Tech Stack

AWS • Azure • Cloud • Docker • Google Cloud Platform • Kafka • Kubernetes • Oracle • Postgres • SQL • Terraform

About the role

  • Design, build, and maintain advanced data models and transformations using DBT across Materialize and Snowflake environments (see the dbt sketch after this list).
  • Develop and optimize expert-level SQL queries, views, and stored procedures, focusing on compute cost, memory usage, and advanced indexing strategies.
  • Translate business and BI requirements into scalable semantic models and curated tables to support real-time dashboards and reporting.
  • Monitor, tune, and optimize Materialize cluster usage, managing compute resource sizing and memory performance.
  • Troubleshoot and resolve upstream data issues by collaborating with Data Platform engineers on components such as CDC connectors, pipelines, or message flows.
  • Participate in schema design, data quality assessments, and the implementation of data governance best practices.
  • Collaborate with BI engineers to define data models that effectively support analytical and reporting requirements.
  • Analyze and support streaming-enabled architectures, including data flows from CDC, Kafka, and Materialize into Snowflake.
  • Support infrastructure tasks by understanding Infrastructure as Code (IaC) deployments, reviewing containerized flows, and exploring system logs.
  • Engage in continuous improvement efforts focused on pipeline reliability, performance tuning, cost optimization, and technical documentation.
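
To make the dbt work above concrete, here is a minimal sketch of an incremental dbt model, assuming a Snowflake-style target; the path and model names (fct_orders, stg_orders) are hypothetical illustrations, not Lean Tech code.

    -- models/marts/fct_orders.sql (hypothetical path and names)
    -- Incremental materialization: each run processes only rows changed
    -- since the last run, which keeps warehouse compute costs down.
    {{ config(materialized='incremental', unique_key='order_id') }}

    select
        order_id,
        customer_id,
        order_status,
        order_total,
        updated_at
    from {{ ref('stg_orders') }}
    {% if is_incremental() %}
    -- {{ this }} resolves to the already-built target table.
    where updated_at > (select max(updated_at) from {{ this }})
    {% endif %}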

Requirements

  • Expert-level SQL proficiency, including query optimization, indexing strategies, understanding of database engine behavior, and the ability to write complex transformations.
  • Hands-on experience with DBT for data transformation and modeling.
  • Strong understanding of relational database concepts, including schemas, views, indexes, and query plans.
  • Experience working with modern data warehouses such as Snowflake.
  • Solid understanding of SQL-based stored procedures or functions, preferably in PostgreSQL; experience with other engines such as Oracle or SQL Server is also valuable.
  • Experience with streaming-enabled databases like Materialize, including an understanding of compute resource usage and cluster sizing (see the Materialize sketch after this list).
  • Ability to debug and troubleshoot upstream pipeline issues related to CDC, connectors, or ingestion workflows.
  • Familiarity with streaming and real-time systems concepts, such as Kafka and JSON message consumption patterns.
  • Experience working in modern cloud environments (AWS, GCP, Azure, or Oracle).
  • General software engineering skills, including the ability to understand data flows, investigate logs, and reason about deployment components.
  • Conceptual familiarity with container technologies such as Docker and orchestration tools like ECS or Kubernetes.
  • Conceptual understanding of Infrastructure as Code (IaC) tools, such as Terraform or CloudFormation.
  • Ability to handle schema evolution challenges, including adjusting models when upstream schemas change (e.g., new fields, nullability changes, removed fields); see the schema-evolution sketch after this list.
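
As a rough illustration of the Materialize requirement, the sketch below creates a dedicated compute cluster and an incrementally maintained view over a hypothetical orders source; the size value is a placeholder, since valid sizes depend on the Materialize version and plan.

    -- Hypothetical Materialize objects; '25cc' is a placeholder size.
    CREATE CLUSTER reporting_compute (SIZE = '25cc');

    -- Materialize maintains this aggregate incrementally as new CDC/Kafka
    -- rows arrive, so dashboards read fresh results without re-running
    -- the query against the full history.
    CREATE MATERIALIZED VIEW order_totals_by_status
        IN CLUSTER reporting_compute AS
    SELECT order_status, count(*) AS order_count, sum(order_total) AS total
    FROM orders
    GROUP BY order_status;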
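
For the schema-evolution point, a small PostgreSQL sketch with hypothetical table and field names: jsonb's ->> operator returns NULL for absent keys, so a curated view keeps working while upstream producers add or drop fields, and an expression index covers the common lookup.

    -- Hypothetical raw table fed by a Kafka consumer or CDC connector.
    create table if not exists raw_events (
        event_id    bigint primary key,
        payload     jsonb not null,
        ingested_at timestamptz not null default now()
    );

    -- ->> yields NULL when a key is missing, so the view tolerates
    -- producers that have not yet started sending 'currency'.
    create or replace view curated_events as
    select
        event_id,
        payload ->> 'order_id'                  as order_id,
        (payload ->> 'amount')::numeric         as amount,
        coalesce(payload ->> 'currency', 'USD') as currency,
        ingested_at
    from raw_events;

    -- Expression index to speed up frequent filters on a JSON key.
    create index if not exists idx_raw_events_order_id
        on raw_events ((payload ->> 'order_id'));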

Benefits

  • Professional development opportunities with international customers
  • Collaborative work environment
  • Career paths and mentorship programs that support advancement to new levels

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
SQL • DBT • data modeling • query optimization • indexing strategies • stored procedures • streaming databases • data governance • Infrastructure as Code • schema evolution
Soft skills
collaboration • troubleshooting • continuous improvement • analytical thinking • problem-solving