Join Creative Tech

Senior Data Architect

Full-time

Location Type: Remote

Location: Brazil

About the role

  • Provide technical leadership for data architecture in a Lakehouse environment, ensuring scalability, performance, and security;
  • Act as a bridge between business areas, data engineering, and analytics teams, translating needs into technical data solutions;
  • Define, evolve, and standardize data architecture on the Databricks platform (clusters, notebooks, Delta Lake, Delta Live Tables – DLT);
  • Design and implement batch and streaming data pipelines, including full and incremental loads (CDC) from source systems (e.g., Oracle) into the Lakehouse;
  • Ensure data governance best practices, including cataloging, security, lineage, and technical documentation;
  • Define partitioning, modeling, and optimization strategies for Delta Lake tables to support high data volumes;
  • Support data development teams in resolving complex performance, architecture, and data quality issues;
  • Conduct technical reviews of code (Python, SQL, notebooks) and data solutions, ensuring adherence to defined standards;
  • Participate in impact analysis, solution design, and architectural decision-making for new projects and platform evolutions;
  • Provide production support and troubleshooting for pipelines and critical platform components in production environments;
  • Collaborate on defining and implementing Data Quality practices (rules, monitoring, alerts) and data observability;
  • Help organize the technical backlog of the data platform and plan deliveries together with the team;
  • Identify technical risks and propose structural improvements to the data architecture, focusing on reliability, cost, and performance.

Requirements

  • Bachelor’s degree in IT or a related field;
  • Proven experience with Databricks;
  • Proven experience as a Data Architect or in similar roles;
  • Specific experience with Delta Live Tables (DLT) for orchestration, processing, and data quality;
  • Experience modernizing data warehouses or legacy systems to a Lakehouse architecture;
  • Experience defining and evolving cloud-based data architectures;
  • Knowledge of data integration using JDBC/ODBC connectors, Spark Structured Streaming, and REST APIs;
  • Experience with partitioning strategies, Z-Order, and tuning/optimization of Delta Lake tables;
  • Knowledge of Data Quality and Data Observability practices (monitoring, alerts, data SLOs);
  • Knowledge of data governance and metadata management (Unity Catalog or similar catalogs);
  • Experience with agile methodologies applied to data teams;
  • Experience with CI/CD pipelines for data projects (deploying notebooks, jobs, infrastructure as code, etc.);
  • Advanced knowledge of data modeling, database design, and database management systems (e.g., SQL, NoSQL, Hadoop);
  • Experience with ETL tools and data integration;
  • Knowledge of Cloud Computing and related services (e.g., AWS, Azure, Google Cloud);
  • Expertise in data governance and security practices;
  • Strong analytical and problem-solving skills;
  • Good communication skills and the ability to work in cross-functional teams.

Benefits

  • Infrastructure allowance;
  • Flexible working hours;
  • Day off on your birthday – with a surprise;
  • Support for training and certifications;
  • Access to Alura;
  • Partnership with FIAP;
  • Referral bonus;
  • Health insurance;
  • Dental plan;
  • Vittude – mental health platform;
  • Wellhub – for physical health;
  • New Value – discount coupons.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data architecture, Databricks, Delta Live Tables, Python, SQL, data integration, Spark Structured Streaming, ETL, data modeling, database design
Soft Skills
analytical skills, problem-solving skills, communication skills, collaboration, technical leadership