CI&T

Principal Data Architect

full-time

Location: Remote • Colorado • 🇺🇸 United States

Job Level

Lead

Tech Stack

Airflow • AWS • Cloud • Docker • ETL • GraphQL • Kafka • Kubernetes • Python • SQL • Vault

About the role

  • Define, architect, and implement scalable data platforms and end-to-end ELT pipelines aligned with modern Lakehouse principles.
  • Work closely with cross-functional teams across the US, Colombia, and Brazil to ensure that our data ecosystem is reliable, future-proof, and aligned with enterprise architecture standards.
  • This position requires deep technical expertise, strong architectural thinking, and the ability to influence and mentor engineering teams.
  • Collaborate with global stakeholders, present architectural recommendations, and ensure alignment across distributed teams.

Requirements

  • Expert-level SQL with demonstrated ability to optimize, refactor, and validate large-scale transformations.
  • Advanced Python (or similar) for automation, orchestration, and pipeline development.
  • Hands-on architecture and engineering experience with Snowflake, including performance tuning, security, data governance, dynamic tables, and workload management.
  • Advanced dbt expertise, including transformation logic, testing, documentation, deployment patterns, and CI/CD integration.
  • Proven production experience with Data Vault 2.0, including Hubs, Links, Satellites, PIT tables, multi-active satellites, and Business Vault patterns.
  • Experience with AutomateDV or equivalent frameworks is a strong asset.
  • Deep understanding of Data Lakehouse architectures, including medallion zone structures, incremental ingestion, and open table formats such as Iceberg and Delta (Hudi is a plus).
  • Solid foundation in data modeling best practices, including normalized models, dimensional modeling, historization, and scalable enterprise patterns.
  • Ability to translate complex business requirements into robust, extensible architectural designs.
  • Experience orchestrating ELT/ETL workflows using Airflow, including DAG design, dependency strategies, and dynamic task generation.
  • Familiarity with modern orchestration frameworks such as Prefect, Dagster, or AWS Glue.
  • Comfort with CI/CD pipelines using GitHub Actions or similar tools, integrating dbt testing and Snowflake deployments.
  • Understanding of infrastructure automation, configuration-as-code, and environment management.

Nice to Have

  • Experience with data observability platforms (Monte Carlo, Datafold, Great Expectations).
  • Knowledge of Docker or Kubernetes for reproducibility and scalable deployments.
  • Familiarity with Kafka, AMQP, or other message brokers and event-driven architectures.
  • Experience working with REST/GraphQL APIs, streaming ingestion (Kinesis, Firehose), or real-time processing.
  • Experience supporting hybrid architectures, multi-cloud designs, or enterprise Lakehouse strategies.
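The Data Vault 2.0 requirement above (Hubs keyed on hashed business keys, insert-only loads) can be illustrated with a rough sketch. This is a hypothetical, simplified Python example (the names `hash_key` and `load_hub` are invented here); in practice this logic would live in dbt or AutomateDV macros compiled to Snowflake SQL:

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Data Vault 2.0-style hash key: trim and uppercase each business
    key, join with a delimiter, then hash (MD5 shown for brevity)."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def load_hub(hub: dict, staged_keys: list) -> dict:
    """Insert-only hub load: a staged business key is added only when
    its hash key has not been seen before, so re-running the load is
    idempotent."""
    for bk in staged_keys:
        hk = hash_key(bk)
        if hk not in hub:
            hub[hk] = {"customer_bk": bk.strip().upper()}
    return hub

# Duplicate keys that differ only in case/whitespace collapse to one row.
hub = {}
load_hub(hub, ["cust-001", " CUST-001 ", "cust-002"])
```

The normalization step (trim, uppercase) before hashing is what makes the hash key stable across source systems with inconsistent formatting; the same pattern extends to Links (hash of multiple business keys) and Satellites (hash diff for change detection).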

Benefits

  • Premium Healthcare
  • Meal voucher
  • Maternity and parental leave
  • Mobile services subsidy
  • Sick pay
  • Life insurance
  • CI&T University
  • Colombian Holidays
  • Paid Vacations

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
SQL • Python • Snowflake • dbt • Data Vault 2.0 • Data Lakehouse • Airflow • CI/CD • Docker • Kubernetes
Soft skills
architectural thinking • influence • mentoring • collaboration • communication