
Senior Data Engineer, Analytics
CI&T
Full-time
Location Type: Hybrid
Location: Campinas • Brazil
About the role
- Design, develop, and maintain data pipelines (batch and streaming) for ingestion, transformation, and delivery of data for analytics and application consumption.
- Build and evolve analytical data models (bronze/silver/gold layers, data marts, star schemas, wide tables), ensuring consistency, documentation, and reusability (a brief sketch of this kind of layered transformation follows this list).
- Implement data quality best practices (tests, validations, contracts, SLAs/SLOs, monitoring of freshness/completeness/accuracy) and manage incident resolution with root cause analysis (RCA).
- Define and maintain technical data governance: catalog, lineage, versioning, naming conventions, ownership, access policies, and audit trails.
- Optimize performance and cost of queries and pipelines (partitioning, clustering, incremental loads, materializations, job tuning).
- Support the full delivery lifecycle (discovery → development → validation → operations), aligning business requirements with technical needs and ensuring predictability.
- Collaborate with BI/Analytics teams to define metrics, dimensions, facts, and the semantic layer, ensuring traceability of key indicators.
- Enable and operationalize AI/ML use cases.
- Integrate sources and systems (APIs, databases, queues, events, files), ensuring security, idempotency, fault tolerance, and end-to-end traceability.
- Produce and maintain technical and functional documentation relevant for auditing, support, and knowledge transfer.
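To make the layered-modeling and data quality bullets above concrete, here is a minimal, hypothetical sketch of a bronze-to-silver step in PySpark (Spark and Parquet are among the technologies listed under Requirements). The table paths, column names, and the quality check are illustrative assumptions, not details of the actual role.

```python
# Hypothetical bronze -> silver transformation with a simple quality gate.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_silver").getOrCreate()

# Bronze: raw ingested records, possibly duplicated or incomplete (path is illustrative).
bronze = spark.read.parquet("s3://lake/bronze/orders")

# Silver: deduplicated, typed, and validated records ready for modeling.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("created_at"))
    .filter(F.col("order_id").isNotNull())
)

# Minimal data quality check: fail the job if any order amount is negative.
assert silver.filter(F.col("amount") < 0).count() == 0, "negative amounts found"

# Partitioning by date keeps downstream queries and incremental reads cheaper.
silver.write.mode("overwrite").partitionBy("order_date").parquet("s3://lake/silver/orders")
```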
Requirements
- Proven experience as a Data Engineer focused on Analytics (building pipelines, modeling, and making data available for consumption).
- Strong command of SQL and solid experience with Python (or an equivalent language) for data engineering and automation.
- Experience with orchestration and workflow design (e.g., Airflow, Dagster, Prefect, or similar); a brief orchestration sketch appears after this list.
- Experience with data warehouses/lakehouses and analytical formats/architectures (e.g., BigQuery, Snowflake, Databricks, Spark; Parquet, Delta, Iceberg).
- Hands-on experience with ETL/ELT, incremental loads (CDC when applicable), partitioning, and performance/cost optimization.
- Knowledge of data quality and reliability best practices (data testing, observability, metrics, incident management, and RCA).
- Experience with version control (Git) and delivery practices (code review, branching patterns, basic CI).
- Strong verbal and written communication skills for interacting with technical teams and stakeholders, with the ability to translate requirements into clear deliverables.
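As a small illustration of the orchestration and incremental-load experience asked for above, the following sketch is a hypothetical Airflow DAG that runs a daily, watermark-based extract. The DAG id, task, and helper function are invented for the example and do not describe CI&T's actual pipelines.

```python
# Hypothetical daily incremental-load DAG (Airflow 2.x style).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(ds: str, **_) -> None:
    # `ds` is the logical run date Airflow passes in; using it as a watermark
    # keeps the load incremental and the task idempotent across retries.
    print(f"extracting orders updated on {ds}")


with DAG(
    dag_id="orders_incremental_load",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_orders",
        python_callable=extract_orders,
    )
```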
Benefits
- Health and dental insurance;
- Meal and grocery allowance;
- Childcare assistance;
- Extended parental leave;
- Partnerships with gyms and health/wellness professionals via Wellhub (Gympass) and TotalPass;
- Profit-sharing program;
- Life insurance;
- Continuous learning platform (CI&T University);
- Employee discount club;
- Free online platform dedicated to physical and mental health and wellbeing;
- Pregnancy and responsible parenting course;
- Partnerships with online course platforms;
- Language learning platform;
- And many more.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data pipelines, analytical modeling, data quality best practices, technical data governance, performance optimization, SQL, Python, ETL, data warehouses, incremental loads
Soft Skills
communication skills, collaboration, incident management, root cause analysis, documentation