Sicredi

Senior Data Engineer

Employment Type: Full-time

Location Type: Remote

Location: Brazil

About the role

  • **Build** scalable, resilient and high-performance data pipelines on Databricks and AWS.
  • **Develop** solutions in Python and SQL, applying best practices, patterns and automation.
  • **Implement** data models and processing layers based on the Lakehouse architecture.
  • **Optimize** Spark jobs, ensuring efficiency, governance and reliability of data workloads.
  • **Integrate** different data sources, ensuring security, quality and compliance.
  • **Collaborate** with architects and engineers to define technical solutions, standards and engineering practices.
  • **Support** the squads that use the platform, providing technical guidance and helping them adopt solutions.
  • **Monitor** pipelines, metrics and performance, proactively addressing issues.
  • **Document** solutions, workflows, processes and engineering standards.
  • **Contribute** to modernization, automation and continuous improvement initiatives for the Data & AI Platform.

Requirements

  • Bachelor's degree in a technology-related field (Engineering, Computer Science, Systems or related areas).
  • Strong experience with **Databricks** (required), including Spark, Delta Lake, jobs and optimization.
  • Advanced knowledge of **AWS**, especially S3, Glue, IAM, Lambda and big data services.
  • Proficiency in **Python** and **SQL** for building pipelines and engineering processes.
  • Experience with modern data architectures (Data Lakehouse, batch/streaming).
  • Experience with engineering practices: CI/CD, IaC, version control, automation.
  • Knowledge of security, governance and data-oriented architecture principles.
  • Ability to work autonomously, with an ownership mindset and long-term technical vision.

Preferred Requirements

  • Experience with **Teradata**.
  • Experience with distributed systems and Spark performance tuning.
  • Experience with event-driven and streaming architectures (Kafka, Kinesis).
  • Familiarity with orchestration tools (Airflow, Step Functions, Databricks Workflows).
  • AWS or Databricks certifications.
  • Experience with observability and monitoring of pipelines.
  • Participation in data platform modernization or migration projects.

Benefits

  • 14th and 15th fixed salaries.
  • Profit-sharing (based on seniority).
  • Health and Dental plans with no co-pay.
  • Wellbeing programs via Wellhub (formerly Gympass): Nutrition, Psychology, Occupational Health, Massage, running group and local gym.
  • Meal and Food allowances (VA/VR) – flexible percentage allocation between cards, with no co-pay.
  • Extended maternity and paternity leave.
  • Childcare or nanny allowance for children up to 6 years and 11 months.
  • Support for children with disabilities, with no age limit.
  • Life insurance.
  • Private pension plan up to 8% of salary.
  • Training platform – Sicredi Aprende, offering a variety of courses.
  • 40-hour workweek – using a time bank system.
  • Remote work allowance (except for positions that are 100% on-site).
Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Python, SQL, Databricks, Spark, Delta Lake, AWS, CI/CD, IaC, automation, data architectures
Soft Skills
collaboration, technical guidance, autonomy, ownership mindset, long-term technical vision
Certifications
AWS certification, Databricks certification