
Data Engineer
Thaloz
Full-time
Location Type: Remote
Location: Remote • 🇧🇷 Brazil
Job Level
Mid-Level / Senior
Tech Stack
Azure, ETL, PySpark, Python, Spark, SQL
About the role
- Designing and developing data pipelines to extract, transform, and load data from various sources into the data warehouse, leveraging Python and notebooks.
- Writing complex SQL queries for data extraction and manipulation from the data warehouse.
- Building and maintaining ETL processes using Azure Databricks with PySpark.
- Implementing data integration workflows using Azure Data Factory.
- Collaborating with cross-functional teams, including developers, data analysts, and business stakeholders, to understand requirements and deliver high-quality solutions.
- Optimizing performance of the data pipelines and ensuring scalability and reliability of the systems.
- Monitoring data quality and troubleshooting issues in collaboration with the operations team.
- Maintaining documentation of the design and implementation of the data pipelines.
- Collaborating on coding best practices while keeping the team informed of new business-logic transformations.
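To illustrate the kind of SQL-plus-Python pipeline work described above, here is a minimal extract-transform-load sketch. It uses the standard-library sqlite3 module as a stand-in for the data warehouse (the role targets Azure Databricks); all table and column names are hypothetical:

```python
import sqlite3

# Stand-in warehouse: an in-memory SQLite database. In the actual role this
# would be Azure Databricks / a cloud warehouse; names here are illustrative.
conn = sqlite3.connect(":memory:")

# Extract: load raw source data into a staging table.
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL, status TEXT);
    INSERT INTO raw_orders VALUES
        (1, 'acme',   120.0, 'shipped'),
        (2, 'acme',    80.0, 'cancelled'),
        (3, 'globex',  45.5, 'shipped');
""")

# Transform: a SQL query that filters and aggregates shipped revenue per
# customer -- the kind of extraction/manipulation query the role calls for.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS revenue
    FROM raw_orders
    WHERE status = 'shipped'
    GROUP BY customer
    ORDER BY customer
""").fetchall()

# Load: write the aggregate into a reporting table.
conn.execute("CREATE TABLE customer_revenue (customer TEXT, revenue REAL)")
conn.executemany("INSERT INTO customer_revenue VALUES (?, ?)", rows)
conn.commit()

print(rows)  # [('acme', 120.0), ('globex', 45.5)]
```

On the job the same extract/transform/load shape would be expressed with PySpark DataFrames and Delta tables rather than sqlite3, but the structure of the work is the same.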
Requirements
- Expertise in SQL, ideally with working knowledge of data warehousing concepts.
- Strong firsthand experience with Azure Databricks and Spark.
- Proficiency in designing and implementing data integration workflows using Azure Data Factory.
- Proficiency in Python programming and the ability to develop scalable data engineering pipelines in Python.
- Solid understanding of data engineering fundamentals, including data modeling, data transformation, change data capture, and performance optimization techniques.
- Experience working with Azure Data Lake for storing large data sets, maintaining Parquet/Delta tables, and performing efficient querying.
- Experience with version control systems and familiarity with CI/CD practices.
- Strong interpersonal skills, ability to clearly communicate, and voice concerns in a group setting.
- A self-starting, self-reliant approach, with a willingness to learn business logic and handle critical faults. Candidates should be able to understand business requirements independently, without relying on subject matter experts for ongoing explanations.
- Ability to collaborate effectively in planning and refinement sessions.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, SQL, ETL, Azure Databricks, PySpark, Azure Data Factory, data modeling, data transformation, change data capture, performance optimization
Soft skills
interpersonal skills, communication, initiative, self-reliance, collaboration, problem-solving, teamwork, adaptability, critical thinking, independence