Salary
💰 $125,000 - $175,000 per year
Tech Stack
AWS, Azure, Cloud, Distributed Systems, Google Cloud Platform, PySpark, Python, SQL
About the role
- Design, develop, and maintain scalable data pipelines and processes to support data warehouses and lakehouses.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Maintain best practices for data governance, data quality, and data security.
- Troubleshoot and resolve data-related issues in a timely manner.
- Implement and maintain security practices for sensitive data.
Requirements
- 3+ years' experience in a software or data engineering role.
- Consulting experience is **required**.
- Professional experience with **Databricks, PySpark, and Snowflake**.
- Experience with SQL and relational databases.
- Professional experience programming with Python.
- Basic understanding of distributed systems and cloud computing platforms (e.g., AWS, Azure, GCP).
- Experience in designing, developing, and maintaining data pipelines and infrastructure.
- Ability to work independently and as part of an agile team in a larger corporate environment.
- Excellent communication skills with the ability to relay information between technical and non-technical teams.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data engineering, data pipelines, SQL, Python, Databricks, PySpark, Snowflake, distributed systems, cloud computing
Soft skills
communication, collaboration, troubleshooting, independence, agile teamwork