Salary
💰 $127,500 - $185,150 per year
Tech Stack
AWS, Cloud, ETL, Java, MySQL, Postgres, Python, Scala, Spark, SQL
About the role
- Design, build, and maintain scalable data pipelines and systems
- Work with programming languages like Python, Java, and SQL
- Leverage big data technologies such as Spark
- Collaborate with cross-functional teams to understand business requirements
- Ensure data quality and integrity
- Design, develop, and deploy modular data pipelines
- Ensure efficient functioning of data storage and processes
- Interact with clients, provide cloud support, and make recommendations
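The pipeline responsibilities above follow the classic extract-transform-load pattern. A minimal sketch in plain Python, using the stdlib sqlite3 module as a stand-in for MySQL/PostgreSQL (table and column names are hypothetical, for illustration only):

```python
import sqlite3

def run_etl(conn):
    # Extract: read raw order rows from the source table.
    rows = conn.execute("SELECT id, amount_cents FROM raw_orders").fetchall()
    # Transform: convert cents to dollars and drop non-positive
    # amounts (a simple data-quality check).
    clean = [(oid, cents / 100.0) for oid, cents in rows if cents > 0]
    # Load: write the cleaned rows into the analytics table.
    conn.executemany(
        "INSERT INTO orders_clean (id, amount_usd) VALUES (?, ?)", clean
    )
    conn.commit()
    return len(clean)

# In-memory SQLite keeps the sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
conn.execute("CREATE TABLE orders_clean (id INTEGER, amount_usd REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)", [(1, 1999), (2, -50), (3, 500)]
)
print(run_etl(conn))  # 2 rows pass the quality filter
```

In production this shape is typically expressed as a Spark or AWS Glue job rather than row-at-a-time Python, but the extract/transform/load boundaries stay the same.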
Requirements
- 5+ years of experience in data engineering, including building pipelines
- 3+ years of experience with version control (GitLab), CI/CD pipelines, and containerization tools
- 3+ years of experience with AWS Glue, Spark, and other big data processing frameworks
- Proficiency in languages such as Python, Java, Scala, or SQL.
- Experience with relational databases (e.g., MySQL, PostgreSQL)
- Experience with ETL (Extract, Transform, Load) processes and tools
- Experience with AWS GovCloud
- Experience translating business requirements into technical specifications
- Experience with cross-functional collaboration
- Bachelor's degree in a technical field
Benefits
- Health insurance
- Flexible spending accounts
- Health savings accounts
- Retirement savings plans
- Life and disability insurance programs
- Paid time off
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, Java, SQL, Spark, ETL, GitLab, CI/CD, Glue, Scala, MySQL
Soft skills
collaboration, communication, business requirements translation, data quality assurance, cross-functional teamwork
Education
Bachelor's degree in a technical field