
Data Engineer – GCP, Python
Egen
Full-time
Location Type: Remote
Location: Remote • 🇺🇸 United States
Salary
💰 $85,000 - $100,000 per year
Job Level
Mid-Level, Senior
Tech Stack
Airflow, BigQuery, Cloud, Google Cloud Platform, Linux, Python, Spark, SQL
About the role
- Design, develop, and support GCP data pipelines to extract, load, and transform data
- Maintain a holistic view of information assets by creating and maintaining artifacts (i.e., documentation) that illustrate how information is stored, processed, and supported
- Work with the project team to estimate and plan the work effort
- Attend daily team stand-ups
- Work with business analysts and project management to review business requirements and produce technical design specs that will meet the requirements
Requirements
- 3+ years of hands-on experience as a Data Engineer or Data Architect
- Proven track record leading technical projects and teams
- Expert proficiency in Google Cloud Platform (GCP) tools, including: Google Cloud Storage (GCS), BigQuery, Cloud Composer/Airflow, Dataproc/Spark
- Strong data analysis and problem-solving skills
- Advanced SQL skills, including writing, tuning, and interpreting complex SQL queries
- Experience writing and maintaining Unix/Linux shell scripts
- Solid understanding of dbt (data build tool)
- Experience with CI/CD pipelines
- Machine Learning experience
- Proficiency in developing Python-based ELT data pipelines
- Expertise in optimizing GCP BigQuery SQL queries and scripts
Benefits
- Health insurance
- Retirement plans
- Professional development opportunities
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data engineering, data architecture, Google Cloud Platform, Google Cloud Storage, BigQuery, Cloud Composer, Airflow, Dataproc, Spark, SQL
Soft skills
problem-solving, leadership, team collaboration, communication, project estimation, planning