Tech Stack
Airflow, BigQuery, Cloud, Google Cloud Platform, Python, SQL, Terraform
About the role
- Work on ingesting, transforming, and analyzing large datasets to support the Enterprise Securitization Solution
- Operationalize data lakes, data warehouses, and analytics platforms on Google Cloud Platform or other cloud environments
- Work in a collaborative environment that leverages pair programming
- Work on a small agile team to deliver curated data products
- Work effectively with product owners, data champions, and other technical experts
- Advocate for well-designed solutions through technical knowledge and communication skills
- Develop analytical data products on GCP using both streaming and batch ingestion patterns, applying solid data warehouse principles
- Serve as a Subject Matter Expert in Data Engineering, with a focus on GCP-native services and integrated third-party technologies
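The "solid data warehouse principles" mentioned above include making batch loads idempotent, so a retried or late-running job never duplicates data. A minimal sketch of that pattern, using a plain Python dict to stand in for a date-partitioned warehouse table (the names and helper here are hypothetical, not a real BigQuery API):

```python
from datetime import date

# Toy "warehouse table" keyed by partition date; real pipelines would
# target BigQuery date-partitioned tables instead of this dict.
warehouse = {}

def load_partition(table, partition_day, rows):
    """Idempotent batch load: overwrite the whole partition rather than
    append, so re-running the same job for the same day cannot
    duplicate rows."""
    table[partition_day] = list(rows)

# A first run and a retry of the same day produce identical state.
load_partition(warehouse, date(2024, 1, 1), [{"id": 1}, {"id": 2}])
load_partition(warehouse, date(2024, 1, 1), [{"id": 1}, {"id": 2}])
```

Overwriting by partition (rather than appending) is one common way to keep daily batch jobs safely re-runnable; streaming ingestion typically needs a different deduplication strategy.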
Requirements
- Bachelor's degree in Computer Science, Engineering, or a related technical field
- 5+ years of SQL development experience
- 5+ years of analytics/data product development experience
- 3+ years of Google Cloud experience, with solutions designed and implemented at production scale
- Experience with GCP native services like BigQuery, Google Cloud Storage, Dataflow, Dataproc
- 2+ years of experience with Airflow for scheduling and orchestrating data pipelines
- 1+ years of experience with Terraform to provision Infrastructure as Code
- 2+ years of professional development experience in Python
- Visa sponsorship is not available for this position
- Candidates must be legally authorized to work in the United States
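The Airflow experience listed above centers on expressing pipelines as DAGs, where each task runs only after its upstream dependencies complete. A stdlib-only sketch of that dependency-ordering idea, using Python's `graphlib` rather than Airflow's own API (the task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of upstream tasks
# it depends on, mirroring how an Airflow DAG encodes dependencies.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# A topological sort yields a valid execution order: every task
# appears after all of its upstream dependencies.
order = list(TopologicalSorter(deps).static_order())
```

Airflow's scheduler does far more (retries, backfills, sensors), but at its core it resolves exactly this kind of ordering before dispatching tasks.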