Tech Stack
Airflow, Apache, AWS, Azure, Cloud, ETL, Google Cloud Platform, NoSQL, PySpark, Python, Spark, SQL
About the role
- Build and optimise data infrastructure to support advanced analytics and predictive modeling.
- Design, build, and optimise data pipelines, data lakes, and data warehouses using AWS and Databricks.
- Manage and maintain AWS and Databricks environments for optimal performance and uptime.
- Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring.
- Collaborate with cross-functional teams to translate business needs into technical solutions.
- Explore and implement new tools and technologies to enhance ETL platform performance.
Requirements
- Degree in computer science or a related subject.
- Proficient in SQL for complex data extraction and performance optimisation.
- Skilled in Python, PySpark, and Airflow for building scalable ETL processes.
- Experience with SQL/NoSQL and vector databases for large language models.
- Familiarity with data modeling and performance tuning for OLAP and OLTP systems.
- Knowledge of Apache Spark, Apache Airflow, and DevOps practices.
- Experience with cloud platforms such as AWS, GCP, or Azure.
Benefits
- Vast opportunities to learn, grow, and advance across our global organization.
- A diverse and inclusive culture where your voice is heard and your ideas matter.
- A comprehensive Amgen Total Rewards Plan covering health, financial, and career benefits.
- Flexible work arrangements to support work-life balance.
- Reasonable accommodations for individuals with disabilities throughout application, interview, and employment.
ATS Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
SQL, Python, PySpark, Airflow, ETL, data modeling, performance tuning, Apache Spark, DevOps, NoSQL
Soft skills
collaboration, communication, problem-solving, analytical thinking