Build an integrated, scalable, secure, and robust Analytics Platform that enables internal teams and customers to leverage data for insights, decision-making, and innovation.
Architect and develop high-quality data assets for both product and AI/ML use cases.
Design and implement scalable data pipelines to enable self-service analytics, feature engineering, and ML-driven insights.
Develop data transformations and establish robust data models, ensuring high data quality and performance.
Implement DataOps best practices, including CI/CD pipelines for analytics, ensuring data reliability through automated testing and documentation (a minimal pipeline sketch follows this list).
Ensure compliance with evolving data protection regulations, maintaining data security and privacy.
Mentor team members to enhance their skills and improve team performance.
Promote trust in data quality and implement tools to meet regulatory and business needs.
Present deep dives on both technical and product-related aspects to stakeholders.
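The pipeline and quality responsibilities above tend to reduce to concrete artifacts like the sketch below: a small daily Airflow DAG (Airflow being one of the tools named under Requirements) whose final task is an automated quality gate that fails the run before bad data reaches consumers. This is a minimal illustration, not a prescribed design; the DAG id, task bodies, and threshold are hypothetical placeholders.

```python
# Minimal sketch of a daily pipeline ending in an automated quality gate.
# All names (dag_id, tables, thresholds) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder: land raw records from a source system."""


def transform():
    """Placeholder: apply business logic to build the data model."""


def validate():
    """Automated test: raise to fail the run and block downstream consumers."""
    row_count = 1  # placeholder; in practice, count rows in the transformed output
    if row_count <= 0:
        raise ValueError("quality gate failed: transformed output is empty")


with DAG(
    dag_id="orders_daily",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # "schedule_interval" on Airflow versions before 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)

    extract_task >> transform_task >> validate_task
```

In a CI/CD setup, the same repository would also run the DAG's unit tests and static checks on every change, so pipeline code and data checks ship through one reviewed path.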
Requirements
8+ years of experience in Data Engineering or Analytics Engineering
3+ years of hands-on experience designing and implementing large-scale data and ML pipelines
Prior experience working with cloud-native data solutions on AWS, GCP, or Azure
Proficiency in SQL
Proficiency in at least one data engineering language (Python/Scala)
Experience in ML feature engineering and collaboration with ML Engineers and Data Scientists (see the feature-engineering sketch after this list)
Strong working experience with the modern data engineering stack (Spark, dbt, Feast, Airflow, MLflow, containers, Iceberg, etc.)
Excellent communication skills
Strong understanding of data management concepts (modeling, warehousing, governance)
Hands-on experience with relational (PostgreSQL) and columnar (Redshift, etc.) databases
Experience with source control (Git), Infrastructure as Code (Terraform), and CI/CD for data workflows
Solid problem-solving and troubleshooting capabilities
Demonstrated ability to embrace AI and apply it to your role
Nice to have: Experience with event-driven architectures (Kafka, Amazon EventBridge/SQS, Kinesis)
Nice to have: Understanding of Data Mesh principles and decentralized data ownership
Nice to have: Proven ability to influence technical direction in large organizations
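As a concrete (and hedged) illustration of the feature-engineering requirement, the PySpark sketch below builds simple per-customer aggregates with a point-in-time cutoff, so features only use data available before the prediction date. All table names, column names, and the cutoff value are hypothetical, and the saved table merely stands in for a feature store such as Feast.

```python
# Hedged sketch: point-in-time-safe customer features in PySpark.
# Table/column names and the cutoff date are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("feature-sketch").getOrCreate()

orders = spark.read.table("raw.orders")  # hypothetical source table

cutoff = "2024-06-01"  # features may only see data before this date
features = (
    orders
    .where(F.col("order_ts") < F.lit(cutoff))  # prevents label leakage
    .groupBy("customer_id")
    .agg(
        F.count("*").alias("order_count"),
        F.avg("order_total").alias("avg_order_value"),
        F.max("order_ts").alias("last_order_ts"),
    )
)

# Stand-in for publishing to a feature store (e.g., Feast's offline store).
features.write.mode("overwrite").saveAsTable("features.customer_orders")
```

The cutoff filter is the line worth reviewing with Data Scientists: it keeps offline training features consistent with what would have been known online at prediction time.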
Benefits
All Guidewire employees in full-time positions, or in part-time roles working 30 or more hours a week, are eligible for benefits that support their health and well-being, including health, dental, and vision insurance, paid time off, and a company-sponsored retirement plan.
In addition, some roles may be eligible for the annual company bonus plan, commissions, and/or long-term incentive awards, which are contingent on a variety of factors including, but not limited to, company and employee performance.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Data Engineering, Analytics Engineering, SQL, Python, Scala, ML feature engineering, Data modeling, Data warehousing, Data governance, Problem-solving
Soft skills
Excellent communication, Mentoring, Collaboration, Influencing technical direction