Swiss Re

Data Engineer

Employment Type: Full-time

Location Type: Hybrid

Location: Bangalore, India

About the role

  • Design and build robust, scalable data pipelines (batch and streaming) that transform complex insurance data into actionable insights
  • Engineer and maintain analytics-ready data products with clear contracts and comprehensive documentation
  • Ensure high data quality, observability, lineage, and reliability across all data pipelines
  • Own end-to-end delivery of data engineering initiatives from design through production implementation
  • Break down complex requirements into executable technical plans and deliver against aggressive timelines
  • Proactively identify and resolve bottlenecks, technical debt, and operational risks before they impact business operations
  • Establish and enforce engineering standards including coding practices, testing protocols, CI/CD pipelines, and data modeling approaches
  • Drive reusability and simplification across the data ecosystem to reduce fragmentation and technical debt
  • Collaborate with analytics, data science, underwriting, and business teams to translate needs into scalable solutions
  • Communicate effectively on trade-offs, timelines, and risks to technical and non-technical stakeholders
  • Provide technical guidance and constructive code/design reviews to help elevate team capabilities

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, or related technical field
  • 8+ years of professional experience in data engineering, with at least 5 years specifically in the insurance or financial services industry
  • Advanced expertise in PySpark for large-scale data processing, including performance optimization and best practices
  • Strong proficiency in TypeScript for developing robust, type-safe applications and data services
  • Experience with Palantir platforms and tools for data integration and analytics workflows
  • Extensive experience designing, implementing, and maintaining complex ETL/ELT pipelines in cloud environments (preferably AWS or Azure)
  • Proven track record of implementing data governance, quality, and lineage solutions at enterprise scale
  • Deep knowledge of data modeling techniques and best practices for both analytical and operational data stores

Benefits

  • Hybrid work model, with the expectation that you will be in the office at least three days per week

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, PySpark, TypeScript, ETL, ELT, data modeling, data governance, performance optimization, cloud environments, analytics-ready data products
Soft Skills
communication, collaboration, problem-solving, technical guidance, documentation, risk management, time management, leadership, critical thinking, adaptability
Certifications
Bachelor's degree in Computer Science, Master's degree in Engineering