Travelers

Data Engineer II, Enterprise Analytics

Full-time

Location

🇺🇸 United States


Salary

💰 $123,000 - $203,000 per year

Job Level

Mid-Level, Senior

Tech Stack

Airflow, AWS, Cloud, Docker, DynamoDB, JavaScript, Jenkins, Kafka, MongoDB, OpenShift, Python, SQL, Terraform, Vault

Requirements

  • Bachelor’s Degree in a STEM-related field or equivalent
  • Eight years of related experience
  • Experience with building robust data pipelines and working with large-scale datasets.
  • High proficiency with data tools, techniques, and manipulation, including cloud platforms and programming languages, and a full understanding of modern engineering practices.
  • Proficiency in tools such as AWS, Databricks, Snowflake, Ab Initio, Terraform, etc.
  • Passion for automation, optimization, and delivering high-quality data solutions.
  • The ability to deliver work at a steady, predictable pace to achieve commitments, to deliver complete solutions while releasing them in small batches, and to identify and negotiate important tradeoffs.
  • Demonstrated domain expertise, including an understanding of relevant technical concepts and industry trends, with in-depth knowledge of the immediate systems worked on and some knowledge of adjacent systems.
  • Strong problem solver who ensures systems are built for longevity and creates innovative ways to resolve issues.
  • Strong written and verbal communication skills, with the ability to collaborate well with team members and business partners.
  • Ability to lead team members and help create a safe environment for others to learn and grow as engineers.
  • A proven track record of self-motivation in identifying opportunities and tracking team efforts.
  • Experience with some of the following tools & platforms (or similar): AWS (S3, Lambda, Kinesis, API Gateway, IAM, Glue, SNS, SQS, EventBridge, EKS, VPC, Step Functions, ECS/EKS, DynamoDB, etc.), Databricks, Python, JavaScript, Kafka, dbt, Terraform, Snowflake, SQL, Jenkins, GitHub, Airflow, Alation, Secrets Management (HashiCorp Vault, AWS Secrets Manager, or similar), Docker / OpenShift / Open Cloud Foundry, MongoDB, SonarQube
  • Knowledge and experience with some of the following concepts: Real-time & Batch Data Processing, Workload Orchestration, Cloud, Data Lakes, Data Security, Networking, Serverless, Testing/Test Automation (Unit, Integration, Performance, etc.), DevOps, Logging, Monitoring, and Alerting, Containerization, Encryption/Decryption, Data Masking, Cost & Performance Optimization

What is a Must Have?

  • Bachelor’s degree or equivalent training with data tools, techniques, and manipulation.
  • Four years of data engineering or equivalent experience.