Samsara

Manager, Data and Integrations

full-time

Location: 🇮🇳 India

Job Level

Senior, Lead

Tech Stack

AWS, Azure, Cloud, ETL, Google Cloud Platform, IoT, Python, SDLC, SQL

About the role

  • Lead the data engineering team by driving the design, build, testing, and launch of new data pipelines, data models, and predictive models on our production data platform
  • Lead a team of data engineers and guide them toward the data strategies that best fit the organization's data needs
  • Identify, design, and implement process improvements including building/re-engineering data models, data architectures, pipelines, and data applications
  • Oversee data management, governance, security, and analysis, and continuously look for opportunities to optimize data processes
  • Hire, mentor, and grow the team; offer technical guidance and leadership on planning, designing, and implementing data solutions
  • Manage data delivery through high-performing dashboards, visualizations, and reports
  • Ensure data quality and security across every product vertical and related areas
  • Design, create, and launch new data models and pipelines as needed
  • Act as a project manager for data projects and ensure test-driven, maintainable, reusable pipelines
  • Work towards achieving high performance, operational excellence, accuracy, and reliability of the overall system
  • Design and build infrastructure for extraction, transformation, and loading of data from a wide range of data sources
  • Build and maintain data foundations including tools, infrastructure, and pipelines to support marketing and sales teams
  • Increase automation and build analytic solutions at scale to serve business requirements
  • This is a hybrid role requiring 2 days per week in the Bangalore office and 3 days remote; availability during East Coast hours and on-call is required

Requirements

  • 10+ years of experience managing, designing, and maintaining large-scale data and integration systems
  • Bachelor's degree in Computer Science, Math, Statistics, Information Systems, Informatics, Analytics, or another quantitative field
  • Hands-on experience analyzing, architecting/re-architecting data and analytics platforms on Azure/AWS/GCP
  • Experience designing and architecting data lakes on cloud platforms and BI tools/solution design
  • Experience developing real-time application integration for operational data using tools such as Workato, MuleSoft, or other relevant integration tools
  • Experience developing or managing REST APIs
  • Programming knowledge and hands-on experience in Python
  • Working knowledge of the full software development life cycle, including planning, coding standards, testing methods, build processes, and source control management
  • Experience offering technical leadership and guiding teams for data engineering best practices
  • Strong knowledge of AWS tools and technologies
  • (Ideal) 10+ years of experience with very large-scale data warehouse projects
  • (Ideal) Experience working with APIs to collect or ingest data and building APIs
  • (Ideal) Strong working knowledge of AWS Databricks, DBT and integration tools such as Workato
  • (Ideal) Knowledge of data security best practices and experience building secure data solutions
  • (Ideal) Advanced working SQL knowledge and experience with relational databases and query tools
  • (Ideal) Hands-on experience in building data pipelines, platforms, architecture, structures, visualizations, and data modeling
  • (Ideal) Experience optimizing data models and conducting performance engineering for large-scale data
  • (Ideal) Experience with data visualizations and self-service data preparation tools