BETSOL

Data Engineer

full-time

Location Type: Remote

Location: India

About the role

  • Design, build, and maintain scalable data pipelines and ingestion frameworks.
  • Deliver high-value POCs to stabilize and build a strong foundation for an enterprise data platform.
  • Develop custom ingestion and optimize data workflows.
  • Ensure reliable data delivery into Snowflake or other cloud-based platforms.
  • Collaborate with analytics, product, and engineering teams to enable data-driven decision-making.
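Custom ingestion of the kind described above often means normalizing raw records into a staging file that Snowflake's COPY INTO can bulk-load. A minimal, stdlib-only sketch (the field names and skip-incomplete-records rule are hypothetical, for illustration):

```python
import csv
import io
import json

def stage_records(raw_lines):
    """Parse newline-delimited JSON records and render them as CSV,
    the kind of staging file a Snowflake COPY INTO can bulk-load.
    Records missing a required field are skipped (illustrative rule)."""
    required = ("id", "event", "ts")  # assumed schema for this sketch
    rows = [json.loads(line) for line in raw_lines if line.strip()]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=required, extrasaction="ignore")
    writer.writeheader()
    for row in rows:
        if all(k in row for k in required):
            writer.writerow(row)
    return buf.getvalue()

raw = [
    '{"id": 1, "event": "signup", "ts": "2024-01-01T00:00:00Z"}',
    '{"id": 2, "event": "login"}',  # missing "ts" -> skipped
]
print(stage_records(raw))
```

In a real pipeline the CSV would be written to a cloud-storage stage (S3, Azure Blob, or GCS) rather than returned as a string, but the validate-then-serialize step is the same.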

Requirements

  • 5+ years of experience in data engineering or related roles.
  • Successfully delivered multiple high-value POC projects.
  • Proficiency in Python for building data pipelines and automation scripts.
  • Hands-on experience with dbt for data transformation and modeling.
  • Strong expertise in Snowflake or similar cloud data warehouse platforms.
  • Strong expertise building on AWS, Azure, or GCP cloud storage for federated data lakes.
  • Experience coding custom ingestion to load data from diverse sources.
  • Familiarity with modern ETL/ELT tools and best practices.
  • Understanding of cloud-based architecture and data security principles.
  • Experience working in Agile development environments.
  • Strong communication skills for cross-functional collaboration.
  • Nice-to-have: Experience with modern data observability and data catalog tools.
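The dbt transformation-and-modeling experience asked for above typically looks like a staging model: select from a declared source, rename and type columns, and expose a clean relation. A minimal sketch (the `raw.events` source and column names are hypothetical):

```sql
-- models/staging/stg_events.sql (hypothetical model and source names)
with source as (
    select * from {{ source('raw', 'events') }}
),

renamed as (
    select
        id                  as event_id,
        event               as event_name,
        ts::timestamp_ntz   as event_at   -- Snowflake timestamp cast
    from source
)

select * from renamed
```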

Benefits

  • Comprehensive health insurance
  • Competitive salaries
  • Volunteer programs
  • Scholarship opportunities

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, Python, dbt, Snowflake, AWS, Azure, GCP, ETL, ELT, data transformation
Soft Skills
communication, collaboration, decision-making, Agile development