
Data Engineer – Tech Lead
Carelon Global Solutions Philippines
Full-time
Location Type: Office
Location: Bengaluru • India
About the role
- Enforce coding standards, review code, and ensure maintainability.
- Coach developers on technical skills and problem-solving.
- Help team members grow in their technical careers.
- Develop ETL/ELT processes to ingest and transform data from various sources, including MongoDB, APIs, Snowflake, and flat files (JSON, ORC, Avro, Parquet, CSV).
- Implement data loading strategies into Snowflake and other data warehouses.
- Write and optimize complex SQL queries for data extraction, transformation, and analysis across multiple database platforms.
- Work with Product Managers, Architects, and other teams to align technical goals with business needs.
- Leverage AWS services to manage and orchestrate data workflows, ensuring high availability and scalability.
- Implement jobs in AWS EMR, AWS Glue, and AWS Lambda using PySpark.
- Perform performance tuning on Spark jobs and SQL queries to ensure efficient data processing.
- Participate in Agile ceremonies and contribute to sprint planning, story grooming, and retrospectives.
- Collaborate with cross-functional teams including data scientists, analysts, and DevOps engineers to deliver data solutions.
- Ensure data accuracy, consistency, and integrity through validation and quality checks.
- Maintain documentation for data pipelines, schemas, and data flow processes.
- Work with Kafka or similar technologies to build and maintain real-time data streaming solutions.
- Develop and maintain data ingestion processes from external/internal APIs, ensuring secure and reliable data flow.
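The validation and quality-check duties above can be illustrated with a minimal pure-Python sketch. The schema, field names, and helper functions here are hypothetical, invented purely for illustration; a production pipeline would typically express these checks in Spark or a data-quality framework instead.

```python
from typing import Any

# Hypothetical required schema for an ingested record -- illustrative only.
REQUIRED_FIELDS = {"member_id": str, "claim_amount": float}

def validate_record(record: dict[str, Any]) -> list[str]:
    """Return a list of data-quality errors for a single ingested record."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record or record[field] is None:
            errors.append(f"missing:{field}")        # required field absent or null
        elif not isinstance(record[field], expected_type):
            errors.append(f"type:{field}")           # wrong data type
    return errors

def validate_batch(records: list[dict[str, Any]]) -> dict[str, int]:
    """Aggregate error counts across a batch as a simple quality report."""
    counts: dict[str, int] = {}
    for record in records:
        for err in validate_record(record):
            counts[err] = counts.get(err, 0) + 1
    return counts
```

A batch-level error summary like this is one common way to enforce accuracy and consistency gates before loading data into a warehouse such as Snowflake.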
Requirements
- 5+ years’ experience in Spark ecosystem, Python/Scala programming, MongoDB data loads, Snowflake and AWS platform (EMR, Glue, S3)
- 6+ years’ IT experience and good expertise in SDLC/Agile
- 6+ years’ experience in SQL, complex queries, and optimization
- Proven ability to coach developers on technical skills and problem-solving.
- Hands-on experience writing advanced SQL queries, with familiarity across a variety of databases.
- Experience coding solutions in Python/Spark and performing performance tuning and optimization.
- Experience building and optimizing big-data pipelines in the cloud.
- Experience in handling different file formats like JSON, ORC, Avro, Parquet, CSV.
- Hands-on experience in data processing with NoSQL databases such as MongoDB.
- Familiarity with job scheduling.
- Hands-on experience working with APIs to process data.
- Understanding of data streaming technologies such as Kafka.
- Snowflake (SnowPro) and AWS (Cloud Practitioner/Solutions Architect) certifications.
- Hands-on experience implementing Kafka streaming pipelines.
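The flat-file handling skills listed above can be sketched with the Python standard library. The sample CSV data and function names are hypothetical, for illustration only; at scale this work would normally be done with Spark readers for Parquet, ORC, or Avro.

```python
import csv
import io
import json

# Hypothetical sample flat-file data -- illustrative only.
CSV_DATA = "member_id,claim_amount\nA1,10.5\nB2,20.0\n"

def ingest_csv(text: str) -> list[dict]:
    """Parse CSV text into records, casting claim_amount to float."""
    reader = csv.DictReader(io.StringIO(text))
    return [{**row, "claim_amount": float(row["claim_amount"])} for row in reader]

def to_json_lines(records: list[dict]) -> str:
    """Serialize records as newline-delimited JSON, a common staging format."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)
```

Converting heterogeneous source files into a single line-delimited format like this is one simple way to stage data before a warehouse load.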
Benefits
- Extensive focus on learning and development
- An inspiring culture built on innovation, creativity, and freedom
- Holistic well-being
- Comprehensive range of rewards and recognitions
- Competitive health and medical insurance coverage
- Best-in-class amenities and workspaces
- Policies designed with associates at the center
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Spark, Python, Scala, SQL, ETL, ELT, data processing, performance tuning, data ingestion, data transformation
Soft Skills
coaching, problem-solving, collaboration, communication, teamwork, agile methodologies, leadership, documentation, validation, quality assurance
Certifications
SnowPro Certification, AWS Cloud Practitioner, AWS Solutions Architect