Tech Stack
Airflow, Amazon Redshift, AWS, Cloud, EC2, ETL, NoSQL, PySpark, Python, Spark, SQL
About the role
- Job location: Remote in India
- We are seeking a Data Engineer with expertise in data infrastructure, pipeline development, and scalable data solutions to play a pivotal role in enabling data-informed decision-making across the organization.
- The successful candidate will have deep knowledge of data architecture and engineering best practices and will work closely with cross-functional teams to ensure data is accurate, reliable, and accessible.
- This role requires strong technical skills, a solid grasp of business needs, and the ability to bridge the gap between raw data and actionable insights through robust engineering solutions.
Requirements
- Bachelor's degree in Computer Science or a related quantitative field; relevant professional experience may be considered in lieu of formal education.
- Minimum 2 years of experience working as a Data Engineer.
- Strong proficiency in Python, SQL, Spark, and EC2, with hands-on experience in the AWS ecosystem.
- Practical experience with AWS Glue jobs written in PySpark, Lambda functions, NoSQL databases, job orchestration using Airflow, and/or managing Redshift databases is a strong plus.
- Detail-oriented with a keen interest in examining data transformations and their impact on business outcomes.
- Excellent problem-solving and time management skills.
- Flexible and able to work effectively in a fast-paced, dynamic environment.
- Prior experience in project or team management is preferred; aptitude for mentoring and assisting others is a plus.