Salary
💰 $165,000 - $270,900 per year
Tech Stack
Apache, AWS, Azure, Cloud, ETL, Google Cloud Platform, IoT, Kafka, Python, Spark, SQL
About the role
- Architect and develop cloud data warehouse solutions on Snowflake, integrated with Databricks.
- Design and implement robust data pipelines to ingest, process, and store IoT data from diverse sources.
- Integrate and manage large-scale datasets using Apache Iceberg for efficient, reliable, and scalable data lake operations.
- Design and build an Operational Data Store (ODS) to function as a speed layer, enabling rapid access for near-real-time use cases.
- Collaborate with data engineers, analysts, and business stakeholders to understand requirements and deliver scalable solutions.
- Optimize data models, storage, and compute resources for performance and cost efficiency.
- Ensure data quality, security, and compliance across all cloud data platforms.
- Develop and maintain documentation for architecture, processes, and best practices.
- Set up monitoring tools and dashboards to track pipeline health, diagnose issues, and optimize performance.
- Mentor and support junior engineers through guidance, coaching, and learning opportunities.
- Stay current with industry trends and best practices in data management, ODS technologies, and API development.
Requirements
- Bachelor’s degree in Software Engineering, Computer Science, Information Technology, or a related field, or equivalent experience.
- Proven experience architecting and implementing cloud data warehouses using Snowflake or a comparable data platform.
- Hands-on experience with Apache Spark and/or Apache Iceberg for data lake pipelines.
- Experience designing and building Operational Data Stores (ODS) as speed layers for analytic environments.
- Experience with IoT data ingestion, processing, and analytics.
- Strong proficiency in SQL, Python, and ETL tools.
- Proven cloud experience and strong familiarity with at least one cloud platform (Microsoft Azure preferred; AWS, GCP).
- Strong problem-solving and analytical skills, along with excellent interpersonal and communication skills.
- Experience with data visualization tools preferred.
- Experience in proactive issue detection leveraging anomaly detection techniques.
- Good understanding and experience with CI/CD practices.
- Ability to prioritize and manage multiple tasks and projects.