Tech Stack
Airflow, ETL, Java, Python, Scala, Tableau, Terraform
About the role
- Design and implement scalable and efficient data models and database structures within Snowflake (tables, views, materialized views)
- Develop, manage, and optimize ETL/ELT data pipelines to ingest, transform, and load data into Snowflake using Snowpipe, streams, tasks, and third-party integrators (see the pipeline sketch after this list)
- Monitor, troubleshoot, and tune Snowflake queries and virtual warehouses for speed, concurrency, and resource utilization
- Implement and enforce data governance and security protocols, including RBAC, data masking, and encryption (see the RBAC and masking sketch after this list)
- Manage and administer the Snowflake account with a focus on cost optimization and credit consumption monitoring (see the resource monitor sketch after this list)
- Establish and maintain secure data sharing with internal and external partners using Snowflake's native sharing features (see the data sharing sketch after this list)
- Automate data management tasks and database deployments using CI/CD practices and tools like dbt or schemachange (see the migration script sketch after this list)
- Write and maintain clear and comprehensive technical documentation
- Collaborate with cross-functional teams and contribute to improving data handling across the organization
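For illustration, a minimal ingestion pipeline sketch of the kind this role covers, using Snowpipe for continuous loading and a stream-plus-task pair for incremental transformation. All object names here are hypothetical:

    -- Continuous ingestion: Snowpipe loads files from a stage as they arrive
    CREATE OR REPLACE PIPE raw.orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw.orders
      FROM @raw.orders_stage
      FILE_FORMAT = (TYPE = 'JSON');

    -- Change tracking: a stream records new rows landing in raw.orders
    CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

    -- Incremental transform: a scheduled task drains the stream only when it has data
    CREATE OR REPLACE TASK analytics.load_orders
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
    AS
      INSERT INTO analytics.orders_clean (order_id, customer_id, amount)
      SELECT order_id, customer_id, amount
      FROM raw.orders_stream
      WHERE metadata$action = 'INSERT';

    ALTER TASK analytics.load_orders RESUME;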
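In the same spirit, a sketch of the RBAC and dynamic data masking duties, assuming a hypothetical pii_reader role and customers table:

    -- RBAC: route read access through a dedicated role
    CREATE ROLE IF NOT EXISTS pii_reader;
    GRANT USAGE ON DATABASE analytics TO ROLE pii_reader;
    GRANT USAGE ON SCHEMA analytics.public TO ROLE pii_reader;
    GRANT SELECT ON TABLE analytics.public.customers TO ROLE pii_reader;

    -- Dynamic data masking: reveal email only to the privileged role
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() = 'PII_READER' THEN val ELSE '*** MASKED ***' END;

    ALTER TABLE analytics.public.customers
      MODIFY COLUMN email SET MASKING POLICY email_mask;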
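For cost control, one common pattern pairs a resource monitor with a periodic review of credit consumption in ACCOUNT_USAGE; the quota and warehouse names below are placeholders:

    -- Cap monthly spend: notify at 90% and suspend the warehouse at 100%
    CREATE OR REPLACE RESOURCE MONITOR etl_monitor
      WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 90 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monitor;

    -- Review where credits went over the last 30 days
    SELECT warehouse_name, SUM(credits_used) AS credits
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits DESC;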
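Secure data sharing with a partner reduces to a handful of grants; the consumer account identifier below is hypothetical:

    -- Share a read-only view with an external consumer account
    CREATE SHARE IF NOT EXISTS sales_share;
    GRANT USAGE ON DATABASE analytics TO SHARE sales_share;
    GRANT USAGE ON SCHEMA analytics.public TO SHARE sales_share;
    GRANT SELECT ON VIEW analytics.public.sales_summary TO SHARE sales_share;
    ALTER SHARE sales_share ADD ACCOUNTS = partner_org.partner_account;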
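On the CI/CD side, schemachange applies versioned SQL scripts picked up by a filename convention, so a deployment is a pull request rather than a hand-run statement. A migration file in the repo might look like this (table and columns are illustrative):

    -- V1.1.0__create_orders_clean.sql (schemachange runs scripts in version order by this naming convention)
    CREATE TABLE IF NOT EXISTS analytics.orders_clean (
      order_id    NUMBER        NOT NULL,
      customer_id NUMBER,
      amount      NUMBER(12,2),
      loaded_at   TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
    );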
Requirements
- Deep understanding of Snowflake architecture (virtual warehouses, caching, storage)
- Strong experience designing analytical data models (star, snowflake, etc.)
- Solid background in ETL/ELT design, query tuning, and performance optimization
- Hands-on experience with Snowflake features: Snowpipe, streams, tasks, materialized views
- Experience implementing RBAC, data masking, encryption, and data security best practices
- Experience managing and optimizing Snowflake cost and virtual warehouse usage
- Experience with CI/CD for database deployments and tools like dbt or schemachange
- Degree in Computer Science (Bachelor's or Master's)
- Experience with orchestration tools (dbt, Airflow, Dagster) - nice to have
- Knowledge of Snowpark (Python/Java/Scala) - nice to have
- Familiarity with BI platforms (Power BI, Tableau, Qlik, MicroStrategy) - nice to have
- Exposure to Terraform or other IaC tools for Snowflake provisioning - nice to have
- Understanding of advanced Snowflake features (Streams, Dynamic Tables, Unistore)
- Experience with semi-structured data formats (JSON, Parquet, Avro); see the query sketch after this list
- Snowflake certifications (SnowPro Core / Advanced) - nice to have
- Experience with data quality and observability tools (Monte Carlo, Great Expectations) - nice to have
- Background in Agile/Scrum environments
- Strong communication and collaboration skills (English: Upper-Intermediate or higher)
- Analytical mindset; adaptable, proactive, and able to work both independently and in cross-functional teams
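As a taste of the semi-structured work listed above, a minimal sketch of querying JSON held in a VARIANT column with LATERAL FLATTEN; the table and field names are illustrative:

    -- Pull typed fields out of a VARIANT column and explode a nested array
    SELECT
      e.payload:customer.name::STRING AS customer_name,
      i.value:sku::STRING             AS sku,
      i.value:qty::NUMBER             AS qty
    FROM raw.events e,
         LATERAL FLATTEN(input => e.payload:items) i;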