Tech Stack
Airflow, AWS, Azure, Cloud, ETL, Google Cloud Platform, Informatica, Python, SQL
About the role
- Design and build scalable, secure, and cost-effective data architectures in Snowflake, aligned with industry and client best practices
- Implement data pipelines using ETL/ELT tools such as dbt, Python, CloverDX, and cloud-native services
- Conduct technical discovery sessions to understand client goals and translate them into architectural designs and project plans
- Collaborate with pre-sales and account teams to scope work, define solutions, and estimate effort for Statements of Work (SOWs)
- Serve as a Snowflake SME, providing guidance on performance tuning, cost optimization, access control, data sharing, and workload management
- Lead and execute data platform modernization and migration initiatives, moving clients from legacy systems to Snowflake-based solutions
- Integrate Snowflake environments with BI tools, data governance platforms, and AI/ML frameworks
- Contribute to internal IP, including templates, frameworks, accelerators, and proofs of concept (POCs)
- Support delivery engagements by implementing scalable solutions and continuously improving architectures based on business needs
- Mentor junior team members and contribute to ongoing knowledge sharing across the engineering team
- This is a hybrid pre-sales and delivery position with a client-facing, consultative focus
Requirements
- 7+ years of experience in data engineering, data warehousing, or data architecture roles
- 3+ years of hands-on Snowflake experience including advanced features such as performance tuning, access control, data sharing, Snowpark, or Snowpipe
- Strong experience with SQL, Python, and dbt in production environments
- Proficiency in cloud infrastructure such as AWS, Azure, or GCP
- Experience with modern data tooling such as Airflow, Fivetran, Power BI, Looker, or Informatica
- Demonstrated success in client-facing delivery roles or solution architecture in consulting or services environments
- Deep understanding of cloud-native architectures, data modeling, and data pipeline orchestration
- Experience with streaming data, unstructured data, or real-time analytics is a plus
- Excellent verbal and written communication skills and the ability to present to both technical and executive stakeholders
- Proven ability to influence client architecture decisions and lead data modernization initiatives
- SnowPro Core Certification (Required)
- One or more SnowPro Advanced Certifications such as Data Engineer, Architect, Snowpark, or Data Analyst
- Additional cloud or data platform certifications such as AWS, Azure, dbt, or Databricks are a plus