Tech Stack
Airflow, AWS, Azure, Cloud, ETL, Google Cloud Platform, Informatica, Python, SQL
About the role
Data Ideology is seeking a Snowflake Data Engineer to support our growing partnership with Snowflake in a hybrid pre-sales and delivery role. This is a client-facing, consultative position that bridges the gap between technical strategy and practical implementation.
- Design and build robust, scalable, and secure data architectures in Snowflake, aligned with industry and client best practices.
- Implement ETL/ELT pipelines using tools such as dbt, Python, CloverDX, and cloud-native services.
- Conduct technical discovery sessions and assessments to understand client requirements and translate them into solution designs.
- Collaborate with pre-sales and account teams to scope work, define solutions, and estimate effort for Statements of Work (SOWs).
- Serve as a Snowflake SME, advising on architecture, performance tuning, cost optimization, security, and workload design.
- Partner with clients to modernize legacy data platforms and migrate to Snowflake-based data ecosystems.
- Integrate with data visualization, governance, and AI/ML frameworks to support advanced analytics and business intelligence use cases.
- Contribute to reusable assets such as templates, frameworks, reference architectures, and POCs.
- Lead technical delivery for client engagements, supporting both implementation and ongoing solution refinement.
- Mentor junior engineers and participate in internal knowledge-sharing and technical upskilling initiatives.
This is a remote, work-from-home position. Work days are generally Monday through Friday; specific business hours will depend on client needs.
Requirements
- 7+ years of experience in data engineering, data warehousing, or data architecture roles.
- 3+ years of hands-on Snowflake experience, including advanced features such as performance tuning, access control, data sharing, Snowpark, or Snowpipe.
- Strong experience with SQL, Python, and dbt in production environments.
- Proficiency in cloud infrastructure such as AWS, Azure, or GCP, and modern data tooling such as Airflow, Fivetran, Power BI, Looker, or Informatica.
- Demonstrated success in client-facing delivery or solution architecture roles in consulting or services environments.
- Deep understanding of cloud-native architectures, data modeling, and data pipeline orchestration.
- Experience with streaming data, unstructured data, or real-time analytics is a plus.
- Excellent verbal and written communication skills, with the ability to present to both technical and executive stakeholders.
- Proven ability to influence client architecture decisions and lead data modernization initiatives.
- SnowPro Core Certification (required).
- One or more SnowPro Advanced Certifications, such as Data Engineer, Architect, Snowpark, or Data Analyst.
- Additional cloud or data platform certifications, such as AWS, Azure, dbt, or Databricks, are a plus.