Snowflake

Specialist Architect, Data Engineering/ML

Full-time

Origin: 🇺🇸 United States

Salary

💰 $134,000 - $187,400 per year

Job Level

Mid-Level, Senior

Tech Stack

Apache, AWS, Azure, Cloud, ETL, Google Cloud Platform, Java, Pandas, Python, PyTorch, RDBMS, Scala, Scikit-Learn, Spark, SQL, TensorFlow

About the role

  • Be a technical expert on all aspects of Snowflake in relation to data engineering workloads.
  • Work with Professional Services Practice Directors and Managers on sales pursuits to understand customer requirements, present high-level architecture solutions using Snowflake, and scope project plans and effort estimates to deliver data engineering solutions.
  • Consult with customers in sales workshops using SQL, Python, Java and/or Scala to understand their data engineering workload/use case.
  • Understand best practices related to Snowflake data engineering capabilities.
  • Maintain deep understanding of competitive and complementary technologies and vendors within the data engineering space, and how to position Snowflake in relation to them.
  • Understand partner system relationships to properly position them in the scoping and delivery of data engineering use cases.
  • Provide guidance on how to resolve customer-specific technical challenges.
  • Assist in writing Statements of Work.
  • Identify selling patterns within the data engineering space and create go-to-market options for PS Sellers to leverage.
  • Enable PS Sellers with technical knowledge to scale our ability to sell data engineering solutions.
  • Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing.
  • Follow company confidentiality and security standards for handling sensitive data and keep customer information secure and confidential.

Requirements

  • Minimum 5 years of experience working with customers in a pre-sales or post-sales technical role.
  • Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos
  • Thorough understanding of the complete data engineering lifecycle
  • Experience and understanding of at least one public cloud platform (AWS, Azure or GCP)
  • Experience with Databricks/Apache Spark
  • Hands-on scripting experience with SQL and at least one of the following: Python, Java or Scala.
  • Experience with libraries such as Pandas, PyTorch, TensorFlow, Scikit-Learn or similar.
  • University degree in data science, computer science, engineering, mathematics or related fields, or equivalent experience
  • Bonus: Experience with Snowflake Snowpark
  • Bonus: Experience with SAS (Statistical Analysis System)
  • Bonus: Experience with Apache Nifi
  • Bonus: Experience with dbt
  • Bonus: Experience in Data Science
  • Bonus: Experience implementing data pipelines using ETL tools
  • Bonus: Experience working with RDBMS data warehouses
  • Bonus: Proven success in enterprise software
  • Bonus: Vertical expertise in a core vertical such as FSI, Retail or Manufacturing