Snowflake

Principal AI/ML Solution Engineer

Full-time

Location Type: Remote

Location: Arizona, California, United States

Salary

💰 $189,000 - $248,062 per year

About the role

  • Apply your multi-cloud data architecture and AI/ML expertise while presenting Snowflake technology and vision to executives and technical contributors at strategic prospects, customers, and partners
  • Work hands-on with prospects and customers to demonstrate and communicate the value of Snowflake technology throughout the sales cycle, from demo to proof of concept to design and implementation, in support of our customer-aligned solution engineers
  • Immerse yourself in the ever-evolving industry, maintaining a deep understanding of competitive and complementary technologies and vendors, and how to position Snowflake in relation to them
  • Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing around AI/ML features and capabilities

Requirements

  • 5-10+ years of data engineering experience in the enterprise data space
  • 5+ years of experience working with AI/ML technologies
  • Outstanding presentation skills for both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos
  • Ability to connect a customer’s specific business problems with Snowflake’s solutions
  • Ability to conduct deep discovery of a customer’s architecture framework and map it to Snowflake’s data architecture
  • Broad experience with large-scale database and/or data warehouse technologies, ETL, analytics, and cloud technologies (for example, data lake, data mesh, and data fabric architectures)
  • Hands-on development experience with technologies such as SQL, Python, Pandas, Spark, PySpark, Hadoop, Hive, and other big data technologies
  • Deep experience with AI/ML tools, services, and architectures in the enterprise technology space
  • Clear understanding of data integration services and tools for building ETL and ELT data pipelines, such as Apache NiFi, Matillion, Fivetran, Qlik, or Informatica
  • Familiarity with streaming technologies (e.g., Kafka, Flink, Spark Streaming, Kinesis) and real-time or near-real-time use cases (e.g., CDC)
  • Experience designing interoperable data lakehouse architectures and working with Iceberg, Delta, and Parquet
  • Strong architectural expertise in data engineering to confidently present and demo to business executives and technical audiences, and effectively handle any impromptu questions
  • Bachelor’s degree required; computer science, engineering, mathematics, or related fields, or equivalent experience, preferred

Benefits
  • Health insurance
  • 401(k) matching
  • Flexible work hours
  • Paid time off
  • Remote work options

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, AI/ML technologies, SQL, Python, Pandas, Spark, PySpark, Hadoop, Hive, ETL
Soft Skills
presentation skills, communication, customer discovery, problem-solving, collaboration, architectural expertise