DPR Construction

Snowflake Architect – Admin


Full-time

Location Type: Office

Location: Raleigh-Durham • Florida, North Carolina, South Carolina, Virginia • 🇺🇸 United States


Job Level

Senior / Lead

Tech Stack

AWS, Azure, Cloud, ETL, Python, SQL

About the role

  • Design, build, and own the overall data architecture across the Snowflake data platform — including the data lake, data warehouse, and data consumption layers
  • Monitor and optimize Snowflake performance, including query performance tuning, resource allocation, and cost management
  • Develop, optimize, and manage conceptual and logical data architectures and integrations across both internal and external systems
  • Collaborate closely with engineering, data, and analytics teams to deliver business-critical data solutions
  • Drive high-priority data initiatives using Azure/AWS as well as Snowflake and DBT
  • Leverage Snowflake Cortex to enable natural language query experiences, document understanding, and AI-driven insights directly within the Snowflake environment
  • Implement and manage vectorized data pipelines for semantic search and retrieval-augmented generation (RAG) within Snowflake
  • Stay current with evolving Snowflake AI capabilities (Cortex, Snowpark Container Services, Document AI, and Feature Store) and apply them to improve data accessibility and intelligence
  • Design scalable, secure, and high-performance data pipelines to support evolving business needs
  • Partner with strategic customers to understand their vision and ensure future requirements are incorporated into the platform roadmap
  • Participate in all phases of the project lifecycle and lead data architecture initiatives
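The Cortex and RAG responsibilities above can be sketched in Snowflake SQL roughly as follows. The `doc_chunks` table, its columns, and the embedding model are illustrative assumptions, not details from the posting:

```sql
-- Hypothetical table holding pre-split document text (names assumed).
CREATE OR REPLACE TABLE doc_chunks (
    doc_id    STRING,
    chunk     STRING,
    embedding VECTOR(FLOAT, 768)
);

-- Vectorize each chunk with a Cortex embedding function.
UPDATE doc_chunks
SET embedding = SNOWFLAKE.CORTEX.EMBED_TEXT_768('snowflake-arctic-embed-m', chunk);

-- Semantic search: retrieve the chunks closest to a user question,
-- which can then feed a retrieval-augmented generation prompt.
SELECT doc_id, chunk
FROM doc_chunks
ORDER BY VECTOR_COSINE_SIMILARITY(
    embedding,
    SNOWFLAKE.CORTEX.EMBED_TEXT_768('snowflake-arctic-embed-m', 'example user question')
) DESC
LIMIT 5;
```

Keeping both embedding and retrieval inside Snowflake, as this role requires, avoids moving data out of the governed platform.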

Requirements

  • 10+ years of experience in data architecture and engineering
  • Hands-on experience with secure and scalable enterprise data architectures using Microsoft Azure or AWS
  • Deep knowledge of Snowflake and DBT, with experience building robust data ingestion and ETL/ELT pipelines
  • Experience in designing data structures for data lakes and cloud data warehouses to support analytics and reporting
  • Hands-on experience with Snowflake Cortex, Snowpark ML, or Snowflake’s AI/ML features for model training, deployment, or inference
  • Understanding of vector embeddings, model governance, and prompt-driven analytics within Snowflake
  • Strong proficiency in SQL, Python, and Git, and experience working with Snowpark DataFrames and UDFs for AI model integration
  • Familiarity with agile methodologies and experience working closely with cross-functional teams to manage technical backlogs
  • Skilled in orchestrating and automating data pipelines within a DevOps framework
  • Strong communicator with the ability to present ideas clearly and influence stakeholders
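The performance-tuning and cost-management duties listed earlier might be backed by controls like the following; the monitor name, credit quota, and warehouse name are assumptions for illustration only:

```sql
-- Hypothetical monitor and budget; adjust to the account's actual warehouses.
CREATE OR REPLACE RESOURCE MONITOR analytics_rm
  WITH CREDIT_QUOTA = 100               -- monthly credit budget (assumed)
  TRIGGERS ON 80 PERCENT DO NOTIFY      -- warn as spend approaches the quota
           ON 100 PERCENT DO SUSPEND;   -- suspend the warehouse at the cap

-- Attach the monitor and bound runaway queries on the warehouse.
ALTER WAREHOUSE analytics_wh SET
  RESOURCE_MONITOR = analytics_rm
  STATEMENT_TIMEOUT_IN_SECONDS = 3600;
```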

Benefits
  • Health insurance
  • Professional development opportunities

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data architecture, data engineering, Snowflake, DBT, ETL, ELT, SQL, Python, vector embeddings, data pipelines
Soft skills
communication, collaboration, influence, leadership, problem-solving, project management, stakeholder engagement, agile methodologies, cross-functional teamwork, technical backlog management