
Principal Data Engineer
Brillio
Full-time
Location Type: Hybrid
Location: San Francisco • California • United States
Salary
💰 $140,000 - $150,000 per year
About the role
- Design and implement scalable Snowflake data architectures to support enterprise data warehousing and analytics needs
- Optimize Snowflake performance through advanced tuning, warehousing strategies, and efficient data sharing solutions
- Develop robust data pipelines using Python and DBT, including modeling, testing, macros, and snapshot management
- Implement and enforce security best practices such as RBAC, data masking, and row-level security across cloud data platforms
- Architect and manage AWS-based data solutions leveraging S3, Redshift, Lambda, Glue, EC2, and IAM for secure and reliable data operations
- Orchestrate and monitor complex data workflows using Apache Airflow, including DAG design, operator configuration, and scheduling
- Utilize version control systems such as Git to manage codebase and facilitate collaborative data engineering workflows
- Integrate and process high-volume data using Apache ecosystem tools such as Spark, Kafka, and Hive, with an understanding of Hadoop environments
Requirements
- 12 - 15 years of experience, including significant hands-on expertise in Snowflake data architecture and data engineering
- Advanced hands-on experience with Snowflake, including performance tuning and warehousing strategies
- Expertise in Snowflake security features such as RBAC, data masking, and row-level security
- Proficiency in advanced Python programming for data engineering tasks
- In-depth knowledge of DBT for data modeling, testing, macros, and snapshot management
- Strong experience with AWS services including S3, Redshift, Lambda, Glue, EC2, and IAM
- Extensive experience designing and managing Apache Airflow DAGs and scheduling workflows
- Proficiency in version control using Git for collaborative development
- Hands-on experience with Apache Spark, Kafka, and Hive
- Solid understanding of Hadoop ecosystem
- Expertise in SQL (basic and advanced), including SnowSQL, PL/SQL, and T-SQL
- Strong requirement-gathering, presentation, and documentation skills; ability to translate business needs into clear, structured functional/technical documents and present them effectively to stakeholders
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Snowflake, Python, DBT, Apache Airflow, Apache Spark, Kafka, Hive, SQL, SnowSQL, PLSQL
Soft Skills
presentation skills, documentation skills, requirement understanding, stakeholder communication