Salary
💰 $42 - $60 per hour
Tech Stack
AWS, Azure, C++, Cloud, Distributed Systems, Hadoop, Java, Open Source, Spark, Spring, SQL
About the role
- Where Data Does More. Join the Snowflake team.
- Snowflake started with a clear vision: develop a cloud data platform that is effective, affordable, and accessible to all data users. The result is an innovative product with a built-for-the-cloud architecture that combines the power of data warehousing, the flexibility of big data platforms, and the elasticity of the cloud at a fraction of the cost of traditional solutions. We are now a global, world-class organization with offices in more than a dozen countries and customers in many more.
- We’re looking for dedicated students who share our passion for ground-breaking technology and want to build a lasting future for themselves and for Snowflake.
- What You Will Learn/Gain:
- How to build enterprise-grade, reliable, and trustworthy software/services
- Exposure to SQL or other database technologies (e.g., Spark, Hadoop)
- Understanding of database internals, large-scale data processing, transaction processing, distributed systems, and data warehouse design
- Implementation and testing of features in query compilation, compiler design, and query execution
- Experience working with cloud infrastructure, in particular AWS, Azure, and/or Google Cloud
- Learning about cutting-edge database technology and research
Requirements
- Must be actively enrolled in an accredited college/university program during the time of the internship
- Desired class level: 3rd/4th year Undergraduates, Masters, or PhD
- Desired majors: Computer Science, Computer Engineering, Electrical Engineering, Physics, Math, or related field
- Required coursework: algorithms, data structures, object-oriented programming
- Recommended coursework: cloud computing, compilers, database systems, distributed systems, operating systems, cryptography & authentication, networking
- Bonus experience: research or publications in databases or distributed systems, and contributions to open source
- Experience working with big data (engineering/processing) and data migration
- When: Spring 2026
- Eligible start date options: January 5, January 20
- Eligible end date options: March 27, April 10, April 24
- Duration: 12-week minimum, 16 weeks recommended (12-month maximum)
- Excellent programming skills in C++ or Java
- Preferred: knowledge of C++17/C++20 or Java 17/Java 20
- Knowledge of data structures and algorithms
- Systems programming skills including multi-threading, concurrency, etc.
- Strong problem-solving skills and the ability to learn quickly in a dynamic environment
- Experience working as part of a team
- Dedication and passion for technology