About the role
- Build and maintain data pipelines that connect different data sources
- Help design foundational datasets that are trusted, well-documented, and easy to use
- Optimize pipelines for performance and scalability as Figma grows
- Partner with cross-functional teammates to understand data needs and deliver solutions
- Contribute to the tools and practices that make working with data at Figma easier
- Develop new ETL jobs to integrate product or business data into our warehouse
- Improve the performance and reliability of core pipelines
- Build reusable datasets that support experimentation, product and business analytics, or cross-functional stakeholders
- Document and simplify complex data workflows for broader use
Requirements
- Experience coding in Python (or a similar language) and writing SQL queries
- Experience building or working with data pipelines, ETL jobs, or large datasets
- Curiosity about how data systems work and a drive to make them more scalable and reliable
- Strong communication skills and enjoyment of collaborating with others
- Excitement to learn, receive feedback, and grow as a data engineer
- The internship will be based out of our San Francisco or New York hub
- Candidates are required to keep cameras on during video interviews and to attend in-person onboarding
- Hourly base pay rate: $44.71 USD
- Housing stipend
- Travel reimbursement
ATS Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, SQL, ETL, data pipelines, data integration, data warehousing, performance optimization, scalability, data documentation, data workflows
Soft skills
communication, collaboration, curiosity, adaptability, feedback receptiveness, problem-solving, teamwork, organizational skills, attention to detail, growth mindset