Design, implement, optimize, and productionize data pipelines in Snowflake.
Assemble large, complex data sets that meet functional / non-functional business requirements.
Create and maintain datasets that support business needs and products.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
Implement processes that improve quality, consistency, and reliability across the various pipelines (monitoring, retries, failure detection).
Requirements
MS or BS in CS, Engineering, Math, Statistics, or a related field, or equivalent practical experience in data engineering.
Proven track record in a data engineering or software engineering environment where you have developed and deployed software and pipelines.
3-5 years of experience working in data engineering using Snowflake.
2-4 years of experience in data engineering using Python or another programming language commonly used for data engineering (Scala, Go, R, Java, etc.).
Proven SQL skills.
Experience with Snowflake as a data warehousing platform.
Familiarity with data transformation and pipelining tools such as Airflow, DBT, Spark, and Pandas.
Cloud experience: proficient in AWS, with expertise in data and analytics services such as Redshift, Kinesis, Glue, Step Functions, SageMaker, RDS, etc.
Ability to build processes and infrastructure to manage the lifecycle of datasets: data structures, metadata, dependency management, and workload management.
Experience working in an Agile environment, or openness to adopting this culture.
Excellent English communication skills.
Extra
Experience with technologies like Kubeflow, EKS, and Docker.
Experience with stream-processing systems: Kafka, Storm, Spark Streaming, etc.
Statistical analysis and modeling experience.
Experience with machine learning algorithms.
Data-driven approach to problem solving.
The ability to visualize and communicate complex concepts.
Benefits
The chance to work on innovative projects with leading brands that use the latest technologies to fuel transformation.
The opportunity to be part of an amazing, multicultural community of tech experts.
The opportunity to grow and develop your career with the company.
A flexible and remote working environment.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data pipeline design, data engineering, Snowflake, Python, SQL, data transformation, data warehousing, statistical analysis, machine learning, data-driven problem solving
Soft skills
communication skills, process improvement, problem solving, collaboration, adaptability