Salary
💰 $180,000 - $212,000 per year
Tech Stack
Airflow, AWS, Cloud, Docker, ETL, Google Cloud Platform, Kubernetes, Python, SQL, Tableau
About the role
- Develop and maintain foundational data models that serve as the single source of truth for analytics across the organization.
- Transform raw data into actionable insights through robust pipelines, well-designed data models, dashboards, and tools.
- Partner with engineering, data science, product, and business teams to ensure alignment on priorities and data solutions.
- Build frameworks, tools, and workflows that maximize efficiency for data users while maintaining high standards of data quality and performance.
- Deliver outcome-focused solutions using modern development and analytics tools, ensuring long-term maintainability.
- Quickly build subject-matter expertise in specific business areas and data domains, and understand end-to-end data flows.
- Interface with stakeholders to generate commercial value from data, build new data models, and enable downstream teams.
- Take initiative and accountability for identifying and fixing data gaps and issues anywhere in the stack.
Requirements
- Data Modeling Expertise: Strong understanding of best practices for designing modular and reusable data models (e.g., star schemas, snowflake schemas).
- Prompt Design and Engineering: Expertise in prompt engineering and design for LLMs (e.g., GPT), including creating, refining, and optimizing prompts.
- Advanced SQL: Proficiency in advanced SQL techniques for data transformation, querying, and optimization.
- Intermediate to Advanced Python: Expertise in scripting and automation, with experience in OOP and building scalable frameworks.
- Collaboration and Communication: Strong ability to translate technical concepts into business value and manage cross-functional projects.
- Data Pipeline Development: Experience building, maintaining, and optimizing ETL/ELT pipelines using modern tools like dbt, Airflow, or similar.
- Data Visualization: Proficiency in Looker, Tableau, Superset, or Python visualization libraries (Matplotlib, Plotly).
- Development Tools: Familiarity with version control (GitHub), CI/CD, and modern development workflows.
- Data Architecture: Knowledge of modern data lake/warehouse architectures (e.g., Snowflake, Databricks) and transformation frameworks.
- Business Acumen: Ability to understand and address business challenges through analytics engineering.
- Data Savvy: Familiarity with statistics and probability.
- Bonus Skills: Experience with cloud platforms (e.g., AWS, GCP) and familiarity with Docker or Kubernetes.
- Passion for Coinbase's mission and belief in the power of crypto and blockchain technology.
- In-person participation is required throughout the year for team and company-wide offsites.
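The star-schema modeling called out in the requirements can be sketched in miniature: one central fact table joined out to dimension tables, so that each analytics query slices facts by reusable dimensions. Table and column names below are illustrative assumptions, not a schema from the posting.

```python
import sqlite3

# Hypothetical star schema: a central fact table referencing two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE dim_product  (product_id  INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        product_id  INTEGER REFERENCES dim_product(product_id),
        amount      REAL
    );
    INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'APAC');
    INSERT INTO dim_product  VALUES (10, 'spot'), (11, 'derivatives');
    INSERT INTO fact_sales   VALUES (100, 1, 10, 50.0), (101, 2, 11, 75.0);
""")

# Analytics queries join the fact table out to its dimensions and aggregate.
rows = conn.execute("""
    SELECT c.region, p.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_customer c USING (customer_id)
    JOIN dim_product  p USING (product_id)
    GROUP BY c.region, p.category
    ORDER BY c.region
""").fetchall()
print(rows)  # [('APAC', 'derivatives', 75.0), ('EMEA', 'spot', 50.0)]
```

Keeping dimensions in their own tables is what makes the model modular: new facts can reuse `dim_customer` and `dim_product` without redefining them.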
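For the prompt design and engineering bullet, one common pattern is a reusable template with an explicit role, output constraints, and few-shot examples, refined by swapping parameters rather than rewriting prose. The template text and helper name here are invented for illustration, not a Coinbase convention.

```python
from string import Template

# Hypothetical prompt template: role, length constraint, few-shot examples.
PROMPT = Template(
    "You are a $role.\n"
    "Answer in at most $max_words words.\n"
    "Examples:\n$examples\n"
    "Question: $question"
)

def build_prompt(role, question, examples, max_words=50):
    """Render the template; few-shot examples are joined one per line."""
    return PROMPT.substitute(
        role=role,
        max_words=max_words,
        examples="\n".join(f"- {e}" for e in examples),
        question=question,
    )

prompt = build_prompt(
    role="concise data-modeling assistant",
    question="What is a star schema?",
    examples=["Q: What is ETL? A: Extract, transform, load."],
)
print(prompt)
```

Centralizing the template makes iteration measurable: each refinement changes one parameter, so variants can be A/B-tested against the same questions.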
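"Advanced SQL" in this context typically means techniques like window functions, which compute per-group running aggregates without collapsing rows. A minimal sketch, using SQLite's window-function support (SQLite 3.25+, bundled with recent Python builds); the table and data are illustrative.

```python
import sqlite3

# Per-customer running total via a window function.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', '2024-01-01', 10.0),
        ('alice', '2024-01-02', 15.0),
        ('bob',   '2024-01-01', 20.0);
""")
rows = conn.execute("""
    SELECT customer, order_date, amount,
           SUM(amount) OVER (
               PARTITION BY customer ORDER BY order_date
           ) AS running_total
    FROM orders
    ORDER BY customer, order_date
""").fetchall()
for r in rows:
    print(r)
```

Unlike a `GROUP BY`, the window keeps every order row while still exposing the cumulative total, which is the usual building block for retention, funnel, and balance-over-time models.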
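The "OOP and building scalable frameworks" requirement can be made concrete with a small sketch: transformation steps as classes behind a shared interface, composed into a pipeline, so new behavior is added as new step classes rather than edits to existing code. All class and step names below are hypothetical.

```python
from abc import ABC, abstractmethod

class Step(ABC):
    """One pipeline stage: takes a list of records, returns a new list."""
    @abstractmethod
    def run(self, records: list[dict]) -> list[dict]: ...

class DropNulls(Step):
    """Filter out records missing a required field."""
    def __init__(self, field: str):
        self.field = field
    def run(self, records):
        return [r for r in records if r.get(self.field) is not None]

class AddTax(Step):
    """Derive a gross amount from the net amount at a fixed rate."""
    def __init__(self, rate: float):
        self.rate = rate
    def run(self, records):
        return [{**r, "gross": r["amount"] * (1 + self.rate)} for r in records]

class Pipeline:
    """Compose steps in order; extensible by writing new Step subclasses."""
    def __init__(self, *steps: Step):
        self.steps = steps
    def run(self, records):
        for step in self.steps:
            records = step.run(records)
        return records

out = Pipeline(DropNulls("amount"), AddTax(0.5)).run(
    [{"amount": 100.0}, {"amount": None}]
)
print(out)  # [{'amount': 100.0, 'gross': 150.0}]
```

The same shape scales to real ETL/ELT work: orchestration tools like Airflow then schedule pipelines built from such steps, rather than replacing the abstraction.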