Aimpoint Digital

Senior Analytics Engineer

Full-time

Location Type: Remote

Location: United States

About the role

  • Become a trusted advisor to our clients, partnering with data owners, analysts, business users, and executive stakeholders to translate business needs into scalable analytics solutions
  • Work independently as part of a small team to solve complex analytics engineering use-cases across a variety of industries
  • Design and develop the analytical layer, including curated data models, semantic layers, metrics definitions, and transformation pipelines that power reporting, dashboards, self-service analytics, and AI use cases
  • Work with modern tools such as Snowflake, Databricks, Fivetran, and dbt and credentialize your skills with certifications
  • Write high-quality code in SQL, Python, and Spark, and use software engineering tools and best-practices such as Git and CI/CD
  • Contribute to best practices around analytics engineering: data modeling, metric standardization, data quality, and governance
  • Accelerate analytics development velocity and improve solution quality by using AI-assisted engineering workflows and tools such as Codex, Claude, Copilot, and platform-native assistants like Snowflake Cortex Code and Databricks Genie Code
  • Partner closely with data engineers, analysts, and data scientists to support production-grade analytics and AI applications

Requirements

  • Degree in Computer Science, Engineering, Mathematics, or equivalent experience
  • Experience with managing stakeholders and collaborating with customers
  • Strong written and verbal communication skills required
  • 3+ years working with relational databases and query languages, with deep expertise in SQL
  • 3+ years of data modeling experience, including dimensional modeling, star schema, entity-relationship design, and analytics-focused modeling best practices
  • 3+ years building data pipelines in production, with the ability to work across structured, semi-structured, and unstructured data
  • 3+ years writing clean, maintainable, and robust code in SQL and/or Python
  • 2+ years' experience with dbt Core and/or dbt Cloud required
  • Ability to manage an individual workstream independently
  • Expertise in software engineering concepts and best practices, including version control, testing, documentation, and CI/CD
  • Experience working with cloud data warehouses (Databricks, Snowflake, Google BigQuery, AWS Redshift, Microsoft Synapse) preferred
  • Experience working with other cloud ETL/ELT tools (Fivetran, Matillion, Informatica, Talend, etc.) preferred
  • Experience working with cloud platforms such as AWS, Azure, or GCP preferred
  • Experience defining metrics, semantic layers, and reusable business logic for analytics consumption preferred
  • Familiarity with modern BI and visualization platforms such as Power BI, Tableau, or Sigma preferred
  • Experience enabling or accelerating analytics engineering workflows with AI tools such as Codex, Claude, Copilot, Snowflake Cortex Code, and/or Databricks Genie Code preferred
  • Consulting experience strongly preferred
Benefits

  • N/A
Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
SQL, Python, Spark, data modeling, dimensional modeling, star schema, entity-relationship design, data pipelines, CI/CD, analytics engineering
Soft Skills
stakeholder management, collaboration, written communication, verbal communication, independent work, problem-solving, consulting