
FinOps Engineer, Product Analytics
Intapp
Full-time
Location Type: Hybrid
Location: Lisbon • Portugal
About the role
- Design and maintain analytical data models — schemas, views, and aggregations — that correctly represent FinOps data including cloud cost, cost allocation, anomaly signals, infrastructure usage, and unit economics, with additional domains to follow.
- Build ingestion and normalization pipelines that clean and unify data from heterogeneous sources — AWS and Azure billing, SaaS platforms, product telemetry, and internal systems — into a consistent, queryable analytical layer.
- Develop AI-powered reporting skills in Python using MCP (Model Context Protocol) that return structured, reliable answers to stakeholder questions — covering cost anomalies, team-level spend breakdowns, usage trends, and more as new domains come online.
- Implement prompt engineering and output validation layers that make AI-generated answers trustworthy — handling empty results, ambiguous time ranges, and conflicting data gracefully.
- Build and maintain dashboards in Metabase covering FinOps metrics (showback, chargeback, anomaly signals, unit costs) and expanding to additional domains as the platform grows.
- Collaborate with FinOps experts, product managers, and other domain leads to translate requirements into accurate data models and query patterns that reflect how the underlying data actually behaves.
- Help evaluate and contribute to architecture decisions as new data domains and source systems are brought into the platform.
- Document data models, AI skill patterns, and pipeline logic so the broader team can understand, extend, and trust what you build.
- Continuously improve data quality, query performance, and AI output reliability as the platform scales in volume, variety, and scope.
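The validation concerns listed above (empty results, ambiguous time ranges) can be sketched as a thin guard layer that a reporting tool runs before handing data to an LLM. This is a minimal illustration only; the function and field names (`answer_spend_question`, `Answer`, `resolve_range`) are invented for the example, not part of any Intapp codebase.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Answer:
    """Structured result a reporting skill can return to an LLM caller."""
    status: str               # "ok", "empty", or "ambiguous"
    data: list = field(default_factory=list)
    note: str = ""

def resolve_range(spec: str, today: date):
    """Map a stakeholder phrase to a concrete date range; None if ambiguous."""
    if spec == "last_30_days":
        return today - timedelta(days=30), today
    if spec == "month_to_date":
        return today.replace(day=1), today
    return None  # e.g. "recently" -- refuse to guess rather than hallucinate

def answer_spend_question(rows: list, range_spec: str, today: date) -> Answer:
    """Filter cost rows to a validated window, flagging edge cases explicitly."""
    window = resolve_range(range_spec, today)
    if window is None:
        return Answer("ambiguous", note=f"Cannot interpret time range {range_spec!r}")
    start, end = window
    hits = [r for r in rows if start <= r["day"] <= end]
    if not hits:
        return Answer("empty", note="No cost rows in the requested window")
    return Answer("ok", data=hits)
```

Returning an explicit `status` instead of a bare list lets the prompt layer tell the model the difference between "spend was zero" and "the question was underspecified".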
Requirements
- Strong SQL skills — complex aggregations, window functions, joins across large datasets, and the instinct to investigate when a number looks wrong before shipping.
- Python proficiency — clean, maintainable scripts for data transformation, normalization, and tooling (not just notebooks).
- Experience with analytical or columnar databases — such as ClickHouse, BigQuery, Redshift, DuckDB, or similar.
- Demonstrated experience building with an LLM API (Anthropic, OpenAI, or similar) — you understand tool use, structured outputs, and what it takes to make AI responses consistent and reliable.
- Ability to work from well-defined requirements: translating a subject matter expert’s question into a data model and query that accurately answers it.
- Comfort in a greenfield environment — able to make sound architectural decisions when patterns are still being established.
- Familiarity with MCP (Model Context Protocol) or similar function-calling / tool-use patterns for LLMs (preferred).
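As a rough illustration of the window-function fluency named above, the snippet below computes a per-team running cost total, a typical showback aggregation, using Python's standard-library sqlite3 (SQLite supports window functions since 3.25). The table and column names are invented for the example.

```python
import sqlite3

# In-memory toy table standing in for a daily cloud-cost fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE daily_cost (team TEXT, day TEXT, cost REAL);
    INSERT INTO daily_cost VALUES
        ('platform', '2024-06-01', 100.0),
        ('platform', '2024-06-02', 140.0),
        ('data',     '2024-06-01',  80.0),
        ('data',     '2024-06-02',  60.0);
""")

# Running total per team via a window function: partition by team,
# order by day, and accumulate cost within each partition.
rows = conn.execute("""
    SELECT team, day, cost,
           SUM(cost) OVER (PARTITION BY team ORDER BY day) AS running_cost
    FROM daily_cost
    ORDER BY team, day
""").fetchall()

for team, day, cost, running in rows:
    print(team, day, cost, running)
```

The same `PARTITION BY ... ORDER BY` shape carries over to the columnar engines listed above (ClickHouse, BigQuery, Redshift, DuckDB), which is why window functions are called out separately from plain aggregations.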
Benefits
- A state-of-the-art facility with a fully stocked kitchen – only a 5-minute walk to/from Gare do Oriente.
- A hybrid work system supporting agile and flexible hours.
- Attractive compensation – including competitive base pay and performance-based variable pay.
- Equity/Stock in Intapp.
- Opportunity to travel to other development centers for product training and cross-site collaboration.
- One-time home office stipend.
- Generous paid parental leave (including adoptive leave), marriage leave, bereavement leave, carer's leave, and paid sick days.
- Meal allowance.
- Reimbursement for training towards continuing education.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
SQL, Python, data transformation, data normalization, analytical databases, ClickHouse, BigQuery, Redshift, DuckDB, LLM API
Soft Skills
collaboration, problem-solving, attention to detail, communication, adaptability, architectural decision-making, requirement translation, data quality improvement, trust building, documentation