G2i Inc.

Cost Effectiveness Engineer

Full-time

Location: 🇺🇸 United States

Job Level

Mid-Level / Senior

Tech Stack

AWS, Azure, Cloud, Google Cloud Platform, Grafana, Kubernetes, Python, Splunk, Tableau

About the role

  • Lead efforts to keep cloud and tooling spend lean, optimized, and clearly understood across the organization, redesigning cost-effectiveness structures with AI-driven strategies and building the dashboards, systems, and processes that support them.
  • Thrive at the intersection of technology, data, and people, building solutions that save money without slowing innovation.
  • Track and analyze spend across AWS, Azure, GCP, Snowflake, Fivetran, dbt, Datadog, Splunk, Grafana, and other SaaS platforms.
  • Build dashboards in Power BI, Qlik, or similar tools to visualize spend and usage patterns.
  • Apply AI/ML techniques for anomaly detection, forecasting, and automated insights.
  • Integrate cost management data via APIs (AWS Cost Explorer, Datadog, Kubernetes cost APIs, etc.) for real-time monitoring.
  • Compare actual spend against budgeted forecasts and escalate when thresholds are exceeded (see the sketch after this list).
  • Support root cause analysis using usage data and AI-driven insights.
  • Design automated processes for forecast expansion or budget increase requests.
  • Lead weekly cost usage reviews with engineering, DevOps, and product leads.
  • Drive a proactive culture of cost awareness and accountability.
  • Partner with teams to optimize data workflows and observability configurations without sacrificing performance.
  • Define and enforce tagging and labeling strategies to ensure accurate cost attribution.
  • Collaborate with DevOps to enforce compliance and traceability.
  • Influence vendor negotiations, license optimization, and procurement cycles with AI-informed forecasting.
  • Manage incidents related to data ingestion, analytics anomalies, and platform access.
  • Coordinate real-time response efforts with Engineering, Product, DevOps, and Operations.
  • Ensure uptime and health metrics align with SLAs across data pipelines and services.
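
To ground the API-integration and budget-escalation bullets above, here is a minimal Python sketch that pulls the previous day's per-service spend from AWS Cost Explorer (boto3's get_cost_and_usage) and flags services that exceed a daily budget. The DAILY_BUDGETS dict and the specific services named are hypothetical placeholders, not details from this posting.

```python
"""Minimal sketch: flag services whose daily AWS spend exceeds budget.

Assumes boto3 credentials are already configured; DAILY_BUDGETS is a
hypothetical stand-in for a real budget source.
"""
import datetime

import boto3

# Hypothetical per-service daily budgets in USD.
DAILY_BUDGETS = {
    "Amazon Elastic Compute Cloud - Compute": 500.0,
    "Amazon Simple Storage Service": 120.0,
}

def yesterdays_spend_by_service() -> dict[str, float]:
    """Return {service: unblended cost in USD} for the previous day."""
    end = datetime.date.today()
    start = end - datetime.timedelta(days=1)
    resp = boto3.client("ce").get_cost_and_usage(
        TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
        Granularity="DAILY",
        Metrics=["UnblendedCost"],
        GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
    )
    spend: dict[str, float] = {}
    for period in resp["ResultsByTime"]:
        for group in period["Groups"]:
            spend[group["Keys"][0]] = float(
                group["Metrics"]["UnblendedCost"]["Amount"]
            )
    return spend

def over_budget_alerts(spend: dict[str, float]) -> list[str]:
    """Compare actual spend to budget and build escalation messages."""
    return [
        f"{svc}: ${spend.get(svc, 0.0):,.2f} actual vs ${budget:,.2f} budgeted"
        for svc, budget in DAILY_BUDGETS.items()
        if spend.get(svc, 0.0) > budget
    ]

if __name__ == "__main__":
    for alert in over_budget_alerts(yesterdays_spend_by_service()):
        print("BUDGET ALERT:", alert)
```

In practice the alerts would feed Slack, PagerDuty, or a ticketing queue rather than stdout, and the budget table would live in a shared source of truth rather than a hard-coded dict.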

Requirements

  • Strong knowledge of cloud platforms and cost models (AWS, GCP, Azure).
  • Hands-on experience with dashboards and BI tools (Power BI, Qlik, Tableau, etc.).
  • Developer-level comfort with scripting, APIs, and automation (Python preferred).
  • Experience integrating and analyzing data from SaaS and observability tools.
  • Proven ability to apply AI/ML techniques for forecasting, anomaly detection, and automation (a minimal example follows this list).
  • Excellent communication and cross-functional collaboration skills.
  • FinOps or DevOps experience managing budgets, chargebacks, or tooling strategy.
  • Familiarity with tagging governance, approval workflows, and cost attribution models.
  • Exposure to platforms like AWS Cost Explorer, GCP Billing, CloudHealth, or Datadog Cost Management.
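
As one minimal illustration of the anomaly-detection requirement, the sketch below flags unusual days in a daily-spend series using a rolling z-score. This is a deliberately simple statistical baseline under assumed inputs, not the specific technique the role would deploy; production pipelines might use seasonal forecasting models or managed services such as AWS Cost Anomaly Detection instead.

```python
"""Minimal sketch: rolling z-score anomaly detection on daily spend."""
import statistics

def spend_anomalies(daily_spend: list[float], window: int = 7,
                    threshold: float = 3.0) -> list[int]:
    """Return indices of days deviating more than `threshold` standard
    deviations from the trailing `window`-day mean."""
    anomalies = []
    for i in range(window, len(daily_spend)):
        history = daily_spend[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.stdev(history)
        if stdev > 0 and abs(daily_spend[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# A steady ~$100/day baseline with one runaway day at index 7.
series = [100, 102, 98, 101, 99, 103, 100, 250, 101]
print(spend_anomalies(series))  # -> [7]
```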