Advanced · 12 min read · AI Strategy · Chris Tansey

ROE vs ROI: The Right Way to Measure AI Value

Why 95% of AI projects 'fail' ROI—and why Return on Efficiency (ROE) is the metric that actually captures AI value.

Published December 1, 2024

Tags: ROE, ROI, AI metrics, digital capital, enterprise AI

Here's a statistic that has paralyzed boardrooms across the Fortune 500: 95% of generative AI projects fail to deliver measurable return on investment within six months.

Ninety-five percent. Despite $30-40 billion in enterprise AI investment.

If this stat is accurate, AI is the largest collective delusion in business history. Except that's not what's happening. The stat is real. The interpretation is wrong.

The Measurement Trap

The problem isn't that AI doesn't work. The problem is that ROI, as traditionally calculated, is the wrong metric for transformation investments.

ROI asks a simple question: Did you make more money than you spent? That question makes sense for a new production line or a marketing campaign. You invest, you measure incremental revenue, you calculate return.

AI doesn't work that way. The value of AI is capability expansion—things you can now do that you couldn't before:

  • Capacity gain — Your 3-person dev team ships like a 6-person team
  • Process acceleration — Two sprints become one
  • Error reduction — Catch defects humans miss
  • Skill amplification — Junior analysts produce senior-level work

These are real. They drive real business outcomes. But they don't register as "revenue" in the six-month window that ROI calculations demand.

The Capacity Question

Stop asking "what's the ROI?" Start asking "what can you now do that you couldn't before?"

The answer to that question is the real measure of AI value.

When you deploy a coding agent, the ROI calculation asks whether the agent generated revenue. But that's not the value. The value is that your three-person development team now ships features at the rate of a six-person team. Your engineering capacity doubled.

Consider what happens when AI automates a back-office process:

  • You don't hire the three new analysts you would have needed
  • You don't outsource to a BPO firm
  • You don't miss deadlines that would have cost you contracts

All of this is real value. None of it shows up as "revenue generated by AI" in an ROI calculation.
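To see the gap concretely, put rough numbers on the avoided costs above. Every figure below is a hypothetical placeholder for illustration, not data from this article:

```python
# Back-of-the-envelope cost avoidance from automating one back-office process.
# All figures are hypothetical placeholders, not real benchmarks.
analysts_not_hired = 3
cost_per_analyst = 120_000      # assumed fully loaded annual cost per analyst
bpo_contract_avoided = 200_000  # assumed annual outsourcing fee

avoided_cost = analysts_not_hired * cost_per_analyst + bpo_contract_avoided

print(avoided_cost)  # 560000 -- real value, yet it never appears as "AI revenue"
```

That six-figure number is exactly the kind of value an ROI calculation never sees, because no revenue line item was created.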

Enter ROE: Return on Efficiency

The framework you need has a name: Return on Efficiency (ROE).

ROE shifts the measurement from "did we make money?" to "are we operating better?" It captures the real value that AI creates—value that ROI systematically misses.

The ROE Framework: Five Dimensions

1. Efficiency Metrics — Operational improvements

  • Time saved per process
  • Throughput per employee
  • Cycles eliminated
  • If your analysts spent 4 hours on a report and now spend 40 minutes, that's measurable efficiency that compounds

2. Quality Metrics — Output improvement

  • Error reduction rates
  • Customer satisfaction scores
  • Decision accuracy
  • If your AI system catches defects that humans missed, that quality improvement has real value

3. Capability Metrics — Expansion of what's possible

  • New tasks enabled
  • Skill amplification
  • Complexity handled
  • If your team can now take on analysis that was previously outsourced, that's capability expansion

4. Strategic Metrics — Positioning improvement

  • Market responsiveness
  • Innovation velocity
  • Competitive differentiation
  • If you're releasing features twice as fast as competitors, that strategic advantage compounds

5. Human Metrics — Workforce impact

  • Employee satisfaction
  • Retention rates
  • Learning velocity
  • If your best people stay because they're doing more interesting work, that retention value is enormous
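The first three dimensions above are directly measurable from before/after snapshots. Here is a minimal sketch of what that looks like in code; the class name, fields, and ratios are illustrative assumptions, not a prescribed formula from the framework:

```python
from dataclasses import dataclass

@dataclass
class ROESnapshot:
    """Before/after measurements for one deployed AI workflow (illustrative)."""
    hours_per_task_before: float
    hours_per_task_after: float
    defects_missed_before: int
    defects_missed_after: int
    tasks_per_person_before: float
    tasks_per_person_after: float

    def efficiency_multiplier(self) -> float:
        # Dimension 1: a 4-hour report that now takes 40 minutes is a 6x gain
        return self.hours_per_task_before / self.hours_per_task_after

    def error_reduction(self) -> float:
        # Dimension 2: fraction of previously missed defects now caught
        return 1 - self.defects_missed_after / self.defects_missed_before

    def throughput_gain(self) -> float:
        # Dimension 3: capacity expansion per person
        return self.tasks_per_person_after / self.tasks_per_person_before


# Example: the 4-hour -> 40-minute report from Dimension 1
snap = ROESnapshot(4.0, 40 / 60, 20, 5, 10, 20)
print(snap.efficiency_multiplier())  # 6.0
```

The strategic and human dimensions resist this kind of arithmetic; they show up in retention rates and release cadence over quarters, not in a weekly ratio.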

The Gym Membership Analogy

Buying AI tools is like buying a gym membership. The membership itself produces no value. Going to the gym produces value.

What does "going to the gym" look like for AI?

  • Wiring the model into a ticket routing system
  • Connecting the coding agent to your CI/CD pipeline
  • Building the workflow that runs every time a customer submits a form

The tool is not the asset; deployment is the asset. An organization that "bought AI" but hasn't deployed it into workflows has purchased potential, not capacity.

Potential depreciates rapidly. Capacity compounds.

Practical ROE Tracking

Track These Metrics Weekly

Metric | What to Measure | Why It Matters
Time Saved | Hours saved per workflow | Direct efficiency gain
Throughput | Tasks completed per person | Capacity expansion
Error Rate | Defects caught vs. missed | Quality improvement
Cycle Time | Start-to-finish duration | Speed advantage
Capability | New tasks now possible | Strategic positioning
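A tracking sheet like this one can live in a few lines of code before it ever justifies a dashboard. The field names below simply mirror the table columns and the numbers are hypothetical; treat this as a sketch, not a prescribed schema:

```python
from statistics import mean

# One row per week, mirroring the tracking table (values are hypothetical).
weekly_log = [
    {"week": 1, "hours_saved": 12, "tasks_per_person": 18,
     "defects_caught": 9, "defects_missed": 3, "cycle_days": 10},
    {"week": 2, "hours_saved": 19, "tasks_per_person": 22,
     "defects_caught": 11, "defects_missed": 2, "cycle_days": 8},
]

def error_rate(row: dict) -> float:
    """Share of defects that slipped through in a given week."""
    total = row["defects_caught"] + row["defects_missed"]
    return row["defects_missed"] / total

avg_hours_saved = mean(r["hours_saved"] for r in weekly_log)
print(avg_hours_saved)              # 15.5 hours/week of reclaimed capacity
print(error_rate(weekly_log[-1]))   # latest-week miss rate, trending down
```

The point is cadence, not tooling: a weekly append to a list like this is enough to show whether capacity is actually compounding.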

The Envelope Test

Before deploying any AI, answer these questions:

  1. What capacity does this add? (Not revenue—capacity)
  2. Where will the efficiency show up? (Time, quality, throughput)
  3. How will we measure it? (Specific metrics, tracked weekly)
  4. What becomes possible that wasn't before? (Capability expansion)

From Expense to Asset

The implication is operational, not theoretical:

Stop tracking AI as an IT expense. Start tracking it as capacity on your balance sheet.

The numbers will tell a different story—one where 95% of your AI investments are actually working.


Key Takeaways

  • ROI is the wrong metric for AI—it measures too early with the wrong yardstick
  • ROE (Return on Efficiency) captures what AI actually produces: capacity, not direct revenue
  • Five dimensions: Efficiency, Quality, Capability, Strategic, Human metrics
  • The gym analogy: Buying AI is buying potential; deploying AI is building capacity
  • Track weekly: Time saved, throughput, error rates, cycle time, new capabilities

This framework is from Chapter 2 of Scaling Digital Capital: The Architect's Blueprint by Chris Tansey. Get the full framework for building AI-augmented organizations.