Total Innovation Podcast

36. Expected Value - Act 4 Chapters 12 & 13

The Infinity Loop Season 3 Episode 36

In this episode, we move to Act 4, where the spotlight moves from frameworks to people. Freya's team can score ideas, prioritize bets, and balance the portfolio. But now they face the harder challenge, embedding innovation into the culture itself. 

We'll see how they bridge the gap between potential and performance, turning XV into realised value, and we'll explore the table of justice, a transformative way to judge decisions by evidence, not emotion. 

This is where innovation becomes lived, not just measured. 


SPEAKER_00:

Welcome back to our journey through expected value. In this episode, we move to Act 4, where the spotlight moves from frameworks to people. Freya's team can score ideas, prioritize bets, and balance the portfolio. But now they face the harder challenge, embedding innovation into the culture itself. We'll see how they bridge the gap between potential and performance, turning XV into realised value, and we'll explore the table of justice, a transformative way to judge decisions by evidence, not emotion. This is where innovation becomes lived, not just measured. Let's dive in.

SPEAKER_02:

We've been calping all the pilots, posting metrics on the wall, but when the CFO is asking, where's the value in it all?

SPEAKER_00:

Act Four: Building a Total Innovation Culture.

SPEAKER_03:

You can't get to courage without walking through vulnerability.

SPEAKER_00:

Tools help us decide, frameworks help us focus, metrics help us reflect, but only culture can sustain. By this point in the journey, Freya and her team have built a full spectrum innovation system. They can evaluate ideas, prioritize bets, align to strategy, balance the portfolio, govern decisions, and adapt through learning. But even the best system will erode over time unless the people who use it believe in it.

SPEAKER_03:

The human element.

SPEAKER_00:

The transformation from innovation theater to innovation performance isn't complete when the frameworks are in place. The hardest part and the most crucial is embedding these approaches into the organization's cultural DNA. This is where most transformation efforts falter. After the initial excitement of new tools and approaches, old habits resurface. Emotional attachment to ideas overrides data-driven decisions. Political considerations seep back into resource allocation. The pressure for certainty undermines honest confidence assessment. Without addressing these deeper cultural patterns, even the most elegant innovation system will gradually revert to the mean, becoming another well-intentioned initiative that failed to stick. Act four is about making innovation not just something we do, but something we are.

SPEAKER_03:

From metrics to mindsets.

SPEAKER_00:

The final frontier of innovation performance isn't technical, it's psychological. It's about the language leaders use when discussing uncertainty. The stories they tell about failure and learning, the behaviors they reward in public and in private, the courage they demonstrate when faced with difficult truths. It's about creating environments where people feel safe being honest about confidence levels rather than inflating them for political protection. Where teams celebrate learning as much as launching. Where kill decisions are recognized as evidence of discipline, not defeat. This cultural layer isn't soft or optional, it's the invisible infrastructure that allows everything else to work. Without it, frameworks become empty rituals, metrics become manipulation tools, and governance becomes bureaucratic theatre.

The integration challenge. Beyond culture itself lies integration, the degree to which innovation performance is woven into the broader organizational fabric rather than existing as a parallel system. This integration happens at multiple levels. It's in how innovation insights inform strategy rather than merely executing it. How innovation metrics connect to business metrics rather than competing with them. How innovation talent flows between core operations and edge exploration rather than being siloed in either. True integration means innovation becomes less visible as a separate function and more apparent as a distributed capability. A way of working that permeates the organization rather than residing in a specific department or team.

As we look toward the future of innovation performance, we must resist the temptation to focus solely on more sophisticated metrics, more powerful AI, or more complex frameworks. The limiting factor in most organizations isn't analytical sophistication, it's the quality of the interface between analysis and action. As we've seen in fields like sports analytics, having the best data in the world, generated and analyzed by the brightest minds in the sport, is of little or no advantage if the signals it can send are not communicated well to the managers, the coaches, the scouts, and the players who can benefit from it. What matters most is how effectively the system bridges the gap between evidence and experience, between measurement and meaning. The organizations that will excel in innovation performance won't necessarily be those with the most advanced metrics, they'll be those that create the most effective human interfaces around those metrics, the cultures, relationships, and communication approaches that turn analytical insights into better decisions.

In the chapters that follow, we'll explore how to build this cultural foundation for sustainable innovation performance. We'll see how Freya's team moves from metrics to mindsets, confronting the subtle but powerful psychological barriers to honest innovation assessment. We'll examine how they navigate the relationship between innovation culture and corporate culture, finding ways to create supportive microclimates without requiring wholesale organizational change.

This is the final and perhaps most important piece of the innovation performance puzzle. Because the difference between organizations that occasionally innovate and those that consistently create value through innovation isn't just in their processes, metrics, or governance, it's in how innovation lives in the organizational psyche, how it shapes decision making at all levels, how it influences assumptions about what's possible and what's worth trying. In short, it's about building a total innovation culture, one where performance isn't just measured, but lived, day by day and decision by decision.

SPEAKER_03:

Let's begin. Act Four, Chapter Twelve: From Expected to Realised Value.

SPEAKER_00:

Innovation doesn't create value when it launches. It creates value when it lands. Freya studied the quarterly executive dashboard with a quiet sense of satisfaction. For the first time it wasn't just displaying XV scores, confidence movements, or portfolio distributions. There, in a simple visual, was a new metric she'd been working toward for months: realized value (RV). The data showed that six innovation initiatives had moved from potential to practice, collectively delivering $1.9 million in measured value, some through cost reduction, others via revenue growth, and two through quantifiable risk mitigation. More importantly, each had a clear trail connecting its original XV projection to what it had actually delivered. David, the CFO, who once asked that pivotal question, what's all this actually worth, had been the first to notice. So we're tracking not just what we think things might be worth, he observed during the review, but what they actually deliver when implemented. Exactly, Freya responded. Expected value tells us where to place our bets. Realized value tells us when we've won them. But as she looked at those numbers now, Freya knew this wasn't the end of something, it was just the beginning. Value realised wasn't a finish line, it was the starting point for the next cycle of innovation, learning, and growth. The journey from XV to RV represented the culmination of everything they had built: the confidence scoring from chapter four, the strategic fit radar from chapter five, the S curve positioning from chapter seven, and the governance systems from chapters eight and nine. Now the circle was complete.

SPEAKER_03:

The value realization gap.

SPEAKER_00:

For most organizations, innovation suffers from a critical disconnect between potential and practice, between the value an idea might deliver and the value it actually creates when implemented. This is the value realization gap. The gap exists for many reasons. There are implementation challenges, adoption resistance, capability limitations, shifting priorities, or simple execution fatigue. But the result is the same: promising innovations that showed strong expected value fail to deliver corresponding realized value (RV). This gap isn't just disappointing, it's existentially threatening to innovation functions. When innovation consistently fails to bridge this gap, it reinforces the perception that innovation is all promise and no performance. It undermines the credibility of value projections, even when they're thoughtfully constructed using the XV model, and it creates a cycle of diminishing investment as decision makers lose faith in innovation's ability to deliver measurable returns. We've gotten much better at predicting what innovations could be worth, Freya explained to her team. Now we need to get equally good at ensuring they actually deliver that value. This was the next frontier for Freya's innovation system, transforming it from a mechanism for identifying high potential opportunities into an engine for realizing tangible value from those opportunities. Not occasionally, not accidentally, but systematically and predictably.

SPEAKER_03:

From XV to RV: the missing link.

SPEAKER_00:

The journey from expected value to realized value isn't automatic. It requires deliberate bridges that many innovation systems lack. The first step in building these bridges was developing a clear definition of realized value that could be consistently applied across different types of innovation. Realized value is the measurable contribution an innovation makes to business outcomes after implementation, measured against strategic, financial, operational or risk dimensions using predetermined metrics and time frames. This definition established several critical principles. One, measurability: RV isn't abstract or anecdotal, it requires concrete measurement against predetermined metrics. Two, multidimensionality: value can be realized across different domains, strategic, financial, operational, risk, rather than reduced to a single financial figure. Three, time-bound: value realization occurs within specific time frames, not indefinitely or theoretically. Four, outcome-focused: RV measures contribution to business outcomes, not just innovation outputs. With this definition as a foundation, Freya and Axel developed a structured approach to connecting expected value projections with realized value tracking. Every XV score is a hypothesis. The formula, XV = confidence x predicted value x time sensitivity x strategic fit, reflects what we believe an idea could be worth based on what we know. But innovation doesn't live in the theoretical. At some point, every pilot either delivers or it doesn't.
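As a quick illustration, the XV formula can be expressed directly in code. This is a minimal sketch; the function name, argument names, and sample figures are illustrative assumptions, not from the book:

```python
# Minimal sketch of the XV formula as stated in the chapter:
# XV = confidence x predicted value x time sensitivity x strategic fit.
# Names and sample figures are illustrative assumptions.
def expected_value(confidence, predicted_value, time_sensitivity, strategic_fit):
    """Score an idea's expected value from the four XV inputs."""
    return confidence * predicted_value * time_sensitivity * strategic_fit

# A hypothetical idea: 70% confidence, $500,000 predicted value,
# moderate time pressure (1.2), good strategic fit (0.8).
xv = expected_value(0.7, 500_000, 1.2, 0.8)
print(f"${xv:,.0f}")  # $336,000
```

Because confidence, time sensitivity, and strategic fit are all multipliers, a weakness in any one input drags the whole score down, which is exactly the behavior the chapter relies on for prioritization.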

SPEAKER_03:

Every promise becomes a performance. Tracking the customer intelligence platform: a value journey.

SPEAKER_00:

To illustrate how this works in practice, let's follow the journey of one significant innovation through Freya's system, the customer intelligence platform.

Initial XV calculation. Confidence: 0.8, strong evidence from lead users and a validated prototype. Predicted value: $600,000, an annual revenue opportunity. Time sensitivity: 1.3, the market window closing as competitors develop similar capabilities. Strategic fit: 0.9, strong alignment with the customer intimacy strategy. Resulting XV: $561,600. This score placed the customer intelligence platform in the top quartile of innovation opportunities, securing the resources needed for development and implementation.

Implementation journey. As the platform moved from concept to deployment, the team tracked both the evolving XV as they gained new information and the emerging RV as initial benefits materialized. Month three: early adoption by sales teams revealed higher than expected efficiency gains but slower customer adoption. Month six: XV adjusted to $520,000 based on the revised timeline, with an initial RV of $90,000 measured from operational improvements. Month nine: integration with existing systems completed, RV increased to $210,000. Month twelve: full customer adoption achieved, RV reached $480,000.

Value conversion analysis. When they compared the final RV ($480,000) to the initial XV ($561,600), they calculated a value conversion rate of 85 percent, well above their portfolio average of 74 percent. Analysis revealed several key insights. Strategic fit assessment had been accurate; teams readily adopted the platform because it aligned with their priorities. They had slightly overestimated initial confidence, particularly regarding integration complexity.
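The platform's value journey can be sketched numerically using the figures above; the variable and function names are my own, not the book's:

```python
# RV milestones for the customer intelligence platform, per the chapter.
initial_xv = 561_600                                # initial expected value
rv_by_month = {6: 90_000, 9: 210_000, 12: 480_000}  # emerging realized value

def value_conversion_rate(realized, expected):
    """Share of expected value that actually materialized."""
    return realized / expected

rate = value_conversion_rate(rv_by_month[12], initial_xv)
print(f"{rate:.0%}")  # 85%
```

Tracking the ratio at each milestone, rather than only at the end, is what turns the single conversion number into the ongoing calibration loop the chapter describes.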
The time sensitivity estimate had been accurate as competitors launched similar capabilities within the projected window. This detailed tracking provided both accountability for the specific innovation and learning to improve future XV assessments. It transformed XV from a one-time prioritization tool into an ongoing forecast discipline, constantly calibrated by reality. The shift from expected to realized value is where innovation meets accountability. It's also where the real learning happens. That's why Freya added a new column to her dashboard: RV, realized value. For launched initiatives, RV replaced the XV forecast. For initiatives in scale-up mode, they tracked emerging RV alongside updated XV scores. For every project they recorded the gap, XV versus RV. This did two things. One, it showed where they were over- or underperforming expectations. Two, it turned XV into a continuous learning engine. When XV and RV were closely aligned, the team gained confidence in their scoring. When the gap was large, they dove into why: were they overestimating confidence, misjudging predicted value, or missing strategic fit? XV wasn't just a metric, it was now a forecast discipline constantly calibrated by RV. Value typology: beyond financial returns. One of the most significant insights in Freya's value realization approach was recognizing that innovation creates different types of value that cannot all be measured in the same way or on the same timeline. Working with David from Finance, Freya developed a value typology to categorize and track different forms of innovation value. You can access this at the website xvbook.com. As you will see, the framework distinguished six different value types, each with its own measurement approach, metrics, and timeline. The typology transformed how innovation value was discussed, moving beyond the false binary of financial ROI or nothing to a more sophisticated understanding of innovation's multidimensional impact.
The most straightforward category encompassed direct financial impacts: revenue generation, cost reduction, margin improvement, asset utilization optimization, working capital reduction. This type of value was measured through traditional financial metrics and could be directly incorporated into standard financial reporting.

SPEAKER_03:

Strategic value.

SPEAKER_00:

This category captured contributions to long-term competitive position: market share growth, brand strength enhancement, new market entry enablement, competitive advantage building, capability development. Strategic value often required longer time frames to measure and involved both quantitative and qualitative assessments.

SPEAKER_03:

Operational value.

SPEAKER_00:

This focused on improvements to internal operations, efficiency gains, quality improvements, cycle time reduction, error rate decreases, process simplification. Operational value was typically measured through operational key performance indicators and process metrics.

SPEAKER_03:

Risk value.

SPEAKER_00:

This captured risk reduction or mitigation: compliance improvement, security enhancement, resilience building, vulnerability reduction, regulatory requirement satisfaction. Risk value required specialized metrics, often developed in collaboration with risk management, legal, and compliance functions.

SPEAKER_03:

People value.

SPEAKER_00:

This addressed benefits to the workforce: employee experience enhancement, productivity improvement, skill development, well-being promotion, retention improvement. People value was measured through HR metrics and employee feedback systems.

Sustainability value. This focused on environmental and social impact: carbon footprint reduction, waste elimination, resource efficiency improvements, community impact enhancement, social responsibility fulfillment. Sustainability value used specialized environmental and social impact metrics.

Different innovations create different forms of value, Freya explained to the portfolio committee. To measure realized value effectively, we need to match the measurement approach to the value type. This typology transformed how innovation value was discussed, measured, and reported. It moved the conversation beyond the false binary of financial ROI or nothing to a more sophisticated understanding of innovation's multidimensional impact.

The value realization system. To bridge the gap between XV and RV systematically, Freya and Axel developed what they called the value realization system, a structured approach to ensure that potential value became actual value. The system had six core components, each playing an essential role in transforming potential into performance. Together, these components created a comprehensive framework to guide innovations from concept to measurable impact.

One. Value specification. For each innovation that passed initial governance thresholds, the team created a detailed value specification that identified the specific value types the innovation was expected to deliver, defined clear, measurable metrics for each value type, established measurement time frames and milestones, assigned value tracking responsibility to specific individuals or teams, and documented key assumptions underlying value projections.
This specification created a clear contract between the innovation team and the business about what value would be measured and how. Take our procurement automation tool, Freya explained during an implementation briefing. We've specified three value types: financial value of $500,000 in cost savings over 18 months, operational value through a 40% reduction in processing time, and risk value via a 30% decrease in compliance exceptions. Each has defined metrics, measurement points, and clear ownership.

Two. Implementation pathways. For innovations moving toward deployment, the team developed explicit implementation pathways that mapped handoff points between innovation teams and operational units, resource requirements and commitments across the transition, key stakeholder engagements required for successful implementation, critical capability gaps that needed to be addressed, and potential barriers and mitigation strategies. These pathways acknowledged that the journey from prototype to production was often where value leaked away, as innovations lost fidelity or momentum during implementation. Freya had seen too many promising pilots fail during the transition to operations. Implementation isn't an event, it's a journey with critical waypoints, she reminded her team. Our job isn't done until the value is flowing.

Three. Adoption acceleration. Recognizing that value only emerges when innovations are actually used, the team created adoption acceleration strategies for each significant initiative: user onboarding and enablement approaches, internal communications and change management plans, incentive alignments to encourage adoption, early feedback mechanisms to address barriers, and success measurements and storytelling. These strategies addressed the common pattern where technically successful innovations failed to deliver value because people didn't adopt them effectively.
The team learned this lesson the hard way with an early analytics dashboard that had impeccable technical implementation but languished unused by its target users. A perfect solution that no one uses creates zero value, Axel observed. Now every implementation included structured approaches to drive adoption.

Four. Value capture mechanisms. To ensure that realized value was properly attributed and measured, the team established value capture mechanisms: baseline measurements before implementation, tracking systems for relevant metrics during and after deployment, regular measurement checkpoints aligned with projected value timelines, value attribution methodologies to isolate innovation impact from other factors, and value reporting templates and communication approaches. These mechanisms created accountability for measuring and reporting the actual value delivered, preventing the common scenario where successful innovations never quantified their impact. If we can't measure it, we can't claim it became a mantra for Freya's team. This discipline ensured they could demonstrate credible, defensible value results rather than relying on anecdotes or assumptions.

Five. Learning integration. To turn value realization into a dynamic learning system, Freya integrated feedback loops that compared projected (XV) and realized (RV) value to improve estimation accuracy, identified common implementation barriers to improve future pathways, analyzed adoption patterns to enhance change management approaches, documented value delivery timelines to refine future projections, and captured key insights to improve the value realization process itself. These learning loops ensured that the organization got progressively better at converting potential value into realized value. When their customer onboarding innovation delivered 120% of projected value but took twice as long as expected, the team didn't just celebrate the outcome, they analyzed why their timeline estimates had been so inaccurate.
This insight improved planning for subsequent initiatives, gradually increasing their value conversion rate.

Six. Value amplification. Finally, the system included deliberate value amplification approaches to maximize the impact of successful innovations, including scaling strategies to extend innovations to new contexts, enhancement roadmaps to build on initial success, knowledge transfer mechanisms to apply insights elsewhere, success storytelling to build momentum and cultural reinforcement, and portfolio adjustment based on realized value patterns. This component addressed the common failure to fully capitalize on successful innovations by scaling them effectively. One operations innovation that reduced processing time by 35% in a single department was systematically extended to five additional departments within six months, multiplying its initial value by 400%. Our best value lever isn't just finding new innovations, Freya told the executive team. It's maximizing the value of what's already working. Together, these six components created a systematic approach to value realization.

Closing the loop: the Vitality Index. But Freya still faced a bigger question: how to show the cumulative impact of all these initiatives on the business. Enter the Vitality Index. This deceptively simple KPI asks: what percentage of our revenue is coming from products or services launched in the last three years? It appears in analyst calls. It shows how much of your business is being powered by innovation, and, most importantly, it links directly to the RV of your pipeline. The formula: vitality = aggregate RV divided by total revenue. That made it the perfect top-level KPI for Freya's system. XV measures potential. RV measures delivery. Vitality measures strategic contribution. Now every idea in the portfolio had a traceable path to the P and L.
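The Vitality Index reduces to a one-line ratio. In this sketch, only the formula comes from the chapter; the $50M total revenue figure is a hypothetical assumption (the $1.9M of realized value is the chapter's own number):

```python
# Vitality Index as defined in the chapter:
# vitality = aggregate RV / total revenue (for recent launches).
def vitality_index(aggregate_rv, total_revenue):
    """Share of revenue powered by recently launched innovations."""
    return aggregate_rv / total_revenue

# Illustrative: the chapter's $1.9M of realized value against a
# hypothetical $50M of total revenue.
print(f"{vitality_index(1_900_000, 50_000_000):.1%}")  # 3.8%
```

Because the numerator is aggregate RV, the index only moves when innovations actually deliver, which is what gives it a traceable line to the P and L.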

SPEAKER_03:

The strategic fit connection.

SPEAKER_00:

Freya noticed something particularly interesting about their highest performing innovations. Projects with strategic fit scores above 0.8 in the XV formula consistently showed higher conversion rates from expected to realized value. Strategic fit isn't just about prioritizing the right things, she explained to the executive team. It's a powerful predictor of successful implementation. Analysis of their portfolio revealed that innovations with high strategic fit scores achieved an average value conversion rate of 84%, compared to just 63% for innovations with low strategic fit scores, regardless of their confidence or predicted value ratings. This pattern made intuitive sense. When an innovation aligned strongly with strategic priorities, it naturally received more organizational attention, resources, and executive sponsorship during implementation. People were more motivated to drive adoption when they could see the connection to broader strategic goals. Strategic fit becomes a powerful gravitational force pulling innovations towards successful value realization, Axel observed. It creates a natural path of least resistance through the organization. This insight led Freya to establish a minimum strategic fit threshold of 0.6 for all innovations entering implementation, ensuring they only invested in opportunities with a reasonable chance of delivering their expected value.

SPEAKER_03:

From pipeline to prediction.

SPEAKER_00:

With this link established, Freya took it one step further. She began forecasting the predicted vitality contribution (PVC) of active XV projects. PVC = XV divided by forecasted revenue, times one hundred. For each high-XV initiative they estimated when it might launch, how much of its XV could realistically be realized based on past conversion rates, and its expected contribution to the future vitality index. This gave them a rolling forecast of future innovation contribution. They now had two lenses: actual vitality from realized value, and forecasted vitality from high-XV initiatives in the pipeline. Together these painted a complete picture of innovation's contribution, both current and future.
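A hedged sketch of the PVC forecast follows. The conversion-rate discount reflects the chapter's idea of scaling XV by past conversion rates, but folding it into one function is my own simplification, and the $40M revenue forecast is hypothetical:

```python
# Predicted vitality contribution (PVC) per the chapter:
# PVC = XV / forecasted revenue x 100, here discounted by a past
# conversion rate (an assumption about how the two ideas combine).
def predicted_vitality_contribution(xv, forecast_revenue, conversion_rate=1.0):
    """Forecast an initiative's contribution to the future vitality index (%)."""
    return xv * conversion_rate / forecast_revenue * 100

# Illustrative: a $561,600-XV initiative discounted by a 74% portfolio
# conversion rate, against a hypothetical $40M revenue forecast.
pvc = predicted_vitality_contribution(561_600, 40_000_000, conversion_rate=0.74)
print(f"{pvc:.2f}%")  # 1.04%
```

Summing PVC across the pipeline gives the rolling forecast of future vitality that sits alongside the actual, RV-based index.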

SPEAKER_03:

The XV-RV-Vitality Chain.

SPEAKER_00:

To communicate this clearly, Freya created a simple chain, which you can access at xvbook.com. This chain illustrates the progression from innovation concept to business impact. XV is where belief lives. RV is where proof lives. Vitality is where business impact lives. This created a closed-loop performance system, and for the CFO, it offered something rare: a traceable line from innovation decisions to commercial results.

Measuring conversion efficiency. The final piece: tracking how well the system turned potential into payoff. Conversion rate = RV divided by XV. This showed which types of initiatives were most likely to deliver, where the system might be overconfident, and how innovation effectiveness changed over time. It also revealed improvement opportunities in execution, scaling, and selection. Over time, Freya began publishing quarterly stats: XV delivered per quarter, RV realized from past XV, the vitality index trend (rolling), and conversion efficiency. This was innovation telemetry at its most strategic.

SPEAKER_03:

Beyond Vitality, measuring the innovation system's value.

SPEAKER_00:

As the value realization system matured, Freya recognized the need for portfolio-level metrics that would measure not just individual innovation outcomes, but the overall health and impact of the innovation system itself. Working with David and the finance team, she developed a comprehensive innovation value dashboard that extended beyond the traditional vitality index to capture multiple dimensions of innovation impact. Again, you can view this at xvbook.com. The dashboard tracked six essential metrics that together provided a holistic view of innovation performance. These metrics went far beyond traditional innovation measures like idea counts or launch rates, focusing instead on value creation and delivery.

Value conversion rate. This metric compared realized value to expected value across the portfolio, showing how effectively the organization was converting potential into actual value. It was calculated as value conversion rate = total realized value divided by total expected value. This rate, tracked over time, revealed whether the organization was getting better at delivering on innovation promises. When they first began tracking this metric, the value conversion rate was just 47%; less than half of expected value was actually materializing. Through systematic application of the value realization system, they increased this to 78% within 18 months, a dramatic improvement in value capture. Analysis of patterns in the conversion data exposed specific value leakage points, particularly during the transition from innovation teams to operational units, leading to targeted improvements in implementation processes. A value conversion rate above 70% is considered strong performance, David noted during an executive review. Pushing beyond 80% requires extraordinary execution across the entire value chain.

Value velocity.
This measured how quickly innovations moved from approval to value delivery, calculated as value velocity = realized value divided by time from approval to delivery. This metric highlighted the organization's ability to implement innovations efficiently, preventing value erosion through delays. By comparing value velocity across different types of innovations, they discovered that operational innovations typically delivered value 2.3 times faster than customer-facing ones, despite similar complexity. This insight led to differentiated implementation approaches that reduced the value delivery timeline for customer innovations by 40 percent.
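Value velocity can be sketched the same way. The dollar amounts and month counts below are hypothetical, chosen only so the comparison reproduces the 2.3x ratio mentioned above:

```python
# Value velocity per the chapter:
# value velocity = realized value / time from approval to delivery.
def value_velocity(realized_value, months_to_delivery):
    """Realized value delivered per month of implementation time."""
    return realized_value / months_to_delivery

# Hypothetical comparison: the same value delivered over different timelines.
operational = value_velocity(300_000, 6)        # operational innovation
customer_facing = value_velocity(300_000, 14)   # customer-facing innovation
print(round(operational / customer_facing, 1))  # 2.3
```

The ratio form makes the chapter's point concrete: even with identical realized value, a slower delivery timeline directly erodes velocity.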

SPEAKER_03:

Value diversity.

SPEAKER_00:

This assessed the balance of value types across the portfolio, showing the percentage of total value coming from each category: financial, strategic, operational, risk, people, and sustainability. Value diversity revealed whether the innovation portfolio was delivering a healthy mix of short-term financial returns and longer-term strategic value. Initial analysis showed their portfolio was heavily skewed toward financial and operational value (82 percent), with minimal contribution to the strategic, people, and sustainability dimensions. This imbalance helped explain why some executive stakeholders remained lukewarm about innovation despite good financial results. The portfolio wasn't addressing their priority value domains.

Challenge resolution index. This measured the percentage of strategic challenges that had been substantially addressed through innovation initiatives. The challenge resolution index connected innovation directly to strategic impact, showing how effectively innovation was solving the organization's most important problems. We were innovating successfully but not always on our most pressing challenges, Freya acknowledged. Their initial challenge resolution index of 34 percent revealed significant gaps between innovation activity and strategic priorities, driving portfolio adjustments that increased the index to 65% within a year.

SPEAKER_03:

Value amplification factor.

SPEAKER_00:

This calculated how much additional value was generated through scaling successful innovations beyond their initial context, measured as value amplification equals total value from scaled innovations divided by initial implementation value. This metric revealed the organization's ability to maximize return on successful innovations through effective scaling. Analysis showed that while they were good at creating initial value, they were capturing less than half the potential scaling value. Our innovations were like seeds that sprouted but never fully grew, Axel observed. By implementing systematic scaling approaches, they increased their value amplification factor from 1.8 times to 4.2 times within a year.
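The amplification factor above is another simple ratio. A sketch, with before-and-after figures chosen to match the 1.8x to 4.2x shift described in the text (the dollar amounts themselves are illustrative):

```python
def value_amplification(total_value_from_scaled, initial_implementation_value):
    """Value amplification = total value from scaled innovations / initial implementation value."""
    return total_value_from_scaled / initial_implementation_value

# Illustrative before/after figures matching the 1.8x -> 4.2x improvement
before = value_amplification(1_800_000, 1_000_000)
after = value_amplification(4_200_000, 1_000_000)
```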

SPEAKER_03:

Innovation investment efficiency.

SPEAKER_00:

This compared total innovation value delivered to total innovation investment, calculated as innovation investment efficiency equals total realized value divided by total innovation investment. This efficiency metric provided a portfolio level view of innovation ROI while acknowledging the multi-year nature of innovation returns. Together, these metrics created a comprehensive view of the innovation system's performance that went far beyond traditional measures like number of ideas or pilots launched. They focused explicitly on value creation and delivery, providing executives with clear signals about the return on their innovation investment.

The continuous value cycle.

As Freya studied the value dashboard one afternoon, she realized something important. Value realization wasn't a linear process from idea to outcome. It was a continuous cycle where realized value created the foundation for the next wave of innovation. Value isn't an endpoint, she told Axel. It's a loop. This insight led to the development of what they called the continuous value cycle, a model that connected realized value back to new innovation opportunities, which you can see at xvbook.com. As visualized there, the cycle moves through eight interconnected stages, flowing from challenge identification through implementation and value realization, and then back to new opportunities. This cyclical approach transformed how the team thought about innovation value, positioning it not as a one-time achievement but as part of an ongoing system of value creation. This cycle created a sustainable engine for ongoing value creation, where each round of innovation built on the insights and capabilities developed in previous rounds. In traditional innovation approaches, success is seen as the end of a process, Freya explained to her team. In the continuous value cycle, success is just the beginning of the next wave of innovation. This perspective transformed how the team thought about portfolio management.
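The portfolio-level investment efficiency formula defined above can be sketched the same way, here over a hypothetical three-initiative portfolio (the names, field labels, and amounts are all illustrative, not from the book):

```python
# Hypothetical portfolio; figures are illustrative only.
portfolio = [
    {"name": "ops-automation", "realized_value": 500_000, "investment": 200_000},
    {"name": "customer-portal", "realized_value": 300_000, "investment": 250_000},
    {"name": "ai-pilot", "realized_value": 0, "investment": 150_000},  # not yet delivering
]

total_rv = sum(p["realized_value"] for p in portfolio)
total_investment = sum(p["investment"] for p in portfolio)
efficiency = total_rv / total_investment  # portfolio-level innovation ROI
```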
Instead of focusing solely on moving individual innovations from concept to implementation, they began thinking about how to maintain momentum across multiple cycles of value creation.

From vanity to vitality.

Innovation teams have long relied on vanity metrics like ideas submitted, workshops held, engagement scores, but these do little to show how innovation contributes to growth. The Vitality Index changes the conversation. It elevates innovation to the level of business performance. It links effort to impact, and when used in tandem with XV and RV, it becomes the scoreboard for innovation maturity. As Freya stood before her team at the next quarterly review, she clicked through the slides. Pipeline XV, up eighteen percent this quarter. Realized value, RV, plus twenty-three percent year over year. Vitality Index, twenty-seven percent of total revenue. Conversion efficiency, zero point nine one. No one asked about idea volume, no one asked about pilot counts. They saw the signal, they saw the contribution, they saw performance. David, who had once questioned the worth of all their innovation activity, now nodded in agreement. The difference isn't just in what we measure, he observed. It's in how we think. We're not just doing innovation anymore, we're creating value through innovation. And that's a fundamental shift. What changed your perspective, David? asked the CMO. David considered the question carefully. Three things, he replied. First, connecting XV to measurable outcomes made innovation concrete, not abstract. Second, seeing the strategic fit component in action showed me how innovation directly supports our core priorities. And third, he gestured to the vitality index on screen. This number gives us a genuine compass for innovation's contribution to our future. The CMO nodded. For me it's the pipeline visibility. I can now see three years into our innovation future with a clear picture of how today's XV becomes tomorrow's business performance.
As the meeting ended, Freya made a note in her journal. Value realised isn't just about numbers on a dashboard, it's about creating something that matters to our customers, our colleagues, our community. The metrics help us see it, but the meaning comes from the impact itself. That was the true transformation their work had enabled. Not just a better measurement system, but a deeper connection between innovation activity and meaningful impact. Not just better numbers, but better contributions to what mattered most. And for the first time, innovation wasn't asking for belief.

SPEAKER_03:

It was delivering proof. Too long; didn't read.

SPEAKER_00:

The journey from innovation concept to business impact requires a systematic approach to bridge the gap between expected value, XV, and realized value, RV. Freya's value realization system transforms potential into performance through six interconnected components: value specification, implementation pathways, adoption acceleration, value capture, learning integration, and value amplification. This system recognizes that innovation creates multiple value types: financial, strategic, operational, risk, people, and sustainability, each requiring different measurement approaches. By connecting RV to the Vitality Index, the percentage of revenue from recent innovations, organizations establish a clear line from innovation decisions to commercial results. Advanced metrics like value conversion rate, value velocity, and value amplification factor provide comprehensive performance visibility, while the continuous value cycle ensures innovation isn't a one-time event, but an ongoing engine for value creation. Strategic fit proves to be a crucial predictor of implementation success, with high-fit innovations achieving significantly better conversion rates. This systematic approach transforms innovation from promises into proof, elevating it from a creative function to a strategic capability that consistently delivers measurable business impact.

Chapter thirteen. The Table of Justice.

FC Midtjylland would have fired their manager, Axel said, half smiling, if they hadn't been paying attention to the data. Freya looked up from her coffee. What? she said. FC Midtjylland, Danish football club. They were down at the bottom of the league, poor results, all the usual drama. But, he said, they held their nerve, because when they looked at the underlying data, xG, passes completed, high-value chances, they saw something else. The team was doing the right things. The results hadn't landed yet, but the process was working. Freya leaned back in her chair. And were they right? Axel nodded.
Turned it around. Qualified for Europe, people called them geniuses. Freya smirked.

SPEAKER_03:

Lucky geniuses. He grinned. Still counts. The off-site conversation.

SPEAKER_00:

They were sitting in the small corner booth in the cafe across from their office, a place they had started calling the off-site, despite being forty-seven steps from their door. It had become their space for reflective conversations. The moments between the big moves. Today's topic: the kill list. They'd just wrapped their quarterly reallocation cycle. Five ideas had been shut down, two had been reframed, several others were stalled. It wasn't a crisis. But Freya could feel the weight of it. We're getting better at moving fast, she said, staring into her cup. But are we getting better at learning? Axel opened his laptop and flipped it around. On screen was a table, rows of projects, their original XV scores, their final scores, confidence deltas, time sensitivity curves, and qualitative notes on why they were killed or paused. At the top, in bold letters, a title: The Table of Justice. Freya laughed. Seriously? Axel shrugged. If it's good enough for football. The reference wasn't random. FC Midtjylland, the Danish club that had been crowned champions for the first time in its history in 2014/15, later found itself in a slump. As pressure mounted on their coach, Jess Thorup, with fans calling for his dismissal, the club's chairman Rasmus Ankersen refused to blink. While everyone else looked at the league table showing disappointing results, Ankersen revealed that the club gave more weight to an alternative table they had devised, one that measured not just points accrued, but the number and quality of chances created and the nature of opportunities conceded. In this alternative assessment, Midtjylland was flying. The club's models showed that Midtjylland was the best team in Denmark, Axel explained. Their table of justice proved that Thorup's team should by rights have had fifteen more points than they'd mustered in reality. And they stuck with the coach based on that? Freya asked. They did. Because what the data said was close to sacrosanct.

SPEAKER_03:

Connecting challenges to outcomes. The missing link.

SPEAKER_00:

As Freya studied Axel's screen, she realized that while the data was compelling, something was missing. This tracks what happened to our ideas, she said, but it doesn't connect back to why we were working on them in the first place. What challenges were we trying to solve? Axel nodded thoughtfully. You're right. We're tracking solution journeys but not challenge resolution. Do you want to add that dimension? I think we need to, Freya replied. If we're truly focusing on challenge driven innovation, our retrospective analysis should start with the challenges, not just the solutions we attempted. This insight led to a significant enhancement of the table of justice. They added a new column that explicitly connected each innovation initiative to the original business challenge it was designed to address. This created a more meaningful way to evaluate outcomes, not just whether the specific solution succeeded, but whether progress was made on the core challenge. This changes how we think about success and failure, Freya observed. An individual solution might not work, but if it generates insights that help us address the challenge in a different way, that's still valuable progress. As they refined this approach, they found that challenges that had appeared stalled when viewed through the lens of specific solutions often showed meaningful advancement when viewed through the lens of challenge resolution. Multiple failed attempts actually represented productive learning toward addressing complex problems. Challenge-driven retrospectives give us a more honest picture of our impact, Freya told her team. They prevent us from confusing solution failure with challenge abandonment. This perspective transformed how the organization thought about innovation performance from a binary view of solution success or failure to a more nuanced understanding of progress toward resolving the challenges that truly mattered to the business. 
It also strengthened the connection to the strategic layering approach they had established in chapter eight, ensuring that retrospective analysis remained tied to strategic priorities.

SPEAKER_03:

From results to process.

SPEAKER_00:

In football, the table of justice shows where teams would be in the league table if outcomes aligned with expected performance. It doesn't rewrite the results, but it helps make sense of them. It recognizes that randomness, timing, and luck all play a role, that effort and process matter. Axel's idea was simple. Could they build a similar table for innovation? Not to rewrite history, but to understand it, not to defend decisions but to learn from them. The concept struck Freya immediately. Their current systems tracked active projects well, those still in the portfolio with momentum and resources. They had become adept at measuring learning velocity, tracking confidence deltas, and optimizing active investments, but they had no systematic way to learn from ended initiatives. When projects were killed or completed, they essentially vanished from the system. The team might remember them individually, but their patterns and lessons weren't captured in a way that informed future decisions. We're like a team that only studies the games we won, Freya realized. What about all the other matches? The ones we lost, the ones we drew, the ones where we played well but got unlucky? There's just as much to learn there, maybe more. They looked at each killed project and asked: what was the original XV? What changed over time? When did we know it was off? Did we act at the right moment? Was the kill decision made by data or fatigue? Did we reuse the learnings?

SPEAKER_03:

Building the table.

SPEAKER_00:

Over the next week, Freya and Axel reconstructed the history of every innovation initiative that had been ended in the past year, whether through formal kill decisions, quiet abandonment, or successful completion. It wasn't easy, documentation was scattered, memories were fuzzy, and in some cases, the people involved had moved on, but gradually they assembled what became the first version of the table of justice. For each initiative they captured: one, initial hypothesis and value proposition; two, original confidence assessment and rationale; three, resources invested, time, budget, attention; four, key learning inflection points, when and how confidence shifted; five, final status and termination rationale; six, counterfactual assessment, what might have happened with different decisions; and seven, knowledge transfer effectiveness, where learnings were reapplied. This approach wasn't without precedent. In football analytics, pioneers had developed metrics that effectively measured expected goals, assessing the quality of chances a team created regardless of results, long before such approaches became mainstream. They had developed sophisticated models that could identify when teams or players were over or underperforming their underlying metrics. What made these analytics valuable wasn't just their mathematical rigor, it was their ability to see beyond immediate outcomes to the patterns that predicted future performance. Just as football analysts could identify that a team creating high quality chances would eventually start scoring more goals, Freya and Axel could spot initiatives that were building strong evidence foundations even if they hadn't yet delivered tangible results. Information is power, Axel pointed out. The more accurate and relevant your information, the more likely you are to win more often than you lose. What emerged from their analysis was powerful. Some ideas had been killed too late, XV had dropped steadily, but no one pulled the plug.
Others had been killed early, but when reviewed were found to have untapped fit in another part of the business. In one, an internal mobility platform had been paused six months ago, but when they reviewed the data, it looked like it was now ready for revival. The timing was off before, not the idea. Look at this pattern, Axel pointed out, highlighting a cluster of initiatives. Every single one of these was killed after confidence plateaued for exactly three months. That's not coincidence. That's our invisible decision rule. If something doesn't show progress for a quarter, we kill it. But that's arbitrary, Freya replied. Some innovations need longer incubation periods, especially those with complex stakeholder landscapes. Exactly. We've been applying a one size fits all rule without even realizing it. Other patterns emerged as they dug deeper. Ideas with strong executive sponsorship lasted an average of four months longer before being killed than those without, despite similar confidence trajectories. Technical innovations were typically killed based on performance data, while business model innovations were more often killed based on subjective assessment. Nearly forty percent of killed initiatives contained components that could have been salvaged and repurposed, but fewer than 10% saw their elements reused. The table of justice wasn't just revealing individual decisions, it was exposing the unwritten rules and invisible biases that shaped their entire innovation system. Visually, the table took the form of a comprehensive dashboard with colour-coded trajectories, showing how XV, confidence, and strategic fit had evolved over time for each initiative. Projects were arranged chronologically with visual indicators showing their kill points, learning extraction events, and any subsequent reuse of components or insights. This visualization made patterns immediately apparent that would have been invisible in traditional spreadsheet formats.
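The seven items captured per ended initiative lend themselves to a simple record type. The sketch below is a hypothetical rendering of that list as a Python dataclass, not a schema from the book; the field names and the example values are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class TableOfJusticeEntry:
    """One row of the table of justice: the seven items captured per ended initiative."""
    initial_hypothesis: str       # 1. initial hypothesis and value proposition
    original_confidence: float    # 2. original confidence assessment (0-1)...
    confidence_rationale: str     #    ...and its rationale
    resources_invested: dict      # 3. time, budget, attention
    inflection_points: list       # 4. when and how confidence shifted
    final_status: str             # 5. final status and termination rationale
    counterfactual: str           # 6. what might have happened with different decisions
    learnings_reused_in: list = field(default_factory=list)  # 7. where learnings were reapplied

# Illustrative example loosely based on the internal mobility platform in the text
entry = TableOfJusticeEntry(
    initial_hypothesis="Internal mobility platform to improve retention",
    original_confidence=0.6,
    confidence_rationale="Strong demand signal, unproven integration path",
    resources_invested={"months": 6, "budget": 250_000},
    inflection_points=["integration blocked", "sponsor support cooled"],
    final_status="paused",
    counterfactual="Viable once a new data architecture lands",
)
```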

SPEAKER_03:

For deeper retrospective analysis.

SPEAKER_00:

As the table of justice concept matured, Freya saw an opportunity to enhance it through artificial intelligence. Working with the technology team, she developed AI capabilities specifically designed to identify patterns in innovation outcomes that might be invisible to human analysis. AI can help us see across hundreds of decisions and thousands of variables simultaneously, she explained to her team. It can identify subtle patterns that shape our outcomes but aren't obvious in case by case review. The AI enhancement took several forms. Decision pattern analysis identified common characteristics of initiatives that were killed too early, too late, or at the optimal time. It revealed that initiatives with visual prototypes tended to survive longer, despite declining XV, while text-based concepts were killed more quickly despite stable confidence signals. Counterfactual modelling simulated alternative scenarios for past decisions, suggesting how different timing or resource allocations might have affected outcomes. This helped the team understand not just what happened, but what could have happened with different approaches. Bias detection identified systematic patterns in decision making that suggested unconscious biases. For instance, the AI revealed that initiatives led by more junior team members were killed faster than those led by senior staff despite similar evidence trajectories. Learning transfer mapping tracked how insights from completed or killed initiatives influenced subsequent work. It could identify when a concept that failed in one context was successfully adapted in another, creating visibility into knowledge flows across the portfolio. One powerful example emerged when the AI analysis identified a recurring pattern that had completely escaped human observation. A set of features that had been included in three separate failed initiatives had actually been critical to the success of two later projects when implemented in a different context. 
Without the AI-powered pattern recognition, the team would never have made this connection. This insight led them to more systematically extract and test feature components across different contexts rather than treating each initiative as an indivisible whole. The AI doesn't judge our past decisions, Axel explained. It helps us understand the patterns that shape them so we can make more conscious choices going forward. This approach transformed retrospective analysis from an occasional anecdotal practice to a systematic learning capability that continuously improved decision quality across the innovation system. It also complemented the learning loops framework they had established in chapter 10, creating a more comprehensive approach to organizational intelligence.

SPEAKER_03:

Finding the edges.

SPEAKER_00:

The deeper they looked, the more they recognized the similarity between their work and what the data-driven football clubs were doing. It wasn't just about measurement, it was about finding hidden advantages. There are always edges, Axel said, quoting Ankersen. Just like Midtjylland and Brentford see it as their task to find those edges in football, whether in nutrition, injury prevention, set pieces, or player recruitment, we need to find the edges in innovation decision making. They discovered one such edge in how they evaluated talent. They found that certain innovation managers consistently generated strong evidence patterns even when their projects didn't reach implementation. We've been evaluating talent based on launch success, Freya realized, but maybe we should be looking at evidence generation quality instead. Another edge emerged in how they structured experiments. Like Midtjylland's set-piece manual with sixty meticulously designed routines, they began creating an experiment playbook, standardized approaches to test specific types of assumptions more efficiently. The aim wasn't statistics as an academic and scientific exercise, but seeing which metrics actually help predict things.

Open innovation, external wisdom for internal decisions.

The table of justice analysis revealed another critical insight. The organization often struggled with the same challenges repeatedly, cycling through similar solution approaches without fundamentally advancing. This pattern led Freya to expand their retrospective analysis beyond organizational boundaries. We need to learn not just from our own history, but from how others have addressed similar challenges, she explained to the governance board. Open innovation isn't just about finding external solutions, it's about learning from external experiences.
Working with research partners, Freya developed what she called challenge resolution benchmarking, a systematic approach to studying how other organizations had tackled challenges similar to their own. This wasn't competitive intelligence in the traditional sense, but a broader study of problem-solving approaches and outcomes across industries. For almost every challenge we face, someone somewhere has already attempted multiple solutions, Axel observed. If we can learn from their experience, we can make more informed decisions about our own approaches. This benchmarking included several components. One, cross-industry pattern analysis identified how similar challenges had been addressed in different contexts, revealing which solution approaches tended to succeed or fail across varied environments. Two, outcome trajectory mapping tracked how solution value had evolved over time in other organizations, helping predict how current initiatives might perform beyond their initial implementation. Three, failure pattern recognition studied common failure modes across industries, identifying early warning signals that might help predict and prevent similar outcomes. And four, solution adaptation analysis examined how successful approaches from other domains had been modified to work in new contexts, providing models for cross-industry knowledge transfer. This open learning approach dramatically expanded the range of insights available for decision making. Rather than being limited to their own experience, the team could draw on a much broader base of evidence when evaluating potential approaches to key challenges. We're not just learning from our own mistakes and successes, Freya told her team. We're learning from everyone's. Initially, this external focus faced resistance from team members who worried about not invented here syndrome or questioned the relevance of other industries' experiences to their specific context. 
Freya overcame this by starting with small, concrete examples where external insights clearly solved internal problems, gradually building credibility for the broader approach. This implementation strategy helped integrate the concepts with the trust building framework established in chapter nine.

SPEAKER_03:

Justice for process, not ideas.

SPEAKER_00:

Freya looked at the dashboard and felt something shift. This wasn't about nostalgia. It was about justice, not for the ideas, but for the effort, for the learning, for the belief that underpins every good idea, even the ones that don't land. When they saw their decisions reflected, understood, and revisited, not judged, they began to speak more freely, share more honestly, own their calls without defensiveness. The table of justice didn't track performance. It tracked wisdom, and wisdom, as it turned out, had a compounding effect.

SPEAKER_03:

Lead user wisdom, capturing grassroots decision intelligence.

SPEAKER_00:

As the table of justice approach matured, Freya recognized that some of the most valuable insights about innovation outcomes weren't coming from formal analysis, but from the informal observations of frontline employees, the same lead users who often created their own solutions to pressing challenges. The people closest to the work often see patterns in what succeeds or fails long before they show up in our formal metrics, she told Axel. We need to systematically capture that grassroots wisdom. Working with the knowledge management team, Freya developed a lead user wisdom network specifically designed to tap into this distributed decision intelligence. The approach included several key elements. Decision story collection created structured opportunities for employees to share their observations about why certain innovations succeeded or failed in their domains. These narratives often contained insights about contextual factors that weren't captured in formal data. Pattern community forums brought together employees from different functions who had observed similar outcomes in different contexts. These cross-functional exchanges revealed common factors that influenced success or failure across domains. Implementation ethnography involved observing how innovations were actually used in practice, often revealing gaps between intended and actual usage that explained unexpected outcomes. Frontline prediction challenges invited employees to forecast which current initiatives would succeed or fail and why, tapping into their implicit understanding of what worked in their contexts. Lead users don't just create innovations. They often have the most accurate read on which innovations will work in their environment, Freya explained to the portfolio committee. They've seen numerous formal and informal solutions come and go, and they've developed pattern recognition that our systems don't capture.
This approach transformed how the organization thought about innovation intelligence, recognizing that wisdom about what worked and why was distributed throughout the organization, not concentrated in formal innovation or analytics functions. Some of our best decision insights come from people who would never describe themselves as innovation experts, Axel observed. They just know their domain deeply enough to see what will and won't work there.

SPEAKER_03:

The second justice, reconsideration.

SPEAKER_00:

As they refined the table of justice approach, Freya and Axel introduced a powerful new component: structured reconsideration. Every quarter they identified three to five killed initiatives that warranted a fresh look based on changed circumstances, new capabilities, or shifting strategic priorities. These weren't casual reviews, they were rigorous reassessments using current data and perspectives. The selection process for reconsideration wasn't arbitrary. The team used a systematic approach that combined both data-driven triggers and human judgment. Projects became candidates for reconsideration when there were significant changes in market conditions or competitive landscape, internal technical capabilities or infrastructure, strategic priorities or risk tolerances, resource availability or constraints, stakeholder support or sponsorship. Additionally, they monitored for weak signals like informal employee comments about missed opportunities, customer requests that echoed previously abandoned concepts, or patterns in competitor moves that validated earlier internal hypotheses. This process had three possible outcomes. One, confirmation: the original kill decision was validated with fresh analysis. Two, revival: the initiative was reactivated with appropriate adjustments. Three, harvest: specific elements were extracted for use in other initiatives. The revival of the internal mobility platform proved particularly significant. Six months earlier, it had been killed due to technical integration challenges and lukewarm stakeholder support. But the landscape had changed. The company had since implemented a new data architecture that resolved the technical barriers, and a recent talent retention crisis had created urgent demand for internal career development solutions. The reassessment showed that not only were the original barriers removed, but the strategic value had substantially increased.
The platform was revived with executive sponsorship and went on to become one of the most successful HR innovations in the company's history. This wasn't an isolated case. Roughly 20% of reconsidered initiatives qualified for some form of revival, while another 30% yielded valuable components for other projects. Even those that remained killed provided valuable insight when the reconsideration process confirmed the original decision with fresh analysis.

Accelerating value realization through retrospective insight.

Perhaps the most powerful impact of the table of justice was how it accelerated the pathway from expected to realized value. By analyzing patterns in past innovation outcomes, the team gained insights that directly improved their ability to deliver value from current initiatives. The table doesn't just help us understand what happened in the past, Freya explained to the executive team. It helps us identify the specific factors that either enable or obstruct value realization across different types of innovation. This connection between retrospective analysis and value delivery took several forms. Implementation pattern recognition identified common barriers that prevented otherwise promising innovations from delivering their expected value. By recognizing these patterns early in current initiatives, teams could proactively address potential obstacles before they affected outcomes. Adoption acceleration insights revealed which approaches to change management and user enablement most effectively supported value capture across different contexts. These learnings were directly applied to current implementation strategies, significantly improving adoption rates. Value leakage detection highlighted where and how value was typically lost between concept and implementation. By addressing these common leakage points, the team was able to increase the percentage of expected value that was actually realized.
Scaling pattern analysis showed which types of innovations scaled successfully beyond their initial context and which tended to remain localized. This helped the team design scaling strategies that were matched to innovation characteristics, rather than applying one size fits all approaches. The table of justice has become one of our most powerful tools for value realization, Axel observed. It helps us anticipate and address the specific barriers that might prevent current initiatives from delivering their full potential. This application transformed retrospective analysis from a learning exercise into a direct contributor to value creation, using insights from past patterns to increase the success rate and impact of current innovations.

SPEAKER_03:

From metrics to movement.

SPEAKER_00:

How so? Axel asked. We evaluate initiatives based on whether they succeeded or failed, whether they delivered value or didn't. But the real metric that predicts long-term innovation performance isn't success rate. It's learning mobility, how effectively learning moves across the organization. This insight led to a new dimension of analysis in the table of justice. Beyond tracking individual initiative outcomes, they began measuring how effectively learning from each initiative influenced subsequent decisions and approaches. They created what they called learning mobility mapping, a visual representation of how insights flowed through the organization. The patterns were revealing. Some initiatives that had failed by traditional measures had generated learning that influenced numerous subsequent projects, creating far more cumulative value than many successful initiatives whose learning remained isolated. The most valuable initiatives aren't necessarily those that succeed directly, Freya told her team. They're the ones that generate learning that improves our overall system performance. This perspective transformed how the organization thought about innovation value, from the direct impact of individual initiatives to their contribution to the collective intelligence that shaped future outcomes. It created a much more nuanced view of success and failure, where an initiative's ultimate value was determined not just by its immediate results, but by its influence on the organization's evolving capability. In sports analytics, they've learned that some statistics predict future performance better than others, Axel noted. We're discovering the same thing about innovation metrics. Learning mobility may be a better predictor of long-term value creation than immediate success rates.
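The learning mobility idea described here can be sketched computationally. Below is a hypothetical illustration (the initiative names and influence links are invented, not from the book): if each initiative records which later initiatives its insights fed into, learning mobility can be approximated as transitive reach in that influence graph. A "failed" pilot whose lessons propagate widely can then outscore a "successful" initiative whose learning stayed isolated.

```python
from collections import deque

def learning_mobility(influences: dict) -> dict:
    """For each initiative, count how many later initiatives its learning
    reached, directly or indirectly (breadth-first transitive reach)."""
    scores = {}
    for start in influences:
        seen, queue = set(), deque(influences.get(start, []))
        while queue:
            node = queue.popleft()
            if node in seen:
                continue
            seen.add(node)
            queue.extend(influences.get(node, []))
        scores[start] = len(seen)
    return scores

# Hypothetical influence graph: initiative -> initiatives its insights shaped.
influences = {
    "failed_pilot": ["project_b", "project_c"],  # failed, but learning spread
    "project_b": ["project_d"],
    "project_c": [],
    "project_d": [],
    "successful_but_isolated": [],               # succeeded, learning stayed local
}
scores = learning_mobility(influences)
print(scores)  # failed_pilot reaches 3 downstream projects; the isolated success, 0
```

Under this toy measure, the failed pilot scores 3 and the isolated success scores 0, mirroring the point that immediate outcomes and cumulative learning value can diverge.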

SPEAKER_03:

Visualization in action.

SPEAKER_00:

Two AI projects were competing for resources, with passionate advocates on both sides. On paper, both had similar XV scores, around $700,000. But when Freya displayed the XV momentum chart, the pattern was striking. One project showed a steady upward trajectory with consistent confidence growth over three months. The other had flatlined despite additional investment. When she overlaid the fit radars, another distinction emerged: the first project had a well-balanced profile, while the second showed significant weakness in company advantage. The data's telling a clear story, she told the team. One project is gaining momentum and fits our strategic profile. The other has stalled and sits outside our core advantages. What could have been a contentious, opinion-driven debate became a straightforward, evidence-based decision. They redirected resources to the first project and explored partnership options for the second.

Like the data-driven football clubs that had become known as moneyball teams, Freya's approach involved looking beyond conventional measurements to find hidden value. These clubs, Midtjylland in Denmark, and Brighton and Brentford in England, had transformed their performance by recruiting undervalued talent and playing styles shaped by data analysis, enabling them to compete with teams having far greater resources. In the same way, the table of justice gave Freya's team the ability to spot innovation value that traditional metrics missed. It helped them identify initiatives that were creating the equivalent of high-quality chances, strong evidence patterns that indicated future success, even when immediate results hadn't materialized. As one team member later noted, I came in skeptical of both projects, but seeing the patterns visually, I got it immediately. The decision wasn't just defensible, it was obvious.
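The momentum comparison above can be made concrete with a small sketch. The numbers and the slope-based "momentum" metric below are illustrative assumptions, not the book's actual method: given a series of confidence scores from periodic reviews, the least-squares slope distinguishes a project that is steadily gaining evidence from one that has flatlined, even when both show the same headline XV.

```python
def momentum(confidence_history: list) -> float:
    """Least-squares slope of confidence over equally spaced reviews.
    Positive slope = gaining evidential momentum; near zero = flatlined."""
    n = len(confidence_history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(confidence_history) / n
    num = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(xs, confidence_history))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Two hypothetical projects with the same ~$700k XV but different trajectories.
project_a = [0.35, 0.45, 0.55, 0.65]   # steady confidence growth
project_b = [0.50, 0.51, 0.49, 0.50]   # flatlined despite added investment
print(momentum(project_a))  # ~0.10 confidence gained per review
print(momentum(project_b))  # ~0, no momentum
```

The same static score, viewed as a trajectory, yields very different decisions, which is exactly the shift the momentum chart enabled.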

SPEAKER_03:

Beyond innovation: mainstream acceptance.

SPEAKER_00:

A week later, Freya presented the table in an executive meeting. She didn't use it to justify the past; she used it to demonstrate culture. These aren't failures, she said. They're decisions, and our ability to revisit them, learn from them, and reuse what we've learned, that's the asset. David, the CFO, nodded slowly. So this is like a post-match analysis. Freya smiled. Exactly. Except the match never really ends.

What had begun as an internal tool was gaining broader recognition. Just as Midtjylland's approach had eventually earned a regular feature in one of Denmark's leading newspapers, a league table based on underlying performance called the Table of Justice, Freya's methodology began spreading beyond her team. The finance department adopted confidence scoring for major investment decisions. The product team implemented structured retrospectives based on the table of justice model. The executive committee began asking for confidence trajectories alongside projected outcomes.

As with the forward-thinking football clubs that had embraced analytics, there was occasional skepticism about Freya's approach. Some viewed the table of justice as a touch overblown, as though it had been lifted straight from a comic book, a fancy way of dressing up subjective judgments in the sophistication of data and analytics. But Freya brushed such criticism aside. Just as Ankersen had stood firm in his data-driven defense of his coach at Midtjylland, she knew that the insights from their retrospective analysis were too valuable to ignore, even if they sometimes contradicted conventional wisdom about which initiatives had succeeded or failed. The system wasn't about creating a parallel reality that conveniently justified past decisions. It was about seeing more deeply into the patterns that predicted future outcomes, recognizing that some apparent failures contained the seeds of future success, while some apparent victories masked fundamental weaknesses. 
What Freya realized was that innovation, like football, doesn't immediately embrace new methods. As the book described, football is rather more brutal. It has a zero-sum approach. Vindication of any idea requires undisputed success, not relative achievement. It is a sport that wants to see medals before it buys into a theory. Innovation often suffered from the same conservative tendencies, but just as Midtjylland, Brentford, and Brighton had demonstrated the value of analytics in football, Freya's team was proving that a more sophisticated approach to innovation assessment could deliver real advantages. And like those pioneering clubs, they were determined to find the edges others had overlooked. Because in innovation, just like in football, you can't control the bounce of the ball, but you can control how you play. And the teams that learn fastest, not just from wins but from all experiences, are the ones that get better, not by luck, by design.

SPEAKER_03:

The systemic contribution to value realization.

SPEAKER_00:

As the table of justice became an integrated part of the organization's innovation approach, its contribution to value realization became increasingly clear. By systematically analyzing patterns across initiatives, the team gained insights that directly improved their ability to deliver value from innovation investments. The table isn't just a retrospective learning tool, Freya explained to the executive team. It's a forward-looking value acceleration mechanism. This connection between retrospective analysis and future value creation manifested in several key ways. Implementation pathway optimization drew on patterns from past initiatives to design more effective approaches to putting new innovations into practice. By understanding common implementation barriers and success factors, teams could develop more realistic and effective plans for moving from concept to realized value. Resource allocation refinement used historical patterns to improve how resources were distributed across the portfolio. Understanding which types of initiatives typically required more support at specific stages helped prevent both under-resourcing of promising opportunities and overinvestment in concepts unlikely to deliver value. Timing sensitivity enhancement improved the organization's ability to determine when an innovation was truly ready for implementation. Historical patterns revealed that premature scaling was a common cause of value erosion, while delayed implementation often resulted in missed market opportunities. Value capture strategy development drew on past experiences to create more effective approaches to ensuring innovations delivered their intended value. This included improved methods for measurement, adoption support, and ongoing optimization. The table of justice has become a critical part of our value realization system, Axel observed. 
It helps us anticipate and address the specific factors that have historically prevented innovations from delivering their full potential. This application transformed retrospective analysis from a purely reflective practice to an active contributor to value creation, using insights from past patterns to increase both the likelihood and magnitude of value realization from current and future innovations.

SPEAKER_03:

The integration into strategic decision making.

SPEAKER_00:

Perhaps the most significant impact of the table of justice was how it transformed strategic decision making beyond the innovation function. As the approach gained credibility, its insights began influencing broader business decisions about market entry, capability development, and resource allocation. What started as a tool for innovation retrospectives has become a strategic decision support system, Freya noted to the Governance Board. The patterns we've identified help inform not just how we innovate, but where and when we invest more broadly. This strategic integration took several forms. Investment domain prioritization drew on historical innovation outcomes to identify which areas consistently showed higher success rates or learning value. This helped shape strategic decisions about which markets, technologies, or capability areas warranted increased investment. Timing signal recognition used patterns from past initiatives to improve the organization's ability to identify when market conditions were right for specific types of innovation. This prevented both premature commitments to emerging opportunities and delayed responses to clear market signals. Capability gap identification leveraged historical innovation outcomes to pinpoint specific organizational capabilities that either enabled or constrained success. This informed strategic investment in capability development beyond the innovation function itself. Risk calibration used documented patterns to help executive teams more accurately assess the risks associated with different strategic options. Rather than relying on general risk perceptions, they could draw on specific historical patterns showing where similar bets had succeeded or failed. This broader strategic influence marked a significant evolution in how innovation intelligence contributed to organizational decision making. 
The table of justice had moved from an innovation-specific tool to a resource that informed leadership thinking about strategic direction and investment priorities. In the most sophisticated sports organizations, analytics doesn't just inform tactical decisions, it shapes strategic direction, Axel observed. We're seeing the same evolution here, where our innovation intelligence is increasingly influencing strategic choices about where and how the business competes.

SPEAKER_03:

The table of justice as a living system.

SPEAKER_00:

As Freya reflected on the evolution of the table of justice, she recognized that its greatest strength wasn't in any particular metric or framework, but in its capacity to evolve with the organization's changing needs and circumstances. The table isn't a fixed methodology, she explained to her team. It's a living system that continuously adapts based on what we're learning about our own decision patterns. This adaptive quality manifested in several ways. Metric evolution regularly refined which measures proved most predictive of future performance. Some indicators that initially seemed promising were dropped as their predictive value declined, while new measures were incorporated as their significance became apparent. Context sensitivity adapted how outcomes were interpreted based on changing market conditions, organizational priorities, or capability landscapes. The same outcome might be evaluated differently in different strategic contexts. Cross-functional expansion extended the approach to decisions beyond the core innovation portfolio, including strategic partnerships, capability investments, and market entry choices. The fundamental principles remained consistent, but the specific applications evolved to suit different decision contexts. Learning cycle compression progressively shortened the time between experience, analysis, insight generation, and application. What had initially been quarterly retrospectives evolved into more continuous learning systems that identified patterns and generated insights in near real time. The table of justice isn't valuable because it gives us a perfect methodology, Freya told the governance board. It's valuable because it helps us get progressively better at learning from our own experience. This perspective positioned the table not as a fixed analytics approach, but as a foundational capability for organizational adaptation and evolution. 
A system that helped the business not just evaluate past decisions but continuously improve its decision-making capacity over time. As Freya often noted, the greatest competitive advantage isn't knowing the right answers, it's getting better at asking the right questions.

If you go to xvbook.com, you will see a visual of the table of justice, Freya's Innovation Performance League table for 2024 to 2025. Key insights from it: the internal mobility platform shows the highest absolute XV value, but is currently paused despite this exceptional potential. The supplier integration API shows a dramatic variance of seven positions, performing well below what its fundamentals would predict. Digital documentation is performing at 150% of expected value despite being predicted to finish last. The green energy initiative was likely terminated prematurely, with an expected position five places higher than its actual rank. Predictive maintenance and the HR self-service portal are delivering significantly more value than their expected positions would suggest.

Critical intervention recommendations: one, an urgent revival assessment for the internal mobility platform, the highest-potential value in the portfolio but currently paused. Two, an emergency intervention protocol for the supplier integration API, given catastrophic underperformance despite strong potential. Three, an expansion strategy for digital documentation, an opportunity to scale this exceptionally efficient initiative. Four, a reconsideration analysis for the green energy initiative, given significant evidence of premature termination. Five, pattern analysis across all terminated projects to identify systemic decision biases.

Root cause investigation priorities: Why is the supplier integration API failing despite excellent fundamentals? What organizational factors led to the pausing of the highest-potential initiative? What can be learned from digital documentation's exceptional efficiency? What decision process led to the termination of the green energy initiative despite its potential? 
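The league table's core diagnostic, the gap between expected and actual position, can be sketched in a few lines. The positions below are illustrative stand-ins chosen only to echo the narrative (a seven-place shortfall for the supplier integration API, an over-delivering digital documentation initiative); they are not the book's actual table.

```python
def rank_variance(table: dict) -> dict:
    """variance = expected position - actual position.
    Positive -> over-delivering vs. prediction; negative -> under-delivering."""
    return {name: expected - actual for name, (expected, actual) in table.items()}

# Hypothetical (expected_position, actual_position) pairs, 1 = top of table.
table = {
    "Supplier Integration API": (2, 9),   # far below what fundamentals predict
    "Digital Documentation": (10, 4),     # predicted near last, over-delivering
    "Green Energy Initiative": (3, 8),    # possibly terminated prematurely
}
variances = rank_variance(table)
flags = {name: ("intervene" if v <= -5 else "scale" if v >= 5 else "monitor")
         for name, v in variances.items()}
print(variances)  # Supplier Integration API: -7, a seven-position variance
print(flags)
```

A simple threshold on the variance then reproduces the intervention logic: large negative gaps trigger root-cause investigation, large positive gaps flag candidates for scaling.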
TL;DR: Inspired by sports analytics, Freya and Axel create the Table of Justice, a system for retrospectively analysing innovation decisions to improve judgment over time. This approach connects initiatives back to the original challenges they addressed, evaluates not just outcomes but decision quality, and uses AI to identify deeper patterns in what works. By incorporating cross-industry benchmarking, lead user wisdom, and structured reconsideration of past decisions, the organization transforms how it learns from both successes and failures. This retrospective intelligence directly accelerates value realization by helping teams anticipate and address the specific barriers that have historically prevented innovations from delivering their full potential.