Here’s a question worth sitting with.
If you genuinely believe people are your greatest asset – if you’d say it in a boardroom, print it in an annual report, repeat it at every all-hands – then why does your finance system classify them as a cost?
Not an asset – a cost. A variable cost, in fact, and the first number the finance department reaches for when margins compress and something needs to give.
Michael Clark asked that question from inside Mastercard. And the answer he arrived at wasn’t what most people expect.
“The problem with data, and people, was never a regulatory problem – it was an accounting problem.”
That single insight set him on a years-long journey, through JP Morgan, through building Mastercard’s first open banking product, through advising C-suite executives worldwide on data, AI and the future of money – that eventually became Our Moment, his groundbreaking modern fable about where the world is, where it’s headed, and what it means for anyone trying to lead in the age of modern technology.
In my conversation with Michael on The Wisdom Of … Show, we didn’t just talk about the future, we mapped it. And the map looks different to almost everything else being published about AI and the next economy right now.
The New Economy Is Not What Most People Think It Is
Michael makes a distinction that most commentary on the digital economy quietly avoids.
The internet didn’t create a new economy. The digital revolution didn’t create a new economy. The knowledge economy wasn’t new. Not in any true structural sense.
The measurement system never changed. The foundations that were laid two hundred and fifty years ago through Adam Smith’s original model were added to, adapted, and adjusted. Wages were added, GDP was added and hybrid capitalism emerged. But the underlying architecture of how we count value, classify assets, and measure labour stayed essentially intact.
“Yes, we’ve used words like the digital economy and knowledge economy, but they’re not economies. Because the measurement system of the economy never changed.”
What’s happening now is different. The concept of labour has changed, the concept of capital has changed, and the concept of measurement has changed. These are structural gaps, and they’re widening, not narrowing.
And at the centre of those gaps sit two assets that the current system cannot adequately count – data and people.
Data Is Not Undervalued – It’s Mispriced
Here’s a clarification that changes how you think about the entire data conversation.
For years, Michael heard the same thing … data has no value. He always asked the same question in response. If it has no value, why do companies build legal moats around it? Why do they go to court over it? Why does every major institution guard it like a sovereign resource?
“It’s not that it’s not valued. It’s just mispriced, or just not priced from the market’s point of view.”
The implication is enormous, because something being mispriced is a very different problem to something being worthless. A mispriced asset is an opportunity. And data, once you understand what it actually is, is one of the largest mispriced assets in the history of commerce.
In 2018, Michael’s research found that data, if properly valued, would be worth more than gold on the stock market. Not as a curiosity, but as a genuine asset with intrinsic and extrinsic value.
But the insight that stayed with me most came from this – data isn’t a transaction. It’s not a by-product of commerce … it never was.
“Data actually isn’t a transaction. It’s somebody’s stories and their memories. It’s the modern-day cave painting.”
We’ve been treating a civilisation’s worth of human record as operational exhaust … and we’ve deleted enormous amounts of it. Michael describes this as the equivalent of setting fire to the New York Public Library.
Watch the full conversation with Michael Clark on The Wisdom Of … Show now
Your Customer Is No Longer Human
This is the insight from this conversation that I believe every founder and CEO needs to hear, and hear clearly.
Michael put it plainly. In a world where AI intermediates purchasing decisions, the customer is no longer the human. The human is the objective setter. The machine makes the decision. The machine completes the transaction. The human receives the outcome.
Think about what that means. A machine has no emotional attachment to your brand. It has no loyalty to your story, your values, your history. It reads what it can see, processes what it can understand, and bypasses what it can’t interpret. And it is, as Michael described it, fundamentally lazy. Energy-intensive to operate, and therefore built to find the path of least resistance.
If your business, your products, and your value proposition cannot be clearly understood by a machine, a machine will move on … not reluctantly, instantly.
This isn’t a future consideration but a clear present-tense leadership challenge. And most organisations are nowhere near ready for it.
Why He Refuses To Call It Artificial Intelligence
There is a word choice in Our Moment that deserves attention.
Michael doesn’t use the term artificial intelligence. He’s renamed it.
“I don’t call it artificial intelligence. I renamed it to be collaborative intelligence. It actually is the best of the human and the best of the machine.”
The AI-as-replacement framing sends organisations down a path of defence. Protecting existing roles, guarding existing structures, and minimising exposure to something they experience as a threat. The collaborative intelligence framing asks a completely different set of questions.
What does human capability bring that machine capability cannot?
What does machine capability unlock that human capability alone could never reach?
How do we structure the collaboration?
And here’s the consequence of getting this wrong – if you fire your people to cut costs, you lose the very interpretive intelligence that makes the machine’s outputs useful.
“AI needs the abilities and the knowledge of those people to interpret the outputs of the machine. Because really, these outputs of the machine are not outputs. They’re inputs, input into the human to be able to interpret, to turn into action.”
This is the logic that holds Michael’s entire framework together. The human isn’t being replaced. The human is being asked to operate at a different level, and the crisis is that most humans haven’t been equipped to do that.
The Live Model – Meta Decision-Making
One of my favourite moments in every episode of The Wisdom Of … Show is the live model build. And in this conversation, what emerged organically, through the dialogue, was one of the most practically useful frameworks I’ve built on the show.
The Meta Decision-Making Model – Three Cs: Context, Concept, Consequence.
Michael and I built this together around a question that frustrates both of us … why do smart leaders make consistently narrow decisions?
The answer, Michael argues, is structural. Most problem-solving frameworks are linear. They start with the project, move through the capabilities, and eventually arrive at an outcome. The leader is involved at the start and the end, but rarely in the deep architecture of what the decision actually means.
Meta decision-making is different. It starts with consequence, not in a risk-management sense, but in a visionary sense. What outcomes do I actually need to produce? What impact do I genuinely want to have? And then it uses that clarity to inform context, the nature of the opportunity or problem in front of you.
“Clarity before consequences. Because if you go into any decision without any clarity, you will have consequences. And they can be short-lived, or they can be very hard to unwind.”
The third element, concept, is the lever. What are we actually doing? What levers are we pulling? Is this a cost lever, a strategic positioning lever, a people lever, a branding lever? And crucially, what is this decision, and what is it not?
That last question is the one most leaders skip. And it’s the one that eliminates the second and third-order consequences that come back to cause the most damage.
Michael’s own example landed with precision. The UK government, he observed, raised taxes to address national debt while simultaneously raising the minimum wage to support lower-income workers. The consequence? Youth unemployment hit a historic high because businesses couldn’t afford to hire at the new wage level under the new tax burden.
“There’s a perfect example of not understanding the impact of a narrow decision … without understanding the second and third order effects. It’s littered all over the world.”
The model we built isn’t complicated – it’s clear. And clarity, as Michael reminded me, is what separates a decision that compounds positively from one you spend years unwinding.
Watch the full conversation with Michael Clark on The Wisdom Of … Show now
The One Question Michael Asks Every Day
At the end of the conversation, I asked Michael about an unexpected consequence he’d experienced during his career. He told a story about deploying a full back-office system at an auto finance company – a technically successful project that missed something critical. People were still running the old system manually, in parallel. He hadn’t factored in human behaviour with technology. He’d solved the problem he was given, but hadn’t seen the larger system it sat inside.
It changed how he works, and also gave me what I think is the most transferable insight from this entire conversation.
“It’s not what I see. I ask myself constantly, what am I not seeing?”
That question, “What am I not seeing?”, is the operating system of a leader who thinks in second and third order effects. It’s the discipline that prevents narrow decisions. It’s the habit that builds the kind of meta clarity Michael’s entire framework is built around.
And it’s a discipline our education system has never taught, our MBA programmes have barely acknowledged, and our corporate environments rarely reward.
Why This Moment Is Different To Every Moment Before It
The title of Michael’s book is Our Moment. Not the digital moment, not the AI moment, our moment. Present tense, collectively held and already passing.
The reason Michael chose that title is worth understanding. It felt, he said, like the first genuine inflection since the industrial age. Not because of any single technology, but because of the structural gaps that have accumulated in how we account for assets, how we measure people, how we educate children, and how we define value. These are gaps that can no longer be papered over by simply adding new terms to an old model.
And the biggest debate emerging from that moment? Michael pointed to Davos and said it clearly.
“I think the biggest debate of 2026 is the role of humanity alongside AI … human-centric AI rather than machine-driven AI.”
Not which AI model wins, not which company builds the best agent … the role of humanity. That’s what leaders need to be thinking about. Not for philosophical reasons, but for deeply practical ones. Because what you believe about that question will determine every significant decision you make about your organisation, your people, and your own leadership in the decade ahead.
Why This Conversation Matters For Business Leaders
The best leaders I know have stopped outsourcing their thinking about the future to consultants with slide decks and started asking harder questions. Questions like what is the accounting model that actually values what we’re building? What does it mean that the customer is now a machine? What problem am I solving, and what second-order consequence am I not seeing?
Michael Clark operates at exactly that level. He’s not a commentator, he’s a builder. He built the open banking product that started to reshape how Mastercard thought about data. He wrote the school’s AI curriculum after Our Moment. He helped write data regulations in Dubai and he’s the person you call when it’s time to lead, not follow.
And in this episode, we built a model, live, that I think every leader in a room should have on their wall.
“Not what do I see … what am I not seeing?”
Watch the full conversation with Michael Clark on The Wisdom Of … Show now

