
Clarity Is So Underrated
Clarity Is the Most Underrated Competitive Advantage - and Almost Nobody Treats It That Way
TL;DR
99% of change & transformation programs could use more of one thing, and that one thing has a positive cascading effect downstream: clarity. Clarity is treated as something that arrives with time, belongs to someone else, and lives in communication plans. It doesn't, and it isn't. Clarity is a deliberate act that requires real tools and shared accountability. You don't need 100% clarity to move - but there are fundamentals no program should skip. KPIs and OKRs don't get you there. Meetings don't get you there. A structured, practical approach to defining outcomes, success, and guiding principles does. And the programs that get this right aren't luckier - they're more deliberate.
In this blog:
The myths we've built around clarity
The real cost of substandard clarity
Clarity is everyone's job and nobody's responsibility
How much clarity is enough - and what good actually looks like
Why this has been so hard to solve - until now
A different way to think about clarity
Something happens in organisations every single day that I've never quite gotten used to seeing.
An initiative kicks off. There is energy in the room. Leadership is aligned - or at least, they all nodded at the same slide. The program team is sharp, experienced, and ready to move. And somewhere in the first few weeks, quietly and without declaring it, the program starts building on assumptions.
Clarity - real, structured, grounded clarity - was never actually established. It was assumed.
By the time the gap becomes visible, the cost of going back is already enormous. And the uncomfortable truth is that this isn't a story about a program that went wrong. It's the story of how most programs start.
The Myths We've Built Around Clarity
Clarity has a reputation problem. Somewhere along the way, the profession decided that clarity was soft. That it was the job of communication plans, workshops and alignment sessions. That it would "evolve naturally" as the program matured.
These are myths, and they are expensive ones to hold onto.
Myth one: Clarity comes with time. (This is not an argument for waiting until you're 100% clear, either.) Time without structure produces drift, not clarity. What looks like emerging clarity is often just people getting tired of disagreeing and settling on language vague enough for everyone to accept. That is conflict avoidance with a shared deck - not the same thing.
Myth two: If leadership is aligned, we have clarity. Leadership alignment is necessary but nowhere near sufficient. Two executives can nod at the same outcome statement and be imagining completely different futures. Until clarity is tested - through questions, through use cases, through "what does this mean for how we build" - alignment is a social contract, not a strategic one.
Myth three: Clarity is about communication. Clarity is a definition problem, not a communication problem. You cannot communicate your way to a clear outcome. Communication is what you do once you have clarity. Running more workshops to close a gap that lives in an undefined outcome is like rearranging furniture in a house built on sand.
Myth four: More detail means more clarity. A two-hundred-page business case is not clarity. A strategy document full of carefully worded aspirations is not clarity. Length and clarity have almost no relationship to each other.
The Real Cost of Substandard Clarity
We talk about the cost of transformation failure in financial terms. Budget overruns. Timeline extensions. Scope creep. Those costs are real.
But the cost of unclear outcomes runs deeper than most post-mortems ever look: into the human experience of the people actually doing the work.
Teams build the wrong thing and call it done, because what "done" looked like was never clearly defined.
Change resistance spikes because people don't understand what they're being asked to move toward.
Leaders make conflicting decisions because each one is working from a slightly different interpretation of an outcome that was never precise enough to be unambiguous.
And then there is the practitioner cost - the one nobody writes the case study about.
The change lead spending their weekend reworking a plan because what leadership said on Monday meant something different by Friday.
The program manager sitting in a review explaining "changing requirements" when the real issue is that requirements were never anchored to anything solid enough to hold under pressure.
The consultant who did genuinely excellent work and watched it get measured against a success definition that shifted mid-flight.
Substandard clarity doesn't just slow programs down. It positions good people as executors of others' confusion rather than as shapers of direction.
Clarity Is Everyone's Job, and Nobody's Responsibility
In most organisations, clarity is treated as a diffuse responsibility. It belongs to the sponsor - until the sponsor is too busy. It belongs to the program director until the program director is too deep in delivery. It belongs to the change team - until the change team is told they don't have a seat at the strategy table. It belongs to the business - until the business says they hired the program team to figure this out.
So clarity is expected to emerge from the collective without anyone being accountable for producing it. When it doesn't - when the program runs into ambiguity that stalls decisions, creates rework, or sends teams in the wrong direction - there is no one person whose job it is to prevent that.
What I've observed across almost every program I've been part of is this: clarity is assumed at the start, diagnosed as missing in the middle, and mourned at the end. And the professionals caught in the middle of this dynamic typically do one of three things.
They work with what they have, making their best assumptions and hoping the gaps close as the program matures.
They wait to be told - deferring to leadership for direction that leadership is also waiting for someone else to provide.
Or they go through the motions of a discovery process, producing documents that look like clarity but are never stress-tested against the questions that would expose the gaps.
None of these is a bad response to a bad situation. But none of them produces what the program actually needs.
How Much Clarity Is Enough?
This is one of the most important questions in strategy execution work, and it almost never gets asked directly.
The answer is not 100%. Waiting for complete clarity before moving is paralysis dressed up as diligence. Programs operate in complex environments; some things genuinely cannot be known at the start, and iteration is real and necessary.
But that is not a licence to start with nothing.
Here is what good looks like across the fundamentals - and a question to sit with honestly for your current program.
1. An integrated outcome that covers all three lenses. A good outcome is not a business performance goal with a project name attached to it. It integrates what changes for the business, what changes for the customer, and what the organisation needs to sustain that change. All three together. Not separately, not sequentially.
Reflective question: Can you describe your outcome in a way that tells you what success looks like for the business, for the customer, and for the organisation's internal capability - in one coherent statement?
2. Success measures that go beyond financials. A good definition of success goes beyond revenue, costs, and delivery timelines. It includes capability building, customer experience, cultural and behavioural change, agility, and the organisation's ability to sustain what has been built. Programs measured only on financials routinely deliver the numbers and miss the point entirely.
Reflective question: If your program hit every financial KPI but people weren't using the solution, capabilities hadn't shifted, and the change team had to keep propping it up - would you call that success?
3. Design principles that guide delivery decisions. Good clarity includes a set of guiding principles - explicit, agreed, tested - that tell delivery teams how to make trade-off decisions without escalating every ambiguous question back up the chain. These are not value statements. They are practical guardrails: what this program will always prioritise, and what it will not compromise on.
Reflective question: When your delivery team faces a design trade-off today, do they have something specific enough to guide the decision - or do they default to the loudest stakeholder in the room?
4. A clear line of sight from outcome to scope. Good clarity tells you what is in scope and - just as importantly - what is not. An outcome that cannot be used to defend a scope decision is not clear enough. If anyone can add work to the program without testing it against the outcome first, the outcome is not doing its job.
Reflective question: In the last month, was any work added to your program without being explicitly tested against the stated outcome?
5. A shared understanding specific enough to be stress-tested. Good clarity means two people in different parts of the program, asked independently what the program is trying to achieve, will give you answers that are substantively the same. Not word-for-word identical - but heading toward the same future, building on the same foundation.
Reflective question: If you asked three people in your program right now - without warning, without the deck in front of them - what this program is trying to achieve for the business and for the customer, would their answers be compatible?
Why This Has Been So Hard to Solve - Until Now
For as long as I've been in this space, clarity has been treated as a communication challenge. The prescription has always been the same: more meetings, better workshops, moving information from one deck to another.
What I saw was a different problem. Clarity wasn't failing because of poor communication. It was failing because the profession lacked structured tools, a shared language, and a practical framework for deliberately producing it. It was a high-value problem sitting at the root of almost every strategy execution challenge I'd encountered - and almost nobody was treating it that way.
So I built the tools, the structure, and the framework - then evolved and applied them over the course of working with more than 17 transformation programs across 11 organisations.
I developed a set of frameworks and methods to do what the profession had been trying to do through meetings and decks. To move from the vague north star that everyone can nod at to a clear, integrated outcome that tells you exactly what you're building and why. To define success not just through financial KPIs but through a comprehensive, multi-dimensional picture that protects the program from being measured by the wrong things. To establish the design principles - the guiding rails - that help delivery teams make the right trade-off decisions without needing to escalate every ambiguous question back up the chain.
A Different Way to Think About Clarity
Clarity is not a hope. You cannot wish your way to a clear program outcome. You cannot meet your way to it either.
Clarity is not a hot potato - something to pass from the sponsor to the program director to the change team and back again, waiting for someone to do something with it.
And clarity is definitely not the CEO's job alone. It never was. It is a shared responsibility that requires shared accountability - named, structured, and built into how the program is run from the start.
What clarity is not - and this matters - is a set of KPIs. KPIs measure outcomes after the fact. They tell you how you did. They do not tell you what you are trying to achieve, for whom, or what needs to be true about the organisation for that achievement to hold. OKRs come closer, but they are still a measurement framework sitting on top of a definition gap that was never closed.
Real clarity comes before the metrics. It is what gives the metrics something worth measuring.
If you've read this far, you probably already know where your program's clarity gaps are. The question worth sitting with isn't whether clarity matters - it's who on your team is currently accountable for producing it, with what tools, and by when. If that answer isn't clear, that's where to start.
I'll leave you with three thoughts.
First, if you think you have clarity, think again. Even when a program is sitting at a good level of clarity, there is almost always a greater level available.
Second, organisations routinely spend significant time and money on programs, knowing that some will fall short. And yet, almost none of that budget goes toward building the clarity that would prevent the shortfall in the first place. Most organisations have more capacity to redo the work than to make the time to avoid the rework. That is a choice, and it is worth naming as one.
Third, if you are the leader or practitioner trying to create clarity because you already know what happens when it's left to chance, you also know that more meetings and more presentations moving information from one room to the next are not going to get you there. There is a much better way. Not just for the program, but for how you are seen as a leader and where your career goes from here.
Email me and tell me where you're at with clarity on your current program. I read every response.

