A rabbit hole prompted by a LinkedIn post I interacted with

The illusion of communication doesn’t just persist in the AI era — it accelerates.
“The single biggest problem in communication is the illusion that it has taken place”
George Bernard Shaw
Is the attribution to George Bernard Shaw correct?
To my surprise, when I asked the great AI etherverse whether this is correctly attributed to GBS, the short answer was "NO!" Welp, time to adjust a personal email signature of mine! Plot twist: ADAPT! While the phrase is widely attributed to Shaw, there is no solid evidence he ever said it. A daily reminder, as I age, of just how little I know.
“There is no definitive written record of him having said or written it in any of his published works or speeches.”
Thus, the Shaw attribution is a modern misattribution, likely because the sentiment feels like something he might have said. The earliest close match is usually traced to William H. Whyte, writing in Fortune magazine in 1950:
"The great enemy of communication is the illusion of it."
AI, illusion of communication, and the need for discernment
In the age of AI, the illusion of communication grows even stronger. Systems can produce fluent, confident responses that give a false sense of shared understanding. But as boundaries, constraints, and assumptions shift, the burden persists: humans remain required, an essential part of the loop, to bring wisdom, discipline, and discernment. In fact, that burden increases, not decreases. AI is a powerful amplifier, accelerant, enzyme, and catalyst, but without intentional clarity it can also amplify misunderstanding at unprecedented scale. Like the calculators and computers of old, the gate is proper inputs and prompts. Or as they say,
“Garbage in, garbage out!”
Regardless of its origin, the insight is more relevant today than ever—especially in the age of AI. We are entering a world where communication appears instantaneous: summaries, messages, meetings, briefs, and decisions are now being generated in seconds. And that speed creates a new illusion:
If something coherent was produced, communication must have occurred.
AI can amplify clarity, but it can also amplify misunderstanding at unprecedented scale. This is where human judgment becomes non‑automatable.
The real risk isn’t that communication fails. It’s that we believe it succeeded when it never actually happened.
Consider: Eliminate the Illusion of Understanding
Humans face a new communication risk: the illusion that alignment exists simply because information has been exchanged—verbally, digitally, or through AI-generated content. In reality, speed and fluency can mask misunderstanding.
Effective leadership communication requires intentionally verifying—not assuming—shared understanding.
To operationalize this principle effectively, those shaping and managing systems must consistently rely on grounded, wisdom‑based practices such as these:
Deliberate Clarity: Moving Beyond Polished Outputs
In an age where AI can generate articulate responses in seconds, humans must resist the temptation to treat fluency as accuracy and polish as understanding.
A well‑formatted message, a neatly bullet-pointed summary, or a concise restatement can give the false impression that clarity has been achieved. Clarity is not the artifact — it is the shared comprehension it produces.
True clarity arises from:
- dialogue
- questioning
- reciprocal explanation
- moments of pause
- and the co‑creation of understanding
We must remember:
Artifacts inform. Dialogue clarifies.
Verification Over Assumption: Replacing Hope With Evidence
Assuming alignment is one of the most pervasive silent risks inside any organization. Teams nod, agree, acknowledge—and then walk away with five different interpretations.
To eliminate the illusion of understanding, humans must normalize verification. Not as micromanagement, but as good leadership hygiene.
Simple questions can transform communication from assumption-driven to evidence-based:
- “What did you hear?”
- “What does this mean for you?”
- “What actions will you take next?”
These questions are not tests—they are tools. They reveal misalignment early, when it is cheap to correct rather than costly to repair. Alignment is not produced by repetition; it’s revealed through interpretation.
Context as a Leadership Responsibility
AI can surface and provide information, but humans provide meaning. Information is abundant—meaning is scarce. That distinction becomes more important the more powerful these systems become. AI excels at telling us what is happening. But humans must explain why it matters.
In a world where answers are instant and information is abundant, it’s easy to mistake density for depth, or fluency for understanding. AI can deliver facts, figures, summaries, and even well‑structured narratives—but it cannot determine what truly matters. That responsibility remains human. Meaning is constructed, not delivered. And humans sit at the center of that construction.
Humans must articulate the why behind the work, the constraints that shape decisions, the context in which priorities shift, and the boundaries that define what is acceptable, viable, or desirable.
When these elements move—and they often do—humans must communicate that movement explicitly. Unspoken shifts breed confusion. Clear articulation breeds alignment. Information alone doesn’t tell the whole story. Humans gather insight not just from data, but from:
- Body language
- Tone
- Hesitation
- Silence
- Tension
- Excitement
- Emotional undercurrents
Humans bridge this gap—not by saying more, but by ensuring what’s said is truly understood.
Joint Human–AI Sensemaking: Pairing Intelligence With Interpretation
AI accelerates communication, but it also accelerates the risk of subtle misalignment slipping by unnoticed. What once unfolded slowly in conversations now races through summaries, drafts, and decisions generated in seconds. Humans bring valuable discernment to this accelerated terrain, ensuring that speed never masquerades as understanding. It’s tempting to believe that efficiency is the same as clarity, but that’s the trap. When we move too fast, we risk out‑kicking our coverage, leaving our understanding lagging behind our output.
A system can do many impressive things. It can summarize a meeting with uncanny neatness. It can rearrange a plan into something that looks even better than what we originally drafted. It can stitch together a coherent narrative from fragments of inputs, giving the appearance of orderly thought. But beneath that polished surface, something essential is missing.
A machine cannot sense tension in the room—those quiet, charged moments where unspoken disagreement hums just beneath the surface. It cannot detect the hesitation embedded in a half‑finished sentence or catch the subtle misalignment tucked behind polite agreement (compulsory niceness). It cannot recognize fear, uncertainty, or the soft forms of resistance that reveal themselves only through tone or body language. Nor can it absorb cultural nuance or relational dynamics—those invisible threads that shape human connection and meaning.
And so humans become the sensemakers in this dual environment. We allow AI to lift the load: to draft, summarize, organize, and accelerate. But we keep our hands on the reins, applying discernment to ensure the outputs reflect reality, not merely language patterns. We read between the lines where the machine sees only the lines themselves. We notice the texture of interaction, the emotional undercurrent, the meaning behind the words. In doing so, we make sure that speed never outruns accuracy and that understanding never becomes an accidental casualty of efficiency.
This is the quiet art of leadership in the age of AI—moving quickly without losing depth, and letting technology assist without allowing it to define what is true.
(Don’t out-kick your coverage).
Wisdom, Discipline, and Discernment: Fine-tuning the Discipline of Human Competencies
- Wisdom to question easy answers, even when they sound right.
- Discipline to seek confirmation, not shortcuts: slow down enough to verify, refine, and realign.
- Discernment to identify and notice the nuance, context, and subtlety that algorithms flatten.
Humans are now pressured to interrogate their prompts, question interpretations, check for shared understanding with intention, and validate assumptions with clarity and purpose on all reasonable sides. Without this level of deliberate engagement, miscommunication scales like an avalanche, moving from a simple 1‑to‑1 misunderstanding to a 1‑to‑enterprise failure in milliseconds. The payload gathers density and momentum at astonishing speed, accelerating unchecked until heuristics and guardrails intervene to stabilize and direct it. My observation is that it behaves much like fire and water: we may learn to shape its direction and coexist with its power, but we cannot truly control its full intensity.
The Human’s Call: Move From Information to Meaning
The human challenge of our time is not the management of information—it is the stewardship of understanding.
So, let’s continue to sharpen our skills with discipline and long‑suffering in order to interpret, contextualize, synthesize, question, humanize, and ultimately unify information into a narrative people can trust and act on with confidence and clarity. The illusion of understanding is one of the most dangerous traps in modern systems precisely because it hides in plain sight. It feels like progress. It sounds like agreement. It looks like alignment. But appearances can be deceiving—especially when speed, fluency, and automation make misunderstanding look polished and complete.
What truly sets humans apart from AI is our ability to transform the appearance of communication into the reality of shared understanding. When we eliminate the illusion, that’s when real alignment—and real meaning, purpose, and growth—begin to flourish. And yet, this is where the hardest work begins: learning how to cultivate this garden with patience and intention, without letting our bias for execution or our emotional reactions drag us back into friction, haste, or avoidable conflict. The real challenge is tending the space between information and understanding—nurturing it with clarity, care, and courage so that what grows there is worthy of the people we lead and the futures we’re shaping.
Link to the post that caused this “follow the white rabbit” moment.