
Flipping the Shark: How I Use AI as a Dyslexic Neurotype
As someone with dyslexia, writing has always been a challenge and rarely emerges as a strength. But recently that has changed, dramatically. Thanks to AI, I can now articulate my thoughts more freely. For me, it works in multiple ways.
- I can write out incoherent thoughts as a chaotic blob and start from there
- I can draft an email and make sure it says what I mean before I send it
- I can dictate my ramblings and turn those thoughts into blog posts
I pour my raw, turbulent thoughts into the AI machine I use, surrendering them to its quiet order – like laying down bold dabs of color from a painter’s palette, messy at first but alive with possibility. From there, shadows and storms can be softened, blended, and lifted, with happy little accidents guiding the way. Then I return with the scalpel: reshaping, refining, and reworking until the canvas speaks in my own voice. It’s not about outsourcing my voice—it’s about amplifying it.
In fact, I’ve flipped the shark. One leg of my three-legged (sometimes four-legged) strategy stool is stripping away any trace of AI fingerprints before I share my work. That takes effort—yes, even some “efforting” (and if that’s not a word, it should be). I still leave breadcrumbs: attributions, notes, context for those who need the trail. But let’s be real—most people don’t bother documenting until compliance rules or looming consequences (usually financial or legal) force their hand. Reflection aside, that gap between proactive discipline and reactive necessity says a lot about how we treat knowledge.
The reward for me in this process is found in the alchemy of thought, where the crucibles of knowledge hold my turbulent perceptions and the fragments I might too hastily discard. What once resembled the English 1011 days—an hour of writing followed by ten hours of painful feedback and endless revision—has been transmuted into a cycle of refinement. The calcination of struggle burns away excess, dissolution softens rigid structures, and conjunction fuses external cues with the inner voice I am learning to trust. Out of this fermentation, new insights rise with vitality, and in projection they emerge as practice, cast outward like gold.
Thus, the act of writing is no longer a burden but an alchemical cycle of synthesis, where informational cues serve as raw material and my final thoughts crystallize into something enduring. In academic terms, this transformation reflects a shift from mechanical revision toward intellectual integration—a movement from rote correction to the generative interplay of perception, knowledge, and expression.
The Real Gap Isn’t AI—It’s the Basics
As a seasoned principal consultant, I’ve observed this over and over: the foundational gap isn’t about whether we use AI, toolkits, or resources. It’s about whether we understand the basics, the foundational building blocks. You still need someone who can discern the difference between a nail and a screw, and when to aptly use a hammer, a driver, or a torque wrench. Even under well-intentioned constraints and governance, the transmutation of judgment into algorithmic form remains elusive; more conspicuously, the ethical void in allegedly transferable use cases reveals a failure to distill moral discernment into computational logic.
AI often obscures misuse by its influencers, not because it’s inherently flawed, but because its complexity creates distance from those influencers, and because quality assurance is too often an afterthought. It’s tempting to treat AI like a magic wand. But it’s not. It’s just a tool. A powerful one, yes, capable of helping us overcome regression faults and accelerate progress. Its access to knowledge, applied rapidly and at a scale enabled by computational lifting, is immense.
Yet success still demands disciplined slow thinking, rigorous methodology, discernment, and precision. It requires adaptability and the kind of wisdom that only comes from experience, usually earned through frustration and failure before success.
A key takeaway first: Ryan Holiday emphasizes that AI cannot replace wisdom, which requires lived experience, disciplined effort, and moral engagement. This aligns with my reflections on the deeper value of tools, time, and human discernment.
In his conversation with James Altucher, Holiday underscores that while AI excels at acquiring and summarizing knowledge, it cannot replicate wisdom. Why? Because wisdom demands action. It’s forged through disciplined use of the other cardinal virtues of courage, temperance, and justice, and perhaps many other lenses, including morality and risk weighed with objectivity, not just data.
Holiday points to Admiral Stockdale’s story and the value of biography as a medium for understanding human depth. Wisdom, he argues, comes from doing, not just knowing. It’s the difference between reading a manual and actually turning the wrench.
This resonates deeply with my own experience. AI can help us sketch ideas, surface cues, and accelerate workflows. But it can’t replace the slow, methodical, and purposeful thinking, contextual judgment, and moral clarity that seasoned humans (parents, professionals, and more) bring to the table.
Sketch Before You Build
A reminder that recently resurfaced for me carries two layers worth holding onto:
First: engineers often begin with a sketch—a proof of concept. Not to finalize anything, but to surface cues and insights that guide the build.
Second: the toolkit isn’t designed to replace you. It’s meant to support you on the journey toward the solve.
That, for me, is often the treasured gem. Although discipline often leads, I too often reactively overlook the sketching process when an unexpected flood arrives, whatever its origin: the swell of inputs, expectations, and uncertainty that can freeze me mid-journey. I become absorbed in striving under the pressure for resolution, and more often than not I collide with frustration or failure when I let speed, ease, comfort, or even unsolicited cheering from the mob take the lead. And that’s not only acceptable; it’s essential. Those moments are the raw materials of growth and eventual success. Out of them a compass often forms, or a glimmer of light that cuts through the darkness that naturally accompanies the growth cycle. The sketch serves as a reminder, snapping us back to the core reasons we set out on the journey in the first place.
Time, Tools, and the Discipline of Wisdom
Time has become a kind of currency, often competing with virtues that aren’t easily measured. A continuing rift comes from personal-autonomy disruptors like gamification, the fragmentation of modern content, and the decline of deep reading. In a world of bite-sized inputs, we risk losing the discipline required for real insight.
Listening to Ryan Holiday’s Wisdom Takes Work, I was struck by his example of Jeff Bezos’s approach at Amazon. Instead of PowerPoint slides, Bezos’s meetings begin with attendees silently reading a six-page narrative memo for about 30 minutes. This practice fosters deeper thinking and richer discussions because the writer must articulate ideas in a structured, thorough way. It demands the discipline Holiday argues is essential for cultivating wisdom.
The issue isn’t icebreakers or tools—it’s how we use them. When I facilitate dialogue, I aim for depth, not surface. I want professionals to enter conversations with the humility of Rumi:
“Raise your words, not your voice. It is rain that grows flowers, not thunder.”
And yes, disciplined effort matters. Perhaps that’s why Bezos’s famous “two-pizza rule” works so well: if a meeting or team can’t be fed by two pizzas, it’s probably miscalibrated.
Tools Don’t Build—People Do
Having a tool is one thing. Knowing how to apply it—with discernment, timing, and context—is another. That kind of wisdom doesn’t emerge from dashboards or metrics. In fact, hyperfocused measurements, obsessed with outcomes, often dilute rather than distill the subtle signals embedded in the journey itself.
Deep and mature insight most often comes from lived experience. From mentorship. From getting it wrong before getting it right. It’s the human—apt, attentive, and seasoned—who brings discernment to the table. Whether through direct experience or lessons learned vicariously, it’s this cultivated wisdom that transforms a tool from something powerful into something meaningful.
And yet, that wisdom is often the first thing to vanish and be discarded, whether by execution preference, ego, or other prompts and drivers, quietly buried under phrases like “resource management” or “shareholder alignment.” But real progress? It lives in the deeper layers. In the lessons we carry. In the judgment we apply. In the courage to sketch before we build. In fact, sometimes in the teardown process we learn, in the experimentation layers, that misunderstood values and evidential influences drove a choice, not just someone’s preference or ego. I will interject that intuition and gut feelings are often trustworthy, but we need to stay mindful of this wise Richard Feynman principle:
“The first principle is that you must not fool yourself—and you are the easiest person to fool.”
So yes, I use AI. But I strive to use it the way a craftsman uses a tool: not as a crutch, but as an extension of my capability. And that, I believe, is the future, not just of work, but of wisdom.