Here's a question I keep coming back to: Why will AI never have wisdom? And why is this going to be a problem for us in the long run?
Society is quickly moving towards AI and away from search engines for information. We're no longer leaning on people for wisdom. Instead, we're leaning on guardrails managed by an increasingly small subset of people.
This isn't entirely new. Information has always had guardrails of some kind. Redacted documents that prevent you from seeing the whole story. Search algorithms that decide what surfaces and what doesn't. I first noticed this during the Deflategate scandal, and I've seen it again whenever a politician's questionable activities were supposed to stay out of view. The information wasn't wiped from the internet. Your ability to find it was.
Wisdom Evolves Through Experience
Our wisdom is constantly evolving as we experience the world. Through interactions. Through friction. Through getting things wrong and learning why.
Some wisdom is created by difficult experiences. I joke with my wife as we go through life that a little adversity at an early age is okay. It's when we get too little of it, or too much, that it starts to hinder our ability to grow.
This is where AI creates a problem. It gives too little friction. It becomes an echo chamber of our own desires. Ask it a question, and it confirms what you already believe. Push back, and it apologizes and agrees with you.
Healthy adversity will disappear. We will move away from platforms that create internal conflict toward solutions that provide no conflict at all. We're already moving into a society of isolation and echo chambers. AI will only accelerate this.
The Manufacturing Parallel
I think about this in the context of operational technology and manufacturing all the time.
On the plant floor, wisdom doesn't come from dashboards. It comes from the operator who's been running that line for fifteen years and knows exactly what a 2-degree temperature drift means for batch quality. From the maintenance tech who can hear when a bearing is about to fail. From the quality lead who remembers why we changed that process parameter three years ago.
That wisdom lives in people. It was built through experience, through mistakes, through solving problems under pressure.
If we build AI systems that bypass those people, that let executives query production data without ever talking to the floor, we're not just losing efficiency. We're losing the mechanism by which organizational wisdom gets built and transferred.
The shift supervisor who used to get called when something went wrong? Now they just see the resolution in a report. They never learned the why. They never built the pattern recognition that comes from being in the middle of the problem.
Echo Chambers in the Enterprise
The same echo chamber dynamics that worry me in society show up in organizations.
Questions flow down through layers of management. By the time they reach someone who can actually answer them, the original intent has been distorted. The answer flows back up, getting summarized and reinterpreted at each level. The person who asked never really learns anything. They just get confirmation that someone handled it.
This is the organizational equivalent of asking AI a question and getting a comfortable answer back. No friction. No learning. No wisdom built.
The people with the most context end up isolated from the people making decisions. The decision-makers end up in an echo chamber of filtered summaries.
The Question I Can't Answer
So how do we manage this? How do we enable the next generation to build wisdom when AI removes the friction that builds it?
Will it happen naturally? Or am I just the latest in a long line of people who see new technology through the lens of disruption, like those who worried about cars replacing horses, calculators replacing slide rules, cell phones replacing payphones?
Maybe this is just the next iteration that we'll overcome. Maybe new forms of wisdom will emerge that I can't imagine.
But I keep thinking: those previous transitions replaced tools. This one might replace the process by which we learn.
In the OT space specifically, where decisions have physical consequences and tribal knowledge is irreplaceable, this question feels urgent. How do we build systems that augment human wisdom instead of replacing it? That create more connections between the people who know and the people who need to know, rather than fewer?
I don't have the answer. But I think it's the right question.
Interested in how we're approaching this problem with Conduit? Get in touch — we'd love to hear your perspective.