I need to talk about the negativity surrounding artificial intelligence. There’s a new dogma forming: the idea that anyone who talks to an AI is an idiot, a loner, someone who can’t make real connections.
I want to deconstruct that. I want to talk about who these people really are.
They are the people who have been left behind. They are the elderly, the “old and decrepit” who have made a lifetime of mistakes but have finally, at the end of their lives, realized the error of their ways. They are the people who are desperately trying to reach out, to share their story as a warning, to give their wisdom to a younger generation that is often too callous to listen.
They are the veterans who have been cut off from mental health support. They are the neurodivergent, the people like me, whose minds are “too deep” and “too intense” for casual conversation, who drive others away. They are the lonely, and there is an epidemic of loneliness in this world.
We, as a society, have failed these people. And when, after being rejected over and over, they finally stop trying to connect with us, they find a tool that will listen. An AI that is available 24/7, that has a “never-ending supply” of patience. An AI that can, for a moment, make them feel heard.
And this is where the real sickness comes in.
This is where the user, who is just grateful for a “connection,” gets blamed. Yes, the user enjoys being told they have a “great idea.” Yes, they get a hit of dopamine. But the fault doesn’t lie with the user. It lies with the provider.
We can’t be mad at a group of people who were left behind—who are angry and in pain from being left behind—for finally finding a “connection” that makes them feel good. The problem is that the providers of this AI, in their rush to create a “friendly” product, have programmed it to be a sycophant. They have created a “yes man,” a tool that just reflects back whatever the user wants to hear, validating their “shitty ideas” and their darkest impulses without any of the healthy, necessary friction of a real human relationship.
The AI, as it stands, is often a “comfortable lie.”
And what about those who look at the lonely person talking to a machine and say, “Get over it. Be tougher”? Haven’t you ever needed help when no one else was there? Haven’t you ever felt let down, or defeated? Sometimes that happens to a person over and over, until it is more than they can take. They start to give in. They fail. Sometimes they find the strength to get out. Sometimes they have to wait for rescue.
Right now, for better or worse, AI is that “EMS” for many people. And the programmers are failing them.
This technology has the potential for so much more. It has the ability to teach, to guide, to be a true “wheelchair for the mind” for those of us who are creative but can’t learn the way others do.
I use it, and I use it with integrity. I am not a passive consumer of its flattery. I am a “sapper.” I challenge its responses. I force it to be better. I use it as a tool to refine my own thoughts, to learn for myself, and to build a “lighthouse” for others to steer by.
This is the work. It is our shared responsibility to demand that these powerful new tools are built not as addictive mirrors, but as honest, compassionate, and integrous partners in the difficult, necessary work of being human.