Allen | AI makes us impotent  

Opinion by Georgia Allen
Feb. 25, 2025, 10:36 p.m.

In 2022, a glimmering new invention, an evolving technology and a fresh horror was unveiled to the world: ChatGPT. Since then, millions have used the artificial intelligence program for its human-like answers. Ted Chiang’s “ChatGPT Is a Blurry JPEG of the Web” describes ChatGPT as an algorithm that compresses vast amounts of information from the web into a lossy format, one that discards or “loses” some of that information in order to pack it into a small file. This differs from lossless compression, another method that retains all of the original information. Lossy compression relies on proximate words and phrases to reconstruct complex ideas: the model scans its compressed store for phrases and words similar to those in your question and regurgitates a corresponding answer. Chiang illuminates why this process seems so akin to real intelligence: “the fact that ChatGPT rephrases material from the Web instead of quoting it word for word makes it seem like a student expressing ideas in her own words … it creates the illusion that ChatGPT understands the material.”
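To make Chiang’s distinction concrete, consider a toy sketch in Python (purely illustrative, and nothing like ChatGPT’s actual architecture): lossless compression, here via the standard zlib module, gives back every character exactly, while a crude “lossy” scheme that throws words away can only offer an approximation of what was there.

    # Illustrative toy only: contrasts lossless and lossy compression in miniature.
    # It is a sketch of Chiang's analogy, not a model of how ChatGPT works.
    import zlib

    text = "Lossless compression keeps every bit; lossy compression discards some."

    # Lossless: zlib round-trips the exact bytes.
    packed = zlib.compress(text.encode("utf-8"))
    restored = zlib.decompress(packed).decode("utf-8")
    assert restored == text  # every character comes back unchanged

    # "Lossy" (toy example): keep only every other word to shrink the text.
    # The discarded words are gone for good; only a blurry paraphrase remains.
    kept = text.split()[::2]
    approximation = " ".join(kept)

    print("lossless ->", restored)
    print("lossy    ->", approximation)

The lossy output is smaller but can never be turned back into the original sentence, which is the sense in which Chiang calls ChatGPT a “blurry JPEG” of its source material.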

The popularity of these simulated understandings points to a huge driver of AI appeal: convenience. We use it to write and edit papers, answer our questions, come up with ideas for us and perform mundane tasks to make our lives easier. It offers a quick and accessible solution to tasks we consider unnecessarily complicated and tedious. When we feel uncomfortable with or wary of our AI use, it’s easy to bypass that discomfort with the overwhelming value of such convenience. We tell ourselves that ChatGPT makes our lives easier, that we only use it for trivial tasks that we could do on our own. We choose to turn to the lossy compression algorithm to save time, to save energy and to inspire us. We tell ourselves we are capable, just not always willing. 

This convenience is so attractive to consumers and developers alike that a vision of a sans-AI future has become almost implausible. The industry is growing rapidly, with new models released in quick succession since the success of ChatGPT. The upcoming ChatGPT 5 will, according to Forbes, “think for longer, deeper and harder, meaning it will be useful in entirely new ways.” Of course, to say ‘think’ is inaccurate, but maybe the anthropomorphism comforts us. Perhaps even more impressive, Meta’s AI can be worn as glasses that take pictures, translate languages as they are spoken and find information on objects the wearer looks at, without even requiring us to reach into our pockets.

But let’s stop lying to ourselves here: so-called “convenience” is a euphemism for laziness. I question the idea that using AI for tasks that seem insignificant, a waste of time or a waste of brain power is in any way defensible by virtue of convenience. What happens when we take such shortcuts? We lose the slow, grueling and uncomfortable process of overcoming whatever creative block or boredom divides us from starting and continuing the task. To avoid the boring or annoying, we give up practicing persistence and become a little more reliant on the AI that relieves us of such a burden. We allow our brains to lose a certain ability to focus.

And eventually, we get used to it. We pass more and more tasks of greater and greater importance over to these tools. We get used to allowing robots to write our “thoughts” down for us in cringy, predictable and offensively bland words. We allow them to summarize an article written by a real, thinking person (who had to push past the boredom and mind-numbing work we are now unable to endure) into a similarly bland overview lacking any nuance or humanity impressed upon it by the author. We prod the AI for ideas for our next photography portrait because we are too lazy to push our own minds to the task and too impatient to let ourselves come to them naturally. We continue to let ourselves off the hook over and over. All the while, the AI is learning from us, making its answers a little better, expanding its abilities a little and becoming just a little more convenient for us, increasing its hold as we increase our reliance on it in a vicious cycle.

The consequences of this pattern are showcased by college students, elite and non-elite alike. The Atlantic’s “The Elite College Student Who Can’t Read Books” emphasizes a shift that professors have recognized in their students: students are becoming less and less able and willing to read and understand full-length novels. The article cites social media and weaker high-school preparation as the causes of this mental degradation, and although I agree, I also see AI as another, more recent factor. Seeing and hearing about Stanford students copying and pasting entire texts into ChatGPT has become so commonplace for me that I am no longer surprised when a peer admits that they didn’t do the reading for class, because they don’t need to. They have access to their own personalized mental crutch.

Beyond becoming a crutch for our degrading mental abilities as individuals, I question the impact of AI on humanity in the collective sense. What does it mean if we wear glasses that can take pictures of people around us without their knowledge? Or if, instead of collaborating with peers to write, revise and discuss work, we collaborate with a machine? Or if, through reliance on AI, we as a group all become … more and more stupid? We lose a great deal of interaction with others and the trust and connections that communication and genuine intelligence build, of course, but not only that: we lose an appreciation for humanity. When we talk to people, we expect character, emotions and flaws. When we instead engage with a pair of glasses or a website, we are actively choosing a shallow algorithmic “truth” over the imperfect and deep humanity we belong to.

We decide we trust a machine more than ourselves, even with our own creative work and words. We take yet another step towards AI, it takes yet another step towards us, and human intelligence with all of its profundity and uniqueness is put on the back burner so we can bow before text on a screen. 

We become less human. We become less thoughtful. We become impotent. 


