Do you feel stuck? It’s time to simplify your thinking
In our quest for clarity, our minds often lead us astray. We do, however, have the ability to recognize this tendency, both in ourselves and in others. Moya Sarner, a psychotherapist, writes about our complicated relationship with communication, particularly when it comes to AI-generated content.
“Psychodynamic psychotherapists generally don’t give their patients advice, which is a principle I try to adhere to in my writing as well. But I’m breaking that rule today. My advice? Steer clear of anything written by ChatGPT. Or, to do my own version of ChatGPT, after considering all factors involved, I would strongly recommend you never, ever read anything produced by it. Just don’t do it.”
Sarner criticizes the convoluted language often found in AI-generated text, which she refers to as “the thickening agent.” This term describes the unnecessary complexity added to writing that does little to enhance understanding—essentially, it muddles the message rather than clarifying it. “In journalism school, I learned to KISS—Keep It Simple, Stupid. Unfortunately, it seems AI chatbots missed that lesson.”
She argues that it’s not merely a matter of poor writing; the same overwhelming complexity can be felt in therapeutic sessions. “In my practice, when a patient is on the brink of uncovering a painful emotional truth, there’s an unmistakable shift when the thickening agent enters the conversation. The clarity we’re seeking dissipates, replaced by confusion and cognitive distance.”
Drawing on the work of psychoanalyst Wilfred Bion, Sarner explains that in any therapy session, one can be “in K” (holding a clear understanding) or “in -K” (unable to grasp their situation). “ChatGPT operates firmly in the realm of -K, and I can sense it from afar.”
She reflects on her own experiences as a patient. “I’ve been caught in those -K moments, watching the clock tick by while I remain ensnared in my own web of words. That’s frustrating.”
Sarner connects this phenomenon to AI, noting: “The slop produced by AI chatbots is reminiscent of bland potato soup. They learn from human writing, echoing our tendencies. AI operates in -K because we do.”
Unlike AI, however, we humans have the capacity to reflect on our thoughts and feelings, allowing us to engage more meaningfully. “We can recognize when communication turns sour, becoming convoluted instead of enlightening.”
This came to the forefront for her recently when she heard the phrase “flim-flam artists” used to describe officials delaying compensation for survivors of the Post Office scandal. “That eloquent terminology reveals how clarity can shine through even in murky situations. It embodies the courage needed to confront the lies and obfuscation around us.”
Sarner concludes by observing that while there may be algorithms designed to simplify AI writing, true emotional depth can never be fully replicated. “The essence of good writing and leading a fulfilling life centers on the emotional connections we forge—everything else is just filler.”
Moya Sarner is an NHS psychotherapist and the author of “When I Grow Up – Conversations With Adults in Search of Adulthood.”