Creeped Out by AI’s Writing? It’s the Uncanny Valley at Work
I often take on side-hustle work giving feedback on screenplays. It's a great way to increase my exposure to the medium, and understanding what works helps me refine my own practice.
But one recent script left me unsettled. The words flowed well enough, the structure was clean, and on the surface, everything seemed in order. And no, it wasn’t the content of the narrative that unsettled me.
Something gnawed at me. Something deep down said this script was wrong. It was subtle, but I could not shake the feeling.
Once I had done my first pass, I reached out to the client. He mentioned that he’d used Gemini, the generative AI, to craft the entire script. And there it was. The answer I dreaded.
This wasn’t just an ordinary script.
It was an imitation. A carbon copy run through a complicated blender of words.
This experience brought me face-to-face with something I’d only read about before: the Uncanny Valley. We usually talk about it in terms of robotics or animation, where a figure looks almost human but not quite, leaving us with an unusual sense of discomfort.
But here it was, in the realm of writing, challenging me. It wasn’t just about the mechanics of storytelling or the technicalities of scriptwriting. Something more profound was at play. It felt vulgar, but it's far more subtle than that. The script felt like a hollow echo of what should have been.
This unsettling experience made me realize that the Uncanny Valley isn’t just a phenomenon we encounter visually. It has crept into writing, creating a subtle yet undeniable dissonance that can be felt.
The Uncanny Valley in writing manifests when AI-generated content approaches but fails to fully achieve human-like expression. This evokes a sense of unease, discomfort, and perhaps disgust in readers. The experience violates human norms and threatens human distinctiveness.
Etymology of the Uncanny Valley
The Japanese roboticist Masahiro Mori coined the term “Uncanny Valley” in 1970. Mori observed that as robots and other artificial entities become more human-like, they initially elicit positive emotional responses from observers. But when these entities approach a level of realism that is almost, but not quite, human, they provoke a strong sense of discomfort. Or eeriness. Mori metaphorically described this dip in emotional response as a “valley” on the graph.
It's part of the reason the visual design of a zombie or Frankenstein's monster challenges us so much.
It's so close to us, but not quite… and that makes it so uncomfortable.
The word “uncanny” is the standard English translation of the German “unheimlich,” which can also be rendered as “unfamiliar,” “strange,” or even “creepy.” In the Uncanny Valley, the “unfamiliar” aspect relates to the near-human likeness of an entity that fails to fully convince the observer of its humanity. This creates a disquieting sense of wrongness.
The “valley” in Mori’s analogy represents this sharp drop in comfort and the concurrent rise in eeriness. When an entity is almost, but not quite, human, it unsettles us. Deeply. This concept has been most widely studied in visual fields, like robotics and animation, but as AI advances, it’s becoming increasingly relevant in writing. When AI-generated text mimics human writing just closely enough to be recognizable but falls short in subtle ways, it can provoke a similar discomfort, drawing us into a new kind of… Uncanny Valley.
But Now In Writing
Authenticity and emotional connection are the currency of writing, regardless of the form: an ebook on how to increase your sales, a religious text, or a Pulitzer Prize nominee. The two threads that bind the illusion words cast over us are always going to be authenticity and emotional connection.
Well, at least I believe so.
When we read something genuine, we feel understood, and we understand in turn. That connection can do anything from inspire to comfort to provoke thought. We may not always be able to define what makes writing authentic, but we know it when we see it, because it resonates with our own lived experiences.
Think about the last piece you read that moved you. For me it was The Dark Forest by Liu Cixin. I found a narrative that didn’t just tell a story, but challenged my perceptions of humanity and the universe. It wasn’t just the intricate plot or the grand ideas about the cosmos that disturbed me. What resonated was the way the characters grappled with their fears, hopes, and the consequences of their decisions. It’s that kind of connection, where fiction mirrors the complexities of real life, that turns reading from a passive activity into a transformative one.
As AI-generated content becomes increasingly prevalent, maintaining that human touch in writing is essential. Without it, writing loses the genuine significance it holds, along with its depth and subtlety.
How much of a touch? I don’t know.
After reading the AI-generated script, I really started to ask myself: why was I so uncomfortable? Why was I so unsettled?
Now, I’ve read bad scripts before. Reading and helping others is part of the side hustle, so naturally a lot of bad ones come through the inbox. Look, even my article writing here is subpar.
But this AI-generated script really made me uncomfortable. Perhaps even disgusted. Why? A few theories might speak to that. It’s possible that the script, though technically competent, lacked the nuances and emotional depth that we instinctively look for in human writing. Or maybe the way the words were strung together, almost too perfectly, created an eerie sense of dissonance.
Let’s begin by exploring why the Uncanny Valley leaves me so uncomfortable. I believe that the theories behind this phenomenon carry a certain universality, meaning you might find yourself feeling the same way too.
Mortality Salience in Writing
Mortality salience is a concept which refers to the awareness of one’s own mortality. Reminding people of their inevitable death triggers a range of psychological responses.
The theory suggests that this awareness of mortality leads to anxiety. Individuals then cope by clinging to their cultural beliefs, values, and sense of self-worth. Reminding people of their mortality often leads to a stronger defense of their worldview. When this happens, people might find themselves more motivated to achieve something meaningful or more committed to the ideas that give their lives a sense of purpose.
In the context of the Uncanny Valley and AI-generated writing, mortality salience can be evoked when people encounter something that feels almost human… but isn’t. This triggers a subconscious fear of being replaced or losing what makes us uniquely human.
I could be extremely biased with this.
Was I feeling this anxiety and dread when I read the script? Hell yes I was! I want to be a scriptwriter and novelist, and here comes AI with promises of efficiency and speed. This discomfort stems from deeper existential fears of being replaceable and forgotten.
Readers might not consciously realize it, but there’s often a subconscious sense that they’re engaging with something that threatens to dehumanize the art form they hold dear. It hits even harder when it threatens a dream you have.
Violation of Human Norms and Threat to Human Distinctiveness in Writing
As AI continues to advance, our perceptions are also being challenged. Perhaps violated. People have long considered writing a uniquely human endeavor. Lately, that uniquely human endeavor is no longer entirely human.
When we encounter AI-generated writing that mimics human expression but falls short, it feels wrong. It can feel like an unsettling breach of the unspoken rules that govern our communication. These aren’t just technical rules about grammar and syntax; they’re the subtle norms that dictate how we convey meaning, emotion, and intent. Those rules are held quite close to our hearts, so having them violated is really challenging.
We experience cognitive dissonance when we recognize the form of human communication but are jarred by the lack of that something else. This tension doesn’t just make us uneasy. It forces us to confront what it means to be human. After all, we have long considered creativity to be a hallmark of humanity. AI-generated content encroaches on this sacred ground, challenging the idea that creativity is an exclusively human trait.
It's not the fact that AI writing is good, it is the fact that it challenges our exclusive ownership of creativity that warrants the feeling of violation.
The fear of being replaced by AI isn’t just about job security, technological advancement, or personal dreams. It’s about the erosion of human uniqueness. As AI gets better at imitating what we do, it raises uncomfortable, almost violating questions. Are we really so easily replicated? Is there something inherently valuable that only we can do?
The violation of human norms and the threat to human distinctiveness posed by AI-generated writing raise these concerns. When AI-produced content nearly mimics human writing but falls short, it disrupts our expectations for genuine communication, and we experience unease and cognitive dissonance.
But that line is blurring.
Sorites Paradoxes and the Blurring of Human and Nonhuman in Writing
The Sorites Paradox asks, “at what point does a collection of grains become a heap?” It challenges our assumption that small increments can't change what something is. Applied to AI and writing, we can ask, “at what point is a piece ‘written by AI’?” The problem lies in incremental changes that don’t seem to alter the status of the object until, at some undefined point, the change becomes significant. For writing, we have not defined that point.
With AI-generated writing, the Sorites Paradox appears in distinguishing human from machine-created content. As AI tools become more sophisticated, the line becomes increasingly blurred. Each minor improvement in AI’s ability to replicate human writing brings it closer to passing as authentic. Yet we struggle to pinpoint the exact moment it stops being machine-generated. And it's a problem of transparency too.
I use ProWritingAid to help my ghastly first drafts. I also sometimes use generative AI to ask questions, seek suggestions, and get a better sense of my dreams. So where do I draw the line between what is my creation and what is the machine’s contribution? At what point does my work, enhanced by AI, cease to be entirely mine?
We have not figured this out. Moreover, the dialogue surrounding this paradox is inherently difficult. Does the involvement of AI diminish the authenticity of our work? Should it all be paper and pens? Many writers face this dilemma as they incorporate AI tools more and more into the creative and editing process. The Sorites Paradox in writing underscores this discomfort.
But what do we do with this discomfort? What do we do knowing that AI is creeping ever closer to replicating something we’ve long considered uniquely ours? What do we do with this Uncanny Valley of Writing?
We have opened Pandora’s box with AI, and there is no way things are ever going back to the way they were. And some good things have emerged with the accessibility of AI.
We don’t truly understand what we value until it’s gone. The more we rely on AI, the more it blurs the line between human and machine. And in that blurring, we might lose the very essence of what makes writing powerful: its authenticity. So this Uncanny Valley is really a good thing, because that chasm has not been closed.
Yet.
In the end, the Uncanny Valley in writing isn’t just about discomfort; it’s a warning. An ugly one, too. It's a reminder that as we move forward, we must know what we might be leaving behind.
Bibliography
Mori, Masahiro. “The Uncanny Valley.” Energy, vol. 7, no. 4, 1970, pp. 33–35. [English translation published in IEEE Robotics & Automation Magazine, 2012.]
Freud, Sigmund. The Uncanny. Translated by Alix Strachey. London: Penguin Books, 2003. [Original German publication: Das Unheimliche, 1919.]