The Fable of Myhouse.wad, or: Why We Should All Be Scared of ChatGPT
This column is a reprint from Unwinnable Monthly #183. If you like what you see, grab the magazine for less than ten dollars, or subscribe and get all future magazines for half price.
———
We are what we’re afraid of.
———
Much like every other undergraduate English major of the last 25 years, I have very strong feelings about Mark Z. Danielewski’s debut novel House of Leaves. Part cosmic horror, part insufferably dense postmodern experiment in how many frame narratives can be seated atop one another, HoL and its geometrically impossible architecture took hold of my psyche over a decade ago and have not let up. And I am not alone – in 2023, Steve Nelson released Myhouse.wad, a map for Doom II that takes its premise largely from the novel, forcing the player to loop around and through a map of a suburban home that shifts in sudden and increasingly distressing ways.
This is, in and of itself, a perfectly fine conceit for a fun romp in Doom – the game even has multiple endings for some added exploration incentive. What is most compelling to me, however, is not the gameplay, but a series of additional text files downloaded along with the map. Within these files, the author/fictional programmer of Myhouse (not Nelson himself, but a persona he adopts) spins a tale of a basic map that starts modding itself without the will or input of its creator. The map takes on a malevolent persona as its creator becomes equally obsessed with and terrified of what this game will be.
The text files are, to be frank, not super impressive in their prose or storytelling ability, and could largely be written off as a little bit of self-indulgent fanfic to accompany a sincerely baller Doom II map. Until, that is, you consider what it is that the author is so scared of – a computer program that takes the power of creation and autonomy away from a human author and begins to make decisions against the will of its progenitor. A piece of art generated without any meaningful human intervention. Sounding familiar to anyone?
Since being an educator is my day job, I’ve frequently had to deal with the recent influx of LLM and AI-generated writing and art on the internet, which then often makes its way (accidentally or not) into the work my students submit. And beyond all of the very valid concerns about intellectual property, academic honesty and the environmental consequences of how insanely energy-intensive LLMs are, I’ve noticed a deeper, darker horror. Every time someone uses text or art generated by an LLM, they have allowed the AI to make decisions about the final composition – its general tone, the way it presents or excludes pieces of information, how it organizes topics to foreground or obfuscate points of view. Even if the AI doesn’t spout nonsense or factually inaccurate things (which is also a frequent occurrence), the AI has been allowed to manage the tone and rhetoric of the text. And if you’ve gone so far as to read a piece of independent games criticism on a modded Doom map, I don’t suppose I need to belabor the point for you that art impacts people. It influences beliefs, changes values and reinforces worldviews. It is an incredibly powerful and persuasive tool that can be used for great good or extreme ill. Generative AI technology takes humanity’s hands off the wheel and puts decision-influencing power in the hands of a computer that neither thinks nor feels, but simply strings words together based on common collocation.
All of this to say, Myhouse and other “evil supercomputer” games media before it (shoutout to System Shock) show us what happens when we try to circumvent the tough cognitive work of communication and creativity by farming out labor to software. This is not, emphatically, a logistical or processing problem – it is an ethical one. Computers, in their current form, can never be held accountable, which means we can’t ever let them have the final say in any decision with material ramifications (I’m paraphrasing here from a famous IBM training slide from the 1970s, before they remembered that Being Evil was good for profits). When we let computers steer our cultural ship, we run the risk of extreme harm to ourselves and others, not just on the microlevel of one guy getting tilted at his Doom mod, but in terms of sowing prejudice against and dehumanization of politically expedient groups. Also, just to be petty, computers are shit at making good art.
I’ll get off my soapbox here, but I think it’s important to note that artists have been warning us for years about the insidious issues of letting someone else do our thinking for us, whether that’s a governmental body or a computer program. So, the next time you think about having ChatGPT write that work email for you, consider the implications, lest the whole of the internet turn into a House of Leaves.
———
Emma Kostopolus loves all things that go bump in the night. When not playing scary games, you can find her in the kitchen, scientifically perfecting the recipe for fudge brownies. She has an Instagram where she logs the food and art she makes, along with her many cats.