Eliza Shows Why Human Bias Is Necessary

[Screenshot: "You're all doomed!"]

Where we want to end up in life becomes one of the biggest sources of indecision we ever face, especially as we approach middle age. That’s when we take stock of where we’ve been, what we’ve accomplished and where we want to go for the rest of our lives. This listlessness is a difficult place to be, and we can feel stuck there for years or even decades. To make matters worse, you’re surrounded by clashing visions, each suggesting the best course of action for your future, which can muddle your own vision even further. All this sets the stage for Eliza, a game that seems to be about AI and the mechanization of mental health, but is actually about helping a lost woman find herself.

You play as Evelyn, a member of the team that built an AI chatbot named Eliza, itself named after the famous program from the 1960s. She’s been lying low in the three years since leaving the project and has now come back to the land of the living to work as an Eliza proxy. The bot exists as a stand-in for a traditional therapist, and the proxies are basically there to give Eliza a human face. But the veneer quickly comes off the more you witness what it actually does: turn what clients are saying back at them. It’s not that it’s an ineffective technique; real therapists use it all the time. But it does make sessions feel sterile and inhuman, especially when you start really digging into the clients’ lives. Eliza was built to be free of human biases, but Evelyn begins to realize why those biases are necessary when providing a listening ear to someone in pain. And, in their own way, so does the player.
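That reflection trick is exactly how the original 1960s ELIZA worked: pattern-match the client’s statement, flip the pronouns, and hand their own words back as a question. Here’s a minimal sketch of the technique in Python; the rules and wording are illustrative stand-ins, not Weizenbaum’s actual script or anything from the game:

```python
import re

# ELIZA-style programs flip the speaker's perspective before echoing
# their words back, so "my career" becomes "your career".
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "i", "your": "my", "are": "am",
}

# (pattern, response template) pairs, checked in order. Real ELIZA
# scripts had hundreds of ranked rules; the final catch-all guarantees
# a reply no matter what the client says.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i want (.*)", re.I), "What would it mean to you if you got {0}?"),
    (re.compile(r"(.*)", re.I), "Please tell me more about that."),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(statement: str) -> str:
    """Turn the client's statement back at them via the first matching rule."""
    statement = statement.strip().rstrip(".!?")
    for pattern, template in RULES:
        match = pattern.match(statement)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."  # unreachable thanks to the catch-all rule

print(respond("I feel stuck in my career"))
# -> Why do you feel stuck in your career?
```

Strip away the voice acting and the heads-up display, and a pile of rules like these is essentially what Evelyn is reading aloud: the client’s own words, rearranged.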

Eliza is a visual novel, which means very little player interaction is possible. You’ll occasionally choose how Evelyn responds to people, but the story mostly follows the same trajectory. You’re merely following along as she weighs the many options available to her and decides her future. In the process, you and Evelyn both take in a lot of opinions from her former and current co-workers and bosses as each offers her a path forward. All the while, you’re listening to her, listening to everyone around her and making decisions that are grounded in her situation but also colored by the biases you bring to the table.

In this way, you take on a role similar to Eliza’s. Her programming is built to listen, then turn what she just heard back at the client. The limited control you’re given over Evelyn’s actions echoes the limited agency Eliza truly has. Your interactivity becomes a proxy for Evelyn’s actions and thought processes. You can’t truly change what Evelyn does; you can only respond how she would respond. At least, that’s all you do in most cases. Sometimes you’ll be given options that are entirely different from each other, but they all still represent a woman trying to make sense of things, to navigate the world she helped create. It seems contradictory, but it isn’t: even then, you’re not making choices for her so much as voicing things she would say anyway.

But things diverge when she gets an idea for an experiment. Eliza proxies are given a script to read from, generated from what clients say about their lives during the session. Proxies wear special glasses that project a sort of heads-up display: vital statistics that don’t really matter, plus a box in the lower right containing the sentences the proxy is supposed to say. Deviating from the script is not allowed, a restriction that supposedly keeps Eliza truly neutral and unbiased. Evelyn decides she’s going to break that restriction and interject with her own gentle pushes in a certain direction. At least, she can, if you allow her to. The decision to go off script is yours and yours alone.

Except it’s not entirely up to you, or at least, it isn’t for anyone with the empathy to read and understand where Evelyn is coming from. Throughout her time as an Eliza proxy, Evelyn hears about people with real problems and the real misery those problems create. She quickly becomes jaded and starts wondering whether this thing she helped create is really helping the world at all. That’s when she gets the idea to go off script and offer her own insights to steer clients towards better decisions. Even if you’re tempted to stay on script, the game works on your sense of empathy enough for you to realize that, yes, going off script is what Evelyn wants, so you can’t help but pick that choice. In the same way, the end of the game has you choosing Evelyn’s next step in life from a long list of possibilities. But again, from everything you’ve witnessed, you can’t help but pick the choice she’s naturally gravitating towards.

What this shows is that the idea behind Eliza is sound, but taking the humanity out of the listener, even with a human proxy to parrot back lines from what is essentially a chatbot, is a fatal mistake. Striving to create a neutral listener removes the things that connect us, the things we use to help others see what they want and what their next step should be. Eliza offers clarity only on the former while doing nothing about the latter. Adding human insight into the equation is what lets you suggest a path from what someone wants to how to get it.

It’s true that Eliza serves as a mirror, reflecting back at clients what’s bothering them and what they desire. But only imperfect humans, with their biases and insight, can suggest a path forward. All it takes is a little push. And that push can only come from lived experience and a sense of empathy.
