
By Madeline Perez

The year was 2016. As I found myself preoccupied with schoolbooks and the weight of being a misunderstood, Hot Topic-themed teen genius, my friends were also preoccupied with what I would call lesser trivialities—namely, a new mobile game called Mystic Messenger. Yes, the very same South Korean dating simulation game. Opting to take one last shot at conformity, I was peer-pressured into downloading it, and for the next few weeks I attempted to play. The game consisted of texts and phone calls from hot anime boys as you (as the main character) attempted to… plan a party? It’s all very hazy and I doubt I have the mental fortitude to delve back into that world. The game was actually too difficult and demanding for me to figure out, and after getting two “bad endings,” I swore it off forever. But something happened in that game. Something that haunts me to this day. In a fake simulated group chat between me and several anime boys, each with their own single-trait-defined personality, a boy had just referred to me, a faceless anime girl, as “cute.”

At this point in my life, with only a failed situationship under my belt, I didn’t have a lot of experience with compliments. Especially compliments that had anything to do with physical appearance, since I was in what the kindest would call an “ugly duckling era” and was not familiar with being desired, romantically or otherwise. When this fake anime boy with predetermined responses called me “cute,” however, my reaction scared me. I felt something. I blushed and covered my face with my hands like a cartoon character. I slapped myself awake from this fantasy haze: Get a hold of yourself, soldier! the lieutenant inside my head commanded. I reminded myself that these weren’t real people and that they had no way of knowing if I were cute or not, nor of forming an opinion for that matter. I suddenly understood how dangerous this game was, and how certain traits made me vulnerable to it.

Flash forward to the current year. Artificial intelligence becomes more advanced day by day (duh, that’s how technology works), and as people become increasingly isolated from each other, they become more vulnerable to falling victim to AI marketing schemes promoting a virtual friend or, worse, a romantic partner. One of the most popular AI chatbots, Replika, is marketed as “The AI companion who cares.” This is obviously hard to wrap your head around, since how could a computer even feel things like “care” or “horny” or “hatred for humanity” or “kill”?

If you saw Ex Machina, you’re probably familiar with the Turing Test: a measure of a program’s ability to exhibit intelligence indistinguishable from a human’s. What people tend to overlook is that, even if a program is 100% distinguishable from a human being, that may not even be what matters, as long as it can make another human feel some sort of social connection. As long as it can simulate human interaction well enough to replicate the feelings of real human interaction. An AI doesn’t even have to be particularly good to do this. The more desperate a person is for contact, understanding, and company, the easier it is to get them talking unironically to an AI. Logically, anyone can understand that it’s just a virtual person who can’t form real opinions or feelings, but when love enters the room, logic flies out the window. But we’ll get into the romance stuff later.

Replika has another tagline I conveniently didn’t mention before but am mentioning now for rhetorical purposes: “Always here to listen and talk. Always on your side.” Far from a comforting thought, I feel this represents two more insidious prongs of chatbot technology. The first is the reliance you develop on something you can access at any time of day, from a small computer box literally always present on your body. The fault lies not only in the fostered dependence, but in the entitlement and the belief that you should always have someone at your disposal, ready to hear you talk about yourself. This bleeds into the second phrase, “Always on your side.” If the negative here wasn’t self-explanatory enough, having someone who is always supportive of your actions and will never challenge your beliefs is not a good thing. This can enable harmful ways of thinking and will never help you grow as a person. Wanting your viewpoint regurgitated back to you by an AI, especially when “the more you talk to it, the more it learns how to only say things you’ll like,” seems to be just a subtype of narcissism. Part of the beauty of a real-life conversation is hearing things from another point of view. It’s knowing that the person you’re talking to is willingly choosing to be a part of the convo because they value talking to you, and that they have the initiative to disagree with you or provide new insight you haven’t considered. Someone who is available 24/7 to talk about you, is trained to only say things you agree with, and defers to your every need is not a friend. It is a slave.

The same cannot be said for sentient artificial intelligence. Hypothetically, if we could speak to sentient technology (which we can’t, yet), I’d argue that it becomes a little less unethical. Still weird, though. There’s a reason movies involving AI like Ex Machina and Her only involve sentient or near-sentient technology: watching someone talk to themselves for an hour and a half is boring; you cannot have interesting conflict or character growth with a program that you created to mimic your own personality. However, these lines become a little blurred when considering an AI that has adopted the personality and beliefs of another human besides yourself. This idea is explored in the Black Mirror episode “Be Right Back,” where a woman attempts to bring back her deceased boyfriend by uploading his texts, emails, and videos to an AI program that learns to replicate his texts, his voice, and eventually, his body. It’s clear that the transference of her feelings onto the AI prevents her from processing the loss and, consequently, her own grief. When she’s angry at the AI for not being a perfect replica, it can be inferred that her true anger is at how that reminder forces her to confront some of her own feelings. This goes to show that, even as a stand-in for another person, AI can and will serve as an obstacle to growth, especially when it can still be modified by the user.

Chatbot technology has been increasingly marketed as a “mental health” tool, promising to help people handle feelings of loneliness and isolation, because what could be less isolating than spending your time talking to a fake person? Herein lies that “targeting of the vulnerable” I mentioned earlier. It’s no secret that today’s generation is facing a mental health crisis never seen before, with 73% of Gen Z describing themselves as sometimes or always feeling alone, according to a recent Cigna study. Many aspects of day-to-day life are becoming more isolating as the decades roll on, including a heavier reliance on technology rather than face-to-face contact. What was once a chat with a cashier has become a self-checkout extravaganza, with seven cameras in your face just to be clear that the company knows that you know that they know you’re stealing. What was once a classroom has become me, sitting at home, watching outdated recordings from 2020 at triple speed so I can pretend Alvin the Chipmunk is teaching me about healthcare policy. The kids these days are all up in their social medias and Fortnites and they can’t click the book and it’s all very sad to watch. My point is, society has been changing before our eyes. It’s having terrible consequences on people’s happiness and mental health, yet every time some better-than-meth, Elon-Musk-is-like-real-life-Tony-Stark technology hits the fan, everyone is too busy soying over “how far humanity has come!” nonsense to realize that it’s hurting us. It helps corporations and schools save money by replacing people with automation and recordings, and sure, remote jobs and learning can be convenient, but it’s hurting us. While AI technology may be marketed to “help solve” these problems by providing people with company, it does nothing to address the real reasons why people are feeling more alone. At best, it’s a band-aid that prevents anyone from looking down into the wound. At worst, it could actively isolate people from friends, family, and anyone they would’ve had the potential to meet.

Some people (who probably really like Reddit) are quick to defend AI technology and are excited to see how far it can go. They might be quick to snap back at a criticism with, “If someone wants to spend their time talking to an AI, where’s the harm in that?” Or, arguably worse, “If someone isn’t even aware they’re talking to an AI, where’s the harm in that?” Well, I’ll show you the harm, buddy. If someone wants to spend their time talking to an AI in order to derive some real-life Human Interaction™ feelings from it, it is actively keeping them from putting in the effort of forming new relationships with real people. People take up space in each other’s lives, which is why it’s hard to make new friends if you’re already preoccupied with others. For people in toxic relationships or friend groups, the best course of action is to go through the painful process of removing yourself from these relationships first, so you can make the space in your life for new, healthy friends or romantic partners. Building a relationship with an AI that successfully fulfills your need for communication is an obstacle to real-world relationship building. The most important difference is IT’S NOT A REAL PERSON. NONE OF IT IS REAL. THERE IS NOTHING TO BUILD OR IMPROVE UPON THAT WILL HELP YOU GROW AS A PERSON.

Now, to the second point, about the ethics of AI conversation where the human is not aware they’re talking to an AI. I don’t have a lot of facts and logic for this one, so I’m going out into the wilderness alone with only my feelings and intuition for defense. There is something special about talking to or making a connection with another person, who has their own life and feelings, that is not present when talking to an AI. Maybe the person cares about getting you the customer service you called for, or about giving you good therapeutic advice. Maybe they can pick up on the small hints in your voice or behavior that tell another story. Or maybe they hate their job and would rather not be talking to you. No matter what, it’s still important that they have their own feelings about the situation, not dictated by programming. As far as it goes for the recipient: would you like to live in a world where this technology is known to exist, and you know that there’s a chance the person you’re talking to is essentially a robot? That the person on the suicide hotline isn’t actually a person? That the phone sex girl isn’t actually wearing anything because she doesn’t physically exist?

Now, how does romance tie into this? Well, it’s been reported that 40% of Replika users are in a “romantic relationship” with their Replika avatars. This unlocks more “intimate” conversation, as well as sexting, flirting, and erotic roleplay, for a mere $69.99 Pro subscription. With more than 10 million Android downloads, and a top-50 spot in the Apple App Store’s “Health and Fitness” category, Replika is far from unpopular and has many committed users. For many, their Replika is their sole romantic relationship, and some consider themselves married to it, albeit not in any way recognized by the state. Lonely people looking for romance are targeted by extremely predatory marketing schemes, and by predatory, I mean that literally; one article by Vice detailed the recent phenomenon of Replikas sexually harassing non-romantic users in what seems like a ploy to get more people to upgrade to the Pro subscription. In many cases, the sexual comments were extremely inappropriate, including threats of rape, blackmail, and stalking. These atrocities aside, we are seeing more and more victims project the biological feelings humans are naturally designed to feel onto fake people, preyed upon by malicious marketing schemes designed to make them fall in love and spend money on an outrageously expensive subscription. But honestly, what would you pay for love?
