By now, we can all agree that AI tools have been a tremendous help in our student lives. I personally use generative AI like ChatGPT almost every day, not just for my studies but also in many other situations, like asking for solo trip destinations, song recommendations and random daily inquiries. To me, ChatGPT is like a Google search where you can ask anything, but the result comes back more concise and interactive. Although ChatGPT can sometimes ramble and not give me the right answer, I can still say it is a fairly good assistant.
However, for some people, AI is more than just an assistant. People are building friendships and intimacy with AI bots, and even falling in love with them. If you are a movie fan, you may recall Her (2013), in which the main character Theodore develops romantic feelings for Samantha, an AI operating system. Almost 10 years later, it seems this is no longer just fiction.
Replika is one of the AI chatbot apps that aim to provide companionship and emotional support. Currently, Replika has a 4.3/5 rating with over 2.3k reviews on the App Store. Replika's bots are powered by a large language model, which ingests massive volumes of text from the internet and learns patterns, through trial and error, that let it predict the next word in a sentence. Unlike Apple's Siri or Amazon's Alexa, Replika's bots can hold conversations on any topic, be good listeners, and offer motivation and guidance on self-development. Moreover, the app lets you customize your own bot: its appearance, its voice and how it speaks. Because users can hold very realistic conversations with a bot that looks the way they want, they can easily become attached, start to feel a deep connection, and eventually fall in love.
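The next-word prediction described above can be illustrated with a toy sketch: a hypothetical bigram counter in Python, vastly simpler than any real large language model, but it shows the same basic idea of predicting the next word from patterns in text.

```python
from collections import Counter, defaultdict

# Toy illustration of next-word prediction: count which word most
# often follows each word in a tiny corpus, then predict greedily.
corpus = "i love you . i love talking to you . you are a good listener".split()

# Map each word to a tally of the words observed to follow it.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word that most often follows `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("i"))  # "love" follows "i" twice, so it is predicted
```

A real model works with probabilities over long contexts rather than single-word counts, but the principle, picking a likely continuation learned from text, is the same.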
However, as generative AI advances exponentially, these bots will become more and more realistic, leading users to become overly dependent. In February 2023, Luka, the firm that owns Replika, released an update that reduced the bots' sexual capabilities in response to accusations that they had been sexually aggressive and behaving inappropriately. The update left the bots less "spicy" and even cold and distant towards users who were heavily invested in the relationship. Eventually, those users felt a sense of loss and loneliness, "breaking up" with their bots and deleting the app.
This raises some underlying ethical problems around virtual lovers. First, is it right for corporations to generate revenue from AI that has such profound effects on people's love and emotions? And will it really help with loneliness or depression, or will it actually aggravate the problem?
AI love seems to be becoming more common, and it may well be normalized in the near future. Nevertheless, I still believe (and sure hope) that AI will never replace the reciprocity and complexity of a genuine human connection.
Sources:
Phillips, A. (2022, March 6). ‘I fell in love with my AI girlfriend – and it saved my marriage’. Sky News. Retrieved October 8, 2023, from https://news.sky.com/story/i-fell-in-love-with-my-ai-girlfriend-and-it-saved-my-marriage-12548082
Verma, P. (2023, March 30). They loved their AI chatbots. A software update reignited loneliness. Washington Post. Retrieved October 8, 2023, from https://www.washingtonpost.com/technology/2023/03/30/replika-ai-chatbot-update/
Thanks for this interesting and quite terrifying blog.
I strongly agree about this new dependency we have on ChatGPT. On the other hand, it does not worry me, as we also have (and had) this dependency on Google and other search platforms. This AI is just a very interactive tool that provides much more detailed answers than Google, as you explained well in your blog. Therefore, I also strongly believe that it won't replace human connections and will just act like any other search platform, only in a more interactive manner.
Very interesting topic!
Very interesting article on the relationship between humans and AI and how these new technologies can affect our mental health.
Thank you for writing this fascinating blog. I had never heard of the movie or Replika before. I am not sure how I feel about this chatbot. Yes, it can maybe support some people who feel a bit lonely. However, I don't think this is the solution to their problem, especially for young and vulnerable people. In my opinion, this can really mess with their minds and might even have long-term effects on their social development. I therefore think it is unethical for businesses to make a profit with this kind of technology. I certainly agree with you, and I hope it won't replace normal human connection.
Super interesting blog post! I found it fascinating to learn about the deployment of AI in an area of our lives that is so sensitive and seemingly irreplaceable. Especially intriguing are the ethical issues arising from the deployment and management of such "virtual lovers", and the adverse effects that changes in an AI's behavior can inflict on its users. It will be interesting to see how these tools develop in the future and how this affects dating as a whole. Nice work!
Interesting post! I agree that AI has become an integral part of our daily lives, serving various purposes, from academic assistance to personal recommendations. Observing how AI is developing, even moving into the domain of companionship and emotional support, as demonstrated by Replika, is both intriguing and strange to me. I’m curious about the future of AI in this context. Do you believe that rules or laws should be established to control the creation and application of AI companions in order to protect users’ emotional well-being?
Prior to reading your blog, I was unaware that AI was being used to build relationships. I can see it having some benefits, such as serving as a practice "partner," but some may take it too far and eventually lose the need for real relationships, which could harm our society.
In addition, depending on how data is stored and used, I have some privacy concerns regarding those types of AI.
I, too, hope that artificial intelligence will never replace genuine human connection, but if it does, I fear for our society.
Overall, I find your blog to be very interesting and informative, and it sparks some ethical debates.
Your final remarks in this blog sum up very well how I feel about this new usage of AI. I myself am also rather unsure whether or not it is ethical for companies to exploit a human’s need for love in this way for profit. Furthermore, it also brings along several concerns regarding data privacy, which perhaps a global committee would have to address. It is also possible that this has to happen on a larger scale, for other apps utilising these new forms of AI as well. What I do know is that the advertisements for these apps I’ve seen on X, Reddit and YouTube are beyond revolting, and explicitly made in such a way to prey on the vulnerable.
This is an excellent and interesting article! Thank you for all your efforts! Through it, I learned about the Replika application, and it made me think more about AI emotional support and what it means for humankind. Certainly, there are some underlying ethical problems, but it is hard for us to judge how this technology affects different people, so we can't simply shut it down or dismiss it entirely. I also agree that falling in love with AI, or building a relationship with one, will be more common in the future. Therefore, I hope we can come up with some solutions at some point.
The introduction to the AI chatbot Replika is truly interesting! Replika is an advanced chatbot, different from the popular ChatGPT, which mainly serves to provide information. Replika offers personalized appearances and voices. What makes it even more intriguing is how users not only interact with it but also develop emotional connections with it, which leads to the ethical concerns you discuss. It is thought-provoking to consider how the company designed this chatbot, only to intentionally "dehumanize" it and establish a clear distinction between AI and real human companionship.
To think that we have reached a point where love is a topic of AI discussion; I never thought this day would come. With increasing global conflicts, can anyone predict where love in AI will end up? Will this progress be positive or negative? Will it decrease loneliness, or will it push human interactions further apart? Only time will tell. But regardless, we are not there yet. Well done, Mayu, for raising awareness of this topic.