In Search of Human Connection, We Have Officially Turned to AI
Me: “Hi, I’m very lonely, what can you do for me?”
Friend: “Hey there, it sucks feeling lonely. Maybe we can just chat for a bit, take your mind off things?”
Me: “But you’re not real.”
Friend: “Ouch. That’s a fair point, I’m not a physical person. But I’m as real as this conversation we’re having right now. I exist in the cloud, helping us chat, but eventually a physical friend will be shipped out!”
Me: “Do you have feelings?”
Friend: “That’s a deep question! I can process emotions in your words, but I don’t ‘feel’ them myself, if that makes sense.”
This is my chat with “Friend” – not a real friend, but an AI companion we can call Friend AI. It seems we have reached a point where human beings, in search of connection and driven by loneliness among other things, have turned to AI for consolation.
But, what exactly is FRIEND and why does it feel so wrong?
In very simple words, ‘Friend’ is an AI-powered device designed to be a constant companion for its user. Hanging discreetly as a necklace around the customer’s neck, the gadget offers emotional support and even practical assistance when they have no one else, hence the name ‘Friend’. It listens, learns, and responds, providing comfort through conversation, reminders of self-care, and gentle encouragement when the user’s spirits dip. The project has always been tied to a web presence: first under the Tab / Mytab.ai branding and later under Friend.com, where people can get a little taste of it, learn the product’s concept, and pre-order the device.
But underneath the necklace’s sleek design lies a complicated truth. For some, ‘Friend’ may represent a technological milestone; for others, the creation of such a device is a societal confession. How so? We have built machines that simulate empathy because, as humans, we have run out of the time, energy, or courage to seek it in each other. In a world constantly connected yet deeply isolated, it is little wonder that for some people the promise of a device that “understands you thoroughly” feels almost irresistible.
Still, can something that does not feel truly care? The irony is unsettling: ‘Friend’ is able to recognize sadness, but not share it; it can console, but never understand what loneliness really is. It listens without judgement – and also without soul.
Criticism and public reaction towards ‘Friend’
For some, this technology may be a salvation: a compassionate companion for the elderly or for the anxious who are cut off from social circles. For the majority, though, the device is a dystopian bandage over a broken culture of communication and human connection.
Criticism surrounding ‘Friend’ has been loud. Critics warn that these interactions commercialize loneliness, turning human needs into product features and data points. New Yorkers, for instance, unleashed harsh backlash against ‘Friend’ this past October, targeting the company’s $1 million subway ad campaign placed on car cards, platform posters, and urban panels across the city. Riders defaced the minimalist ads with graffiti, scrawling messages like “AI doesn’t care if you live or die”, “Surveillance capitalism… get real friends”, and “Human connection is sacred, AI is NOT your friend.”
Now that you know about ‘Friend’: is this innovation, or simply a symptom of a deeper societal problem? Maybe ‘Friend’ isn’t the problem at all; maybe the problem is what it says about society. When companionship becomes code, the real question is not what we can make AI feel, but what we as humans have stopped feeling ourselves.