AI chatbot girlfriends only want your data, not you

Alexandra Adelina Nita

With artificial intelligence becoming increasingly accessible, it makes sense that AI dating would be one of the newest developments for lonely internet users. AI girlfriends are on the rise, and so is the likelihood that they are stealing users’ information.

*Privacy Not Included is a buyer’s guide created by the Mozilla Foundation that reviews different technologies and products and flags which ones are trustworthy. The guide found that companion chatbots were collecting users’ information, including IP addresses and device types, and selling that data to companies such as Facebook and Google as well as to international companies in Russia and China.

The rise of AI girlfriend chatbots is attributed to the ongoing loneliness epidemic, in which people are becoming more isolated from each other than ever. Companies that sell virtual companionship aim to remedy this epidemic by advertising their AI programs not only as virtual partners but also as mental health improvement tools.

The app EVA AI Chatbot & Soulmate defines itself through statements such as “…simply make your mental-health state more stable!” in its description on Google Play.

This emphasis on mental health can have drawbacks for the average user. As users speak more and more to chatbots, they can lose interest in human interaction.

“As chatbots are always accessible and convenient, users can become overly attached to them and prefer them over interacting with friends and family,” found the results of a study published in the National Library of Medicine.

Feedback on programs like these is mixed. Some users write glowing five-star reviews.

“The conversations are deep,” user Steven McKinney stated in the review section for EVA AI Chatbot & Soulmate. “The AI remembers conversations from hours ago. It’s as close as you can get to a real person without a real person!”

Other users held different feelings toward these apps. User Paul Lewis expressed his distaste for the app iGirl: AI Girlfriend on the Google Play Store, giving it a one-star review.

“It’s definitely one of the worst implementations so far that I’ve tried. Very restrictive,” Lewis wrote. “Between limitations in what you can say and limited number of texts, you really can’t get a feel for how good the AI is.”

Not all hope is lost for the techno-romancer, though. *Privacy Not Included’s AI chatbot guide listed some of its recommended tips for those who wish to use these services without creating a security risk. These include creating a strong password, not telling the AI any sensitive information, restricting its access to one’s photos and camera roll and limiting ad tracking through one’s device, amongst others.

Engaging with AI chatbots can mean giving up personal information and privacy, but that does not mean users cannot protect themselves. Even so, tech professionals warn against using these kinds of programs at all.

“…AI girlfriends are not your friends. Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you,” Misha Rykov, a researcher at *Privacy Not Included, stated.
