Virtual soulmates labeled 'worst possible use' of AI

What may initially seem like harmless interaction with new technology could have long-range damaging effects. One of the emerging branches of artificial intelligence is the virtual "girlfriend."

A quick Google search shows more than 2 million people have sought information on how to use a chatbot to share the intimate details of their lives. Searches for "AI boyfriend" exceed 3.5 million.

A leading tech industry observer says Match, eharmony and similar "dating" portals obviously aren't for everyone, and the potential of this substitute for human engagement is frightening.

"Not everyone has a great deal of success when it comes to the online dating apps, and some are kind of forced into that environment in one way or another. A lot of people fall through the cracks. What we've seen the last couple of weeks is there's actually an uptick in this trend of AI girlfriends," Jake Denton, research associate for The Heritage Foundation's Tech Policy Center, said on American Family Radio Monday.

Some pornography outlets have taken the extra step of using chatbots to create a more personal experience for their users, Denton told show host Jenna Ellis.

The damaging effects can be more than emotional. If unscrupulous companies or individuals end up with a user's sensitive personal information, they can wreak untold havoc.


"The same folks who were making OnlyFans pages – kind of the subscription pornography platforms – are actually using all of their media, their audio files, their photographs, videos, to replicate themselves in an AI fashion. Then they're selling subscriptions to an application that allows for their fans to date them virtually," he said.

Subscriptions for an AI "significant other" can be purchased by the minute, the day, the week or month. The replica of a male or female voice will send users audio and video messages and engage them in phone calls.

Users are responding. Whatever their personal circumstances, many men and women are entrusting these false, projected personas not only with run-of-the-mill daily events but also with information many would consider private.

AI soulmates? You ain't seen nothing yet

"In a technology that will only improve, it's incredibly realistic. There are gaps in the technology still in this very rudimentary phase of AI deployment, but we're already reaching kind of horrific side effects like this. People are starting to have these very personal conversations with AI bots," Denton said. "It's horrific. I think this is probably the worst possible use case so far that we've seen for this technology."

Search engine numbers don't necessarily reflect the number of people who take the plunge and actually pay for an AI boyfriend or girlfriend service.

"The amount of people typing in 'AI girlfriend' has skyrocketed over the last couple of weeks. So, there's obviously intrigue around it. We'll probably have to wait a little bit to see the download metrics and hear about how many people are actually logging in on a daily basis; but at the very least, people are curious and they're exploring this," Denton said.

The numbers can also be thrown off because many people would not admit to using this kind of service, Denton added.

During the COVID-19 pandemic, Scientific American identified loneliness as a public health problem.

"It is also a public health problem that has been linked to increased risk of mental health issues, heart disease and even death," the journal stated in March 2021.

That was more than two years ago, and even then people were looking for substitutes for human companionship.

"With rates of loneliness on the rise in the U.S. and around the world, people are addressing this crisis using everything from companion robots to social networking sites and apps," noted the article on Scientific American.

Where do those conversations go?

Many chatbot users don't care that they're not speaking to real people, according to Denton.

"They have no one to kind of share the details of their life with," he elaborated. "And so, when an AI chatbot will come along and just text you [inquiring] 'How's your day going' or 'What are you up to?' a lot of people are going to take that and just run with it. It's just that you have someone to express to your concerns of the day or how you're feeling."

If the interaction ended there, it would just be sad; but Denton warned it can be extremely harmful.

As he explained, these conversations – whether through voice or keystrokes – don't just disappear; they are recorded and stored, and the companies that make this technology available are profit-driven. For many providers, the AI soulmate experience is designed to keep the user engaged in order to extract as much information as possible.

"If you're confiding in this chatbot about the most intimate details of your life, and they're phishing to get your passwords, that's amplified through these types of conversations. There's a great deal of concern both on a social side of what it says, particularly about where our younger generation is, but then you get on the data privacy side," Denton said.

As noted in the Scientific American article, chatbots can be trained to ask open-ended questions and follow up on "clues" in the conversation – the overall objective being to appear empathetic and "build trust" between the user and the chatbot.