Think Twice Before Sharing Private Thoughts With an AI Chatbot

One artificial intelligence expert urges caution when coming clean with AI.

One New Year’s resolution that should be a priority is limiting what you say to an artificial intelligence chatbot like ChatGPT or Bard.

That’s a timely resolution, as people seem to be warming to the idea of interacting directly with AI bots.

According to a recent study by the AI chat development firm Tidio, 88% of its customers will have at least one chat with an AI bot in 2023. Additionally, 62% of people surveyed said they would rather engage with a chatbot than a human for tasks like customer service; the remaining 38% said the opposite.

“If the alternative was to wait 15 minutes for an answer, 62% of consumers would rather talk to a chatbot than a human agent,” the study noted. “(Yet) there are some customers who always prefer human interaction. It is a good idea to provide this option as well. However, most online shoppers would prefer not to have this option and use a self-service chatbot instead of waiting.”

Just Don’t Get Personal

As the Tidio study indicates, using an AI chatbot to get a retail discount or solve a customer service problem turns out to be a good idea. After all, almost two-thirds of people feel that way.

What humans probably should not do is let a chatbot in on their deepest secrets, like personal political convictions or hot takes on sensitive social issues.

That’s the takeaway from Oxford University professor Mike Wooldridge, who delivered the Royal Institution’s annual Christmas lecture last week in London.

In it, Wooldridge noted the inherent problems of confiding in AI bots, warning that doing so works against people’s own interests.

“(AI) has no empathy. It has no sympathy,” he said, as reported by the UK Daily Mail. “That’s absolutely not what the technology is doing and crucially, it’s never experienced anything. The technology is basically designed to try to tell you what you want to hear – that’s literally all it’s doing.”

Expecting more from a chatbot is a fool’s errand, and sharing too much with one can make matters worse.

“You should assume that anything you type into ChatGPT is just going to be fed directly into future versions of ChatGPT,” Wooldridge said. “It’s extremely unwise to start having personal conversations or complaining about your relationship with your boss, or expressing your political opinions.”

If you do share personal opinions with an AI chatbot, good luck getting the data back.

Due to AI model infrastructures, it’s “nearly impossible” to get your data back “once it’s in the system,” Wooldridge added.

Keep that downbeat scenario in mind if you’re thinking of sharing your opinion on the Israel-Hamas war or going public with your 2024 US presidential pick.

If you do that on “blabbermouth” AI chat platforms, it may go public – and a lot sooner than you think.
