Memo to Management: Watch Out for AI Voice Cloning Scams

Voice-based deepfakes may be the next big fraud alert for the C-Suite.

Voice cloning scams have been around for a decade or two, but with clunky technology and a high risk of getting caught, fraudsters largely kept their distance from audio-based crime.

That may not be the case anymore.

A new study of over 7,000 people from McAfee shows that about 25% “had previously experienced some kind of AI voice scam, with 1 in 10 targeted personally and 15% saying it happened to someone they know,” the company noted. “77% of victims said they had lost money as a result.”

Artificial intelligence is driving a rising wave of voice cloning scams, and that wave shows no sign of slowing. A case in point: McAfee researchers found that fraud artists can clone a voice from just three seconds of audio.

“Advanced artificial intelligence tools are changing the game for cybercriminals. Now, with very little effort, they can clone a person’s voice and deceive a close contact into sending money,” said Steve Grobman, McAfee’s chief technology officer.

Artificial intelligence brings incredible opportunities, but as with any technology, there is always the potential for it to be used maliciously in the wrong hands, Grobman says. “This is what we’re seeing today with the access and ease of use of AI tools helping cybercriminals to scale their efforts in increasingly convincing ways,” he notes.

A Rise in AI-Based Voice Cloning Crime

Expect voice cloning scams to surge in 2024, says Ping Yang, professor of computer science and director of the Center for Information Assurance & Cybersecurity at Binghamton University, State University of New York.

“As deepfake technology continues to advance, I anticipate a surge in voice cloning scams in 2024,” Yang notes. “In the voice cloning scam, scammers leverage voice cloning, a type of deepfake technology, to impersonate individuals such as CEOs, government officials, or family members, to persuade victims to initiate money transfers.”

Whether you’re a senior executive tasked with keeping cyber fraud out of the workplace or simply looking to keep your family safe at home, getting to know the risks associated with voice cloning fraud is well worth the effort – before it strikes you first.
