AI Voice Scams
Artificial intelligence is proving to be a new and lucrative tool for scammers. Cybercriminals are now using readily available AI technology to clone voices to trick us. Watchdog invited a volunteer group to the University of East Anglia to show just how easy it is to fall for an AI scam, with the help of Professor of Cyber Security, Oli Buckley.
Our volunteers knew that they were taking part in an experiment, but had no knowledge of what we were about to attempt. We ensured that our group represented a cross section of society by age, gender and profession. All were aware of scams, and the group included both people who had previously fallen victim to a scam and people who never had.
We invited each of them into a recording room one at a time and asked them to talk about themselves for just two minutes, which we recorded. This audio provided ample source material for the AI software, just as a scammer might use a recording of a fake marketing phone call to capture a victim's voice. Oli then used these recordings to reproduce each voice using AI software.
Our group were shocked by the results. Among our volunteers were a mother and daughter, Sam and Amelia. Both were stunned to hear the fake voicemail in Amelia's voice. Sam recognised that the sense of urgency in the message provoked a physical reaction in her, and an instinct to help, even though Amelia was sitting next to her and she knew full well the message was false. One volunteer, Gary, wasn't asked to make a recording, and was surprised to learn that Oli had been able to create a fake replication of his voice using only audio he had already posted online.
We advised our audience to always listen for subtle nuances in a voicemail, and if in doubt to call the person directly to double check. Equally, when uploading content online, adding background noise or music makes it harder for scammers to steal your voice.