Scam calls are nothing new. Criminals were weaponizing the telephone to defraud victims long before the internet was available to nearly everyone on the planet. With new technologies like artificial intelligence, scammers have fresh tools to target a larger pool of victims and make their attacks more convincing — and potentially more lucrative.
More specifically, online criminals can use generative artificial intelligence (AI) to fabricate convincing-sounding calls. Generative AI and so-called AI voice cloning have made scam calls harder to identify as fake. And as AI tools become more accessible and easier to use, the number of AI scams and similar frauds will only grow.
One advantage of generative AI for online criminals is that it lets them create convincing scam calls at an unprecedented scale. Until now, scammers have relied on generic attacks that are easy to spot as fake, such as phishing messages with only the recipient’s name changed. With AI, scammers can generate unique scam calls and messages targeted at a specific individual with far less effort.
With the help of generative AI tools, scammers can take a snippet of somebody’s voice and create a lifelike audio clip saying whatever they wish. The AI-generated voice can be eerily reminiscent of the real person, such as a family member. With a sense of urgency mixed in, the victim of an AI scam call may not think to suspect a scam, even if they are familiar with the common tricks fraudsters use.
According to the US Federal Trade Commission (FTC), imposter scams are among the most common, taking the largest slice of the $8.8 billion victims lost to scammers in 2022. With AI in the mix, that figure is likely to climb much higher. Consumer losses in 2022 were already more than 30% higher than the previous year.
Fabricating someone’s voice requires only a snippet a few seconds long, and there is no shortage of material online. Social media, for example, is full of videos of people talking, making it a treasure trove for scammers to exploit. Online criminals can also use voicemail messages or call you directly to capture a sample of your voice.
Voice cloning scams are only one way criminals use artificial intelligence to deceive their victims. In addition to new forms of online fraud, AI can be used to supercharge more traditional scams, such as phishing and identity theft. A few common types of AI scams to look out for include:
You do not need a large bank account to get scammed on the internet. Everyone is a tempting target for online crime, whether that means hacking, identity theft, malware, or fake AI calls. Follow this advice to avoid being swindled by AI scam calls or deceived in some other way:
As the AI technology used for fraud and phone scams becomes more sophisticated, so do the tools to combat its malicious use. In addition to staying wary of AI scams on the internet, protect your devices against malware with award-winning antivirus software, such as F‑Secure Total.
Online criminals can do much more than clone the voices of your family members, friends and loved ones to deceive you. Malware, hacking, phishing and identity theft are other ways criminals steal money and infect devices. Use F‑Secure Total and its powerful antivirus to stop malware and identify dangerous websites. Total’s VPN, password management tools and identity protection help you stay safe on the internet. Try F‑Secure Total for free.