“Protect your family,” cybersecurity firm Bitdefender warns as it launches a hard-hitting campaign warning of AI-fueled attacks that “mimic the people you trust.” It’s the reason the FBI tells citizens to “create a secret word” for use within their families.
Bitdefender’s campaign “uses the same tools scammers rely on – voice cloning, deepfakes, generative AI – to show just how easy it is to be manipulated by someone who sounds exactly like your child, partner, or parent.” The people you trust the most.
Bitdefender says these AI-fueled attacks are part of a “trillion dollar global industry, affecting over 100 million Americans last year.” The old rules no longer apply.
The FBI warns criminals now clone voices and create AI-generated videos “to impersonate well-known, public figures or personal relations to elicit payments” or to “create believable depictions of public figures to bolster their fraud schemes.”
The bureau’s advice is equally stark. “Create a secret word or phrase with your family to verify their identity.” Think that through for a moment. Such is the believability of these attacks, with AI clones generated from nothing more than social media scrapes, that you need a safety word to assure yourself that your relative is real.
The FBI also says “look for subtle imperfections in images and videos, such as distorted hands or feet, unrealistic teeth or eyes, indistinct or irregular faces, unrealistic accessories such as glasses or jewelry, inaccurate shadows, watermarks, lag time, voice matching, and unrealistic movements.”
Bitdefender echoes that advice. “Scammers are increasingly adopting highly personalized forms of impersonation, leveraging the very things that make us human—our faces, our voices, and our relationships.”
This is frighteningly effective. “Voice cloning scams now mimic the voices of children or relatives to demand urgent payments. AI-generated videos use stolen social media photos to craft convincing romance or investment pitches.” You need to “agree on safe words within your family,” the team says, “to confirm emergencies.”
The U.S. is the most dangerous place in the world for such scams — whether they come at you by text, email, social media post or phone call. In the last six months, “the U.S. received nearly 37% of global spam, making it the world’s primary target. Within those emails, 45% of the global spam received by Americans was fraudulent or malicious.”
That’s the trillion-dollar global industry at work. It’s breaking down borders in the same way as other leading tech industries. SMS attacks built in China are deployed in New Jersey, California and Arizona. And AI is making everything worse.
“AI has fundamentally reshaped the economics of scamming. What once required teams of fraudsters and weeks of preparation can now be executed in minutes with freely available tools. In short, AI makes scams faster, cheaper, and more convincing.”
Watch the video and read the report. Make sure your family is briefed on what not to do: don’t click, don’t follow links, don’t reply, don’t respond. Do nothing until you’re sure. “Pause and verify,” Bitdefender says. And it’s especially critical that teens and young adults (anyone with a phone) and elderly friends and relatives are forewarned.
The FBI’s advice is clear-cut — keep it front of mind:
- “Create a secret word or phrase with your family to verify their identity.
- Look for subtle imperfections in images and videos, such as distorted hands or feet, unrealistic teeth or eyes, indistinct or irregular faces, unrealistic accessories such as glasses or jewelry, inaccurate shadows, watermarks, lag time, voice matching, and unrealistic movements.
- Listen closely to the tone and word choice to distinguish between a legitimate phone call from a loved one and an AI-generated vocal cloning.
- If possible, limit online content of your image or voice, make social media accounts private, and limit followers to people you know to minimize fraudsters’ capabilities to use generative AI software to create fraudulent identities for social engineering.
- Verify the identity of the person calling you by hanging up the phone, researching the contact of the bank or organization purporting to call you, and call the phone number directly.
- Never share sensitive information with people you have met only online or over the phone.
- Do not send money, gift cards, cryptocurrency, or other assets to people you do not know or have met only online or over the phone.”