At First Financial, your safety has always been our utmost priority. We believe in not just safeguarding your financial assets, but also in ensuring you are informed and protected against the evolving threats in our digital age. One such alarming development that we wish to bring to your attention is the rise in AI (artificial intelligence) voice cloning scams.
Understanding AI Voice Cloning Scams
In a typical scenario, scammers use artificial intelligence and voice cloning technology to create a near-perfect imitation of a person’s voice. Armed with this technology, fraudsters can impersonate loved ones demanding emergency funds, or pose as legitimate businesses and government agencies to deceive people into sharing their personal and financial information.
How It Works
To clone a voice, scammers need only a short audio clip of the intended target. Surprisingly, these clips can easily be sourced from public platforms like social media. The Federal Trade Commission has taken note of this evolution in scam tactics and issued alerts about the alarming trend.
To put things in perspective, 2022 saw a significant surge in phishing attacks attributed to the wider availability of AI. This new breed of scam is an offshoot of the well-known “grandparent scam,” which organizations like AARP have warned their members about.
Spotting the Scam
Although the technology behind these scams is advanced, there are some common indicators:
- Requests for urgent payment – especially via wire transfers, cryptocurrency, social media, or gift cards.
- AI-generated texts that repeat words, use short robotic sentences, lack idioms or contractions, or make implausible statements.
- Suspicious communication from “genuine” organizations or from someone you know. Before taking action, always verify the source independently by calling the organization’s publicly listed phone number or the number saved in your contacts.
A Real Life Experience
Jennifer DeStefano shared her harrowing experience with News 12 New Jersey to raise public awareness. She received a call claiming her daughter had been kidnapped and demanding a $1 million ransom. The chilling part? The voice of her sobbing daughter in the background sounded undeniably real. Relief came only when she discovered her daughter was safe with her husband. The traumatic voice she’d heard was an AI-generated clone.
How to Protect Yourself
- Skepticism is Key: Be wary of any unusual payment requests via gift cards, wire transfers, or cryptocurrency. If a request seems off, it probably is.
- Analyze the Language: Pay attention to any odd phrasing, repetitive words, or short, robotic sentences in text communication.
- Keep Plans Private: Be mindful of the personal details and plans you share on social media, and of whether your profile is public or any public videos online include your voice or a loved one’s.
- Don’t Trust Caller ID Blindly: Scammers have the means to mask their numbers or make them look like a legitimate organization’s phone number.
- Report Suspicious Activity: Always report any potential scam to local authorities and the FTC.
At First Financial, we pledge to continue our relentless pursuit of safeguarding your well-being in all facets of your financial journey. Let’s work together to protect our community, financial assets, and personal information from falling into the wrong hands.
If you ever have any concerns or questions about any of your First Financial accounts, please call member services at 732.312.1500 or visit one of our branches. Check out our First Scoop blog to stay up to date on the latest scams.