
AI Creates New Threats for Common Scams

Imposter scams take on new credibility with the help of AI.

Emergency-type scams like the grandparent scam have been around for a while, but with the help of AI it can be almost impossible for a family member to tell a real distress call from a loved one apart from a fake one. Imagine getting a call from your loved one. It sounds like them, and they may even appear to be calling from their own trusted phone number. They sound panicked, telling you they’re in trouble and urgently need money. Understandably, the situation seems dire, and since the call is coming from your loved one’s number and carries their voice, why would you be doubtful?

This scenario, though, may be part of a scam that uses AI (artificial intelligence) and, sometimes, spoofed caller ID to fraudulently replicate a call from your loved one, a 2023 FTC consumer alert warns (Puig, 2023). With snippets of a voice taken from social media posts or voicemail recordings, AI can recreate a person’s voice realistically enough to be convincing over the phone.

For example, an Arizona mom received a call from an unfamiliar number, answered it, and heard her 15-year-old daughter in distress, supposedly held by kidnappers demanding $50,000. The voice sounded just like her daughter’s, yet it was an AI clone. Luckily, the mom’s worried friends called 911 and her husband and learned that her daughter was safe (Campbell, 2023).

The emergency scam is just one of many schemes that now rely on AI. AI can even create realistic video of a person. At the time of this writing, a trending GitHub repository offers software that can perform a real-time webcam face swap from a single image (Edwards, 2024). AI video scams can be so believable that even finance professionals have been fooled: a Hong Kong finance worker paid out $25 million to scammers who used deepfake technology to impersonate his coworkers on a video call (Chen & Magramo, 2024).

While these AI voice scams are sophisticated, relying on advanced technology and deliberately manufactured high-stress situations, there are still ways to protect yourself. For example, agree on a codeword your family will use if they’re really in trouble, ask the caller for information about your loved one that only they would know, have somebody else text or call the loved one, or simply hang up and immediately call your loved one back at their trusted number (Rogers, 2024).

Additionally, as a more general tip, display appropriate caution and understand that, yes, it could happen to you. By believing you’re immune to scams, you may be putting yourself at greater risk than if you acknowledge that everybody, including you, could be vulnerable. Unfortunately, we are not always adept at predicting how we will act in stressful situations. As the FTC notes (Puig, 2023), scammers will try to convince you that the need for money is urgent and should be kept secret, and they will play on your emotions to make you less vigilant. Preparing yourself for the possibility of these calls, however, will help you stay cautious, calm, and collected.

This post was written in collaboration with Hannah Peeples, a research assistant and recent graduate of Pomona College.

References

Campbell, S. (2023, April 10). ‘I’ve got your daughter’: Scottsdale mom warns of close call with AI voice cloning scam. Arizona’s Family. azfamily.com/2023/04/10/ive-got-your-daughter-scottsdale-mom-warns-close-encounter-with-ai-voice-cloning-scam/

Chen, H., & Magramo, K. (2024, February 4). Finance worker pays out $25 million after video call with deepfake ‘chief financial officer’. CNN. cnn.com/2024/02/04/asia/deepfake-cfo-scam-hong-kong-intl-hnk/index.html

Edwards, B. (2024, August 13). Deep-Live-Cam goes viral, allowing anyone to become a digital doppelganger. Ars Technica. arstechnica.com/information-technology/2024/08/new-ai-tool-enables-real-time-face-swapping-on-webcams-raising-fraud-concerns/

Puig, A. (2023, March 20). Scammers use AI to enhance their family emergency schemes. Federal Trade Commission.

Rogers, R. (2024, April 8). How to protect yourself (and your loved ones) from AI scam calls. Wired. wired.com/story/how-to-protect-yourself-ai-scam-calls-detect/

Federal Trade Commission. (2022, September). Scammers use fake emergencies to steal your money.
