AI Is Being Used to Impersonate People You Trust : Shared by Lou Mendez


A caution worth sharing...
Lou

AI Is Being Used to Impersonate People You Trust

Identity Theft Guard Solutions, Inc. (IDX)

IDX Knowledge Center

November 13, 2025

Click here for the article

Summary

AI deepfake technology is enabling scammers to convincingly impersonate trusted individuals, resulting in over $200 million in losses in the first quarter of 2025 alone. Fraudsters use AI-generated voices, faces, and spoofed caller IDs to create elaborate deception schemes targeting various groups.

Grandparent scams have become alarmingly common, with fraudsters cloning young people's voices from online video clips and using AI voice-masking technology to impersonate grandchildren who claim to need emergency financial assistance. These callers often spoof caller IDs so that calls appear to come from trusted numbers, and they may hand the phone to accomplices posing as lawyers or law enforcement officials.

Celebrity impersonation through AI deepfakes has proliferated across social media, with fake endorsements from figures like Oprah Winfrey (promoting weight-loss supplements), Gordon Ramsay (offering free cookware), and Kim Kardashian (soliciting wildfire relief donations). In romance scams, AI-generated celebrity personas have defrauded victims of substantial sums: a French woman was conned out of $850,000 by a fake Brad Pitt, and a California woman lost $81,000 plus her family home to a scammer using AI deepfake videos of a soap opera actor.

Law enforcement officials are commonly impersonated due to their authority, with recent cases including AI voice cloning of the Salt Lake City police chief (demanding $100,000) and Virginia law enforcement officials (threatening victims over alleged court fees).

IDX recommends several protective measures: verify celebrity endorsements through trusted sources, confirm unusual requests by contacting individuals at known phone numbers, screen calls from unknown contacts, exercise extreme caution with urgent payment demands, and be wary of requests for payment through mobile apps, wire transfers, gift cards, or money orders. Requests to keep discussions confidential are significant red flags.

Quotes

"In the first quarter of 2025 alone, deepfake-driven fraud resulted in more than $200 million in losses." — IDX

"The U.S. military community nationwide reported nearly 43,000 imposter scams in 2024, costing troops and their families an estimated $178 million." — Military Times

"When faced with a person who unexpectedly asks or directs you to send money or personal information, it's safest to step back and do your homework: Seek out trusted friends, relatives, or news sources to see if it's real or fake." — IDX

Comments

  1. I know it sounds rather funny, almost outlandish, but when discussing finances online (chat or verbal) with the family, we have code words to verify identity.


If you are a member of XUNICEF, you can comment directly on a post. Or, send your comments to us at xunicef.news.views@gmail.com and we will publish them for you.