Area police warn of AI kidnapping scam

CELINA — Area officers are warning residents about scammers who use artificial intelligence (AI) to clone the voices of their loved ones.


The Celina Police Department said in a social media post that these scams sound very real and are designed to create panic.

The scams often target older adults by using audio clips stolen from the internet to mimic family members in distress.

Officers warn that the technology has advanced to the point that these fake calls sound convincingly real.

Scammers gather short audio clips from social media, voicemail greetings, and videos to create a convincing voice replica.

Using AI, criminals can simulate the voice of a child, grandchild, or other family member and claim that person is in trouble and needs money immediately, the post stated.

Common scenarios used in these calls include claims of being in a car accident or needing bail money after an arrest. Callers may also beg the victim not to tell other family members.

The scammers often pressure victims to stay on the phone and act quickly without verifying the story.

Officers recommend that people pause and verify any such emergency.

Anyone who receives a suspicious call should hang up and call the loved one directly using a phone number they already have on file, the post said.

People can report these scams to the Federal Trade Commission at ReportFraud.ftc.gov.