
AI voice scams are on the rise – here’s how to stay safe, according to security experts


  • AI voice-clone scams are on the rise, according to security experts
  • Voice-enabled AI models can be used to imitate loved ones
  • Experts recommend agreeing on a safe phrase with friends and family

The next spam call you receive might not be a real person – and your ear won’t be able to tell the difference. Scammers are using voice-enabled AI models to automate their fraudulent schemes, tricking individuals by imitating real human callers, including family members.

What are AI voice scams?

Scam calls aren’t new, but AI-powered ones are a dangerous new breed. They use generative AI to imitate not just authorities and celebrities, but also friends and family.

The arrival of AI models trained on human voices has unlocked a new realm of risk when it comes to phone scams. These tools, such as OpenAI’s voice API, support real-time conversation between a human and the AI model. With a small amount of code, these models can be programmed to execute phone scams automatically, encouraging victims to disclose sensitive information.
