AI voice scams are on the rise – here’s how to stay safe, according to security experts

  • AI voice-clone scams are on the rise, according to security experts
  • Voice-enabled AI models can be used to imitate loved ones
  • Experts recommend agreeing on a safe phrase with friends and family

The next spam call you receive might not be a real person – and your ear won’t be able to tell the difference. Scammers are using voice-enabled AI models to automate their fraudulent schemes, tricking individuals by imitating real human callers, including family members.

What are AI voice scams?

Scam calls aren’t new, but AI-powered ones are a dangerous new breed. They use generative AI to imitate not just authorities or celebrities, but friends and family.

The arrival of AI models trained on human voices has unlocked a new realm of risk for phone scams. Tools such as OpenAI’s voice API support real-time conversation between a human and an AI model. With a small amount of code, these models can be programmed to carry out phone scams automatically, coaxing victims into disclosing sensitive information.
