How Bad Movie Dubbing Led to the Fake Biden Campaign Robocalls

Jordan Howlett

2024-02-21    

Cybersecurity experts have spent years warning about deepfakes—artificially generated or manipulated media that can pass as authentic. While much of the concern has centered on images and video, it’s become clear over the past year that audio deepfakes, sometimes called voice clones, pose the most immediate threat. Vijay Balasubramaniyan, founder of the fraud-detection company Pindrop, says his firm has already begun to see attacks on banking customers in which fraudsters use synthetic audio to impersonate account holders in customer support calls.
