The attack was ultimately unsuccessful, as the employee who received the voicemail “immediately thought it suspicious” and flagged it to the firm’s legal department. But such attacks will become more common as deepfake tools grow increasingly accessible.

All you need to create a voice clone is access to lots of recordings of your target. The more data you have and the better the quality of the audio, the better the resulting voice clone will be. And for many executives at large firms, such recordings can be easily collected from earnings calls, interviews, and speeches. With enough time and data, the highest-quality audio deepfakes are much more convincing than the example above.

The best known and first reported example of an audio deepfake scam took place in 2019, when the chief executive of a UK energy firm was tricked into sending €220,000 ($240,000) to a Hungarian supplier after receiving a phone call supposedly from the CEO of his company’s parent firm in Germany. The executive was told that the transfer was urgent and the funds had to be sent within the hour. The attackers were never caught.

Earlier this year, the FTC warned about the rise of such scams, but experts say there’s one easy way to beat them.