VMware recently published its annual Global Incident Response Threat Report. One alarming fact from the report: the number of cyberattacks has risen sharply since Russia’s invasion of Ukraine. What is particularly striking is the increasing threat of deepfakes.
But what are deepfakes? In short, they are a method of attack in which manipulated images, audio, or even video recordings are used to impersonate another person. Cybercriminals use the possibilities offered by AI to slip into the identity of an authorized person – for example, to bypass security checks and retrieve sensitive information. In the report, a full two-thirds of respondents said they had already encountered deepfakes as part of an attack.
32 million US dollars with a single deepfake attack
A recent example shows the immense damage that deepfakes can cause. Cybercriminals created an AI-generated "hologram" of Patrick Hillmann, spokesman for the world's largest crypto exchange, Binance, and posed as him in Zoom calls. The unsuspecting participants fell for the deepfake – and were relieved of 32 million US dollars.
But how did it get this far? The Brazilian cryptocurrency company BlueBenx specializes in brokering crypto loans and wanted to list its own cryptocurrency, Benx, on Binance – as quickly as possible. In a Zoom call with the fake Hillmann, a transfer of 200,000 US dollars and 25 million Benx was agreed – both of which BlueBenx promptly paid, only to discover a few minutes later that the Benx had been exchanged for the stablecoin USDT. This continued until all deposits from BlueBenx investors, including the USDT reserve pool, were drained. Since the beginning of August, payouts have been at a standstill, with more than 25,000 investors affected.
Dancing heirs to the throne are the lesser evil
Deepfakes don’t always have to be about multi-million-dollar scams. Sometimes the aim is simply to damage a reputation by portraying a person in a compromising situation – for example, as a performer in a porn movie.
A comparatively harmless – but still deceptively real – deepfake of Spanish Princess Leonor went viral in mid-August. In a TikTok video, she appears to dance to "Medina", a song by Filipino rapper Andrew Ford. The teenager’s face had been superimposed onto that of another dancer. The result: a convincingly real-looking dancing heiress to the throne.
What should you learn from the increasing deepfake attacks?
A healthy dose of skepticism is certainly in order – for example, when a colleague or the boss suddenly asks for sensitive information over the phone. There is no need to panic, though. What matters is raising awareness of deepfakes among yourself and your staff, so that in the event of an actual deepfake attack, the alarm bells ring early. Security awareness training is particularly effective here. Test our Security Awareness Training now with our free demo and contact us for more information.