Creating a fake image, audio recording or manipulated video with the help of artificial intelligence (AI) is not difficult. The resulting media files look absolutely genuine – but they are not. These are so-called deepfakes.
A classic example is the replacement of a face – in still or moving images. Even though deepfakes are often used merely for a quick joke on the internet, they pose a risk to your IT security that should not be underestimated.
As attentive readers of our blog are well aware, cybercriminals usually target a company's staff for their attacks. Targeted spear-phishing attacks on individual employees frequently succeed and are extremely lucrative: a single successful attack on a company can net the attackers millions.
Manipulation through artificial intelligence
Given these successes, it is not surprising that attackers invest in modern technologies and AI. Particularly coveted: programs that imitate the voice of a specific person. One example is a freely available tool from Descript, Lyrebird AI. Such tools can be used to imitate the voices of superiors, for example. This technique is called voice cloning.
Manipulating people with the help of deepfakes is therefore easier than one might think. Deepfakes began their rise in 2017. At that time, they mostly took the form of videos and were still easily recognizable as fakes.
In the last five years, AI technology – and with it machine learning – has developed rapidly. Even users with no experience in creating deepfakes can now produce fake audio, images and video – and within a very short time.
Cyber attacks with the help of deepfakes
Deepfakes are ideal for launching social engineering attacks. Social engineering is a scam based on interpersonal manipulation: the attacker tries to gain a person's trust in order to persuade them, for example, to divulge confidential information.
In a corporate context, this usually takes the form of voice phishing. As described above, a computer-generated voice deceptively imitates the voice of a colleague, a boss or another person from the victim's work environment. The aim is to trigger a desired action in the victim or to lend legitimacy to a phishing e-mail – one requesting a password reset, for example.
This example is a double-barrel attack. The term refers to the two barrels of a shotgun: the attack relies on two points of contact – in this case by phone and by e-mail. Such attacks require some effort on the part of the cybercriminals, but they usually succeed.
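One classic tell of the phishing half of such an attack is a link whose visible text shows a trustworthy domain while the underlying href points somewhere else. The following is a minimal illustrative sketch of that single heuristic – the function names, the naive domain matching and the sample mail are all hypothetical, and a real mail filter would check far more than this:

```python
# Illustrative heuristic (hypothetical example): flag links whose visible
# text displays one domain while the underlying href points to another --
# a common trick in phishing e-mails such as fake password-reset requests.
import re
from urllib.parse import urlparse

# Matches <a ... href="...">...</a> pairs in an HTML mail body.
LINK_RE = re.compile(r'<a\s+[^>]*href="([^"]+)"[^>]*>(.*?)</a>', re.I | re.S)


def registered_domain(host: str) -> str:
    # Naive reduction: keep the last two labels
    # ("mail.example.com" -> "example.com"). Real code would use a
    # public-suffix list instead.
    parts = host.lower().rstrip(".").split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host.lower()


def suspicious_links(html_body: str) -> list[str]:
    """Return hrefs whose target domain differs from the domain shown in the link text."""
    flagged = []
    for href, text in LINK_RE.findall(html_body):
        target = registered_domain(urlparse(href).netloc)
        shown = re.search(r"([a-z0-9.-]+\.[a-z]{2,})", text, re.I)
        if shown and registered_domain(shown.group(1)) != target:
            flagged.append(href)
    return flagged


mail = '<p>Reset your password: <a href="https://evil.example.net/reset">portal.company.com</a></p>'
print(suspicious_links(mail))  # -> ['https://evil.example.net/reset']
```

The point of the sketch is not the filter itself but the awareness lesson behind it: the mismatch between displayed and actual link target is exactly what trained employees are taught to check by hovering over a link before clicking.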
Deepfakes: What companies can do now
The perfidious thing about double-barrel attacks is the combination of voice phishing and a phishing e-mail, which increases the chance of a successful cyberattack immensely. Companies of all sizes should therefore take measures as part of a well-thought-out cyber security strategy. bitkom also advises this in its current article “IT security: What companies should do now as a matter of urgency”, as the war in Ukraine is also being waged in the digital space and cyberattacks on German companies are to be expected.
Even though software solutions that detect deepfakes are already being developed, they are nowhere near reliable enough to protect against an attack. It is more important to invest in employee training and to raise awareness of the danger posed by deepfakes and phishing.
The best way to do this is regular security awareness training, such as the training we offer at IT-Seal. Our Awareness Academy combines e-learning with phishing simulations so that the acquired knowledge can be applied directly in practice. The training is always tailored to your needs in order to conserve your employees' resources – as much as necessary, as little as possible.
Would you like to learn how to sensitize and train your employees on autopilot? Try our phishing demo and test your security awareness for free!
After registering, you will receive a total of four phishing e-mails over the coming weeks and, at the end of the demo, an evaluation showing whether – and in which e-mails – you clicked links.