Deepfake is a fairly recent phenomenon. The term describes the result of artificial intelligence tampering with video and audio recordings, creating the illusion of recorded subjects saying or doing things they never actually did.
This can be fun to watch, but it can easily turn into a dangerous weapon when used against politicians or celebrities. It can destroy an individual's reputation and trustworthiness. Moreover, it can be misused to spread dangerous ideas or manipulate society, because the opinions of famous people have a strong influence on the public.
Phishing is one of the most common ways to deceive people and infect computers. Spear phishing (phishing that targets a specific person) has an even higher success rate than the average phishing e-mail. What's more, when an attacker uses, for example, the boss's voice to trick victims, the probability of success rises further.
That is exactly what happened recently. Scammers leveraged a deepfake to deceive employees, and they succeeded: a recording mimicking the voice of a company's chief executive was produced, and the scammers received $243,000 thanks to it. We will probably see such attacks more often, as deepfakes are becoming easier and easier to create.
Our Cyber Resiliency Team will simulate a real phishing attack against your organization. Based on the results and our in-depth analysis of your company's e-mail system (encryption, protocols, filters, etc.), we will help optimize that system, strengthening your overall security posture and helping keep cybercriminals out of your network.
How easy is it to make a deepfake?
Lyrebird, developed by the Montreal Institute for Learning Algorithms, is an example of speech synthesis software. From a one-minute audio sample, it can generate a recording of any content spoken in that person's voice. The result is believable, though not perfect, and can be produced relatively quickly using GPU computing power.
At the same time, deepfake detection software is being developed. It monitors several features of a video, such as blinking, breathing, facial misalignment, and lighting conditions. Specific detection models can be tailored to highly targeted people, such as politicians.
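To make the blinking cue concrete, here is a minimal, hypothetical sketch of one such signal: early deepfakes often blinked unnaturally rarely, so a detector can count blinks and flag clips with an implausibly low rate. The sketch assumes a per-frame eye-aspect-ratio (EAR) series has already been extracted by a facial-landmark model (not shown); all function names and thresholds below are illustrative, not part of any real product.

```python
# Toy blink-rate heuristic, one of several signals a deepfake detector
# might combine. Input: per-frame eye-aspect-ratio (EAR) values, which
# drop sharply while the eyes are closed. Thresholds are illustrative.

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    """Count blinks: runs of consecutive frames with EAR below threshold."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # handle a blink that ends at the last frame
        blinks += 1
    return blinks

def blink_rate_suspicious(ear_series, fps=30, min_blinks_per_min=8):
    """Flag clips whose blink rate is implausibly low for a real person."""
    minutes = len(ear_series) / fps / 60
    if minutes == 0:
        return False
    return count_blinks(ear_series) / minutes < min_blinks_per_min
```

A real detector would fuse this with the other cues mentioned above (breathing, lighting, alignment), since a single heuristic is easy for newer generators to defeat.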
To conclude, deepfake phishing will probably become more and more popular. It may soon be impossible for ordinary people to distinguish an authentic recording from a deepfake, and good detection software could help with this issue. Still, people should be cautious about unusual requests, even when those requests appear to come from their employers or people they know.