A voice message in which the CEO of a group urgently asks a subordinate to transfer €220,000 to a supplier in Hungary. A video in which a disgraced cryptocurrency guru asks frustrated customers to click a link to receive compensation. A hologram that joins a video call posing as an executive. These are some of the ways criminals are already using deepfakes, the technology that recreates voices and even faces using artificial intelligence (AI), to impersonate people in the corporate world.
Deepfake tools analyze hundreds of samples of existing content to “teach” artificial intelligence how to replicate an image or a sound (in this case, faces or voices). In fact, advertising has already used them to posthumously return personalities such as Lola Flores, Cantinflas and Salvador Dalí to the screen. With the rise of programs like Midjourney, which generate increasingly realistic images, experts warn that criminals can also use them to replicate people’s faces without their consent and commit scams, albeit within certain limits.
“Several cases have already been seen in which this technology has been used to commit crimes. On the one hand, we have the fake CEO calls, used to confirm transfers of large sums of money. In other cases, impostors pass remote job interviews and get hired, which allows them to steal confidential information from the company that hired them,” explains Josep Albors, director of Research and Awareness at the cybersecurity company ESET in Spain.
He adds that companies such as Microsoft are already working on dedicated tools to detect deepfakes, and believes that in the future the creators of these programs may themselves be able to identify them.
“One example of what has already happened is the use of deepfake technology to convince people to make fraudulent cryptocurrency investments. Footage of well-known personalities was digitally altered to make it appear they were saying something they never said in an interview. In fact, this is quite common in this type of scam,” says Bitdefender security analyst Silviu Stahie.
Experts consulted for this report agree, however, that deepfake technology still can’t fool biometrics. For example, facial recognition tools used to unlock phones or authorize financial transactions can accurately distinguish whether they are dealing with a real person or a digital image. “Facial biometrics, like those in phones, require a 3D target that can’t be fooled by a 2D representation, however good it is,” Stahie says.
As for voice impersonation, ESET indicates that any “very competent” voice recognition system is capable of recognizing when a voice is being faked. However, Albors points out that, beyond the systems themselves, the vulnerability of the biometric data already collected must also be considered. “Today it is practically impossible to fool a biometric sensor. But biometric data is unique to each person and cannot be changed, so the theft and impersonation of this data represents a real risk that must be taken into account,” the expert advises.
Some victims of deepfakes
Binance. In 2022, the head of communications of this cryptocurrency exchange platform, Patrick Hillman, revealed that criminals had created an “artificial intelligence hologram” of him that had joined several video calls. According to the executive, this fake was used to negotiate various cryptocurrency deals on the platform.
FTX. The infamous cryptocurrency exchange was another victim of deepfakes. After it announced its collapse and bankruptcy in November 2022, a fake video of its CEO, Sam Bankman-Fried, began to circulate on social networks, telling users they could recover their funds by clicking a link to a web page. In reality, it was all part of a scam to steal victims’ information and money.
United Kingdom. The cybersecurity company Avast revealed that in 2019 a British firm fell victim to a deepfake scam. Criminals faked a voice message from the CEO of the parent company, contacting an employee of the subsidiary to request urgent transfers to a supplier. The cybercriminals managed to defraud the firm of €220,000.