AI Scammers Use Deepfake to Steal 23.8 Million Euros


Artificial intelligence is the technological sensation of the moment. Thanks to tools like ChatGPT, society's attention has been firmly fixed on what these systems are capable of. Even so, many continue to call for regulations that guarantee the fair use of this technology, since it has been shown that it could make a difference even in areas as dangerous as biological weapons. Or, as happened to a recent victim, it can trick a person into believing they are talking to an entire team of managers.

As the portal TechSpot reports in a recent publication, this was the case for an employee at the Hong Kong branch of an international company. Although no details about the company or the employee have been revealed, it has been confirmed how the scam was carried out. Using deepfakes, a technology for recreating faces, the scammers were able to impersonate company directors. This, combined with audio and video recordings available on the Internet, allowed them to pose as the executives and trick the employee into making 15 transfers totaling 23.8 million euros.

They tried to scam three company workers

Apparently, only one real person was present on the call: the victim. As the report explains, the other participants were fake, digital recreations made with artificial intelligence. After introducing himself at the beginning of the call, the victim made no further interventions and simply obeyed the orders of his supposed superiors. These superiors, who were in fact the scammers, instructed him to make 15 transfers to different accounts, amounting to 23.8 million euros in total.

Unfortunately for the victim, he was not aware of the scam until a week later. Upon contacting the company, he realized that he had been defrauded. He was the only one of the three targeted workers who fell for the ruse; the other two ignored the emails they received, suspecting they were attempted fraud. Cases like this show that AI, if not regulated, can become very dangerous, since it makes it possible to impersonate anyone with only an audio recording and a couple of photographs of their face.