Scammers Use AI To Fake CEO’s Voice, Transfer $243,000

Image: Voice AI used for cybercrime (Depositphotos)

AI has advanced rapidly in recent years and continues to do so. But while it helps us in numerous ways, it also enables malicious activity, as a new cybercrime case suggests.

AI used in money fraud

According to a report by the Wall Street Journal, fraudsters used AI-based voice-generating software to mimic the voice of the CEO of an unnamed German company.

The incident took place in March this year: the scammers tricked the CEO of a UK-based energy firm (a subsidiary of the German company) into believing he was speaking with the parent company’s CEO. This led to a fraudulent money transfer of around $243,000.

The fraudsters asked the CEO to transfer the money to a Hungary-based supplier within an hour, promising a prompt refund. Because the voice matched the German CEO’s tone and accent, he complied without question.

However, the money was never refunded, and a second transfer was demanded, which the energy firm’s CEO finally turned down.

What happened next?

The report further states that the money was first transferred to Hungary, after which it moved to Mexico and other locations. Authorities are investigating the matter, but there is no word yet on who was behind the scheme.

As for the money, the energy firm was insured by Euler Hermes Group, which compensated it for the amount lost in the fraud.

A new cybercrime we need to know?

The cybercrime described above is a social engineering technique called ‘vishing’, or voice phishing. For those unfamiliar with the term, vishing is a malicious activity in which a fraudster impersonates someone and scams a victim over the telephone.

AI-based fraud appears to be a new addition to the already long list of cybercrimes we should all know about. As such attacks are likely to grow more serious in the near future, safety measures such as tougher verification processes — for instance, confirming unusual payment requests through a second, independent channel — should be put in place to contain this kind of malicious activity.

Also Read: This Chinese Face-Swapping App Looks Like A Privacy Nightmare
