How hackers cloned a company director’s voice and robbed $35 million USD from his bank account

An unprecedented cybercriminal incident was detected in the United Arab Emirates (UAE), where a bank manager was deceived by hackers who used a complex technique to bypass security controls and steal a multimillion-dollar sum. According to the report, the threat actors employed an artificial intelligence tool to clone the voice of a business owner whose accounts were held at the targeted bank, allowing them to trick the manager into authorizing bank transfers totaling $35 million USD.

Although the procedure performed by the manager seemed legitimate, the scheme was made possible by so-called ‘deep voice’ technology used to clone the voice of the company’s director, UAE authorities concluded. Local police have sought the collaboration of private agents and investigators in the U.S. in order to track down a portion of the stolen assets. UAE authorities have already identified at least 17 individuals potentially involved in the scam.

This is the second recorded case of voice cloning tools being used to commit bank fraud. The first similar incident was identified in the United Kingdom in 2019, when a hacking group cloned the voice of a company director to steal more than $200,000 USD. This new incident is a sign of how quickly technologies based on artificial intelligence are advancing.

Jake Moore, security specialist at ESET, believes that these could be just the first steps towards the widespread use of these tools for cybercriminal purposes: “We are at a peak moment for threat actors to change their expertise and resources to use the latest technology to manipulate people who are unaware of the scope of artificial intelligence.”

Just a few years ago, the use of artificial intelligence to commit crimes seemed like science fiction. However, one need only delve a little into the subject to discover that voice synthesis tools such as Aflorithmic, Respeecher and Resemble.AI work remarkably well and can be abused for cybercriminal purposes.

Remember that hackers can obtain voice recordings from all sorts of sources, including fraudulent phone calls, WhatsApp voice memos, YouTube videos, and social media posts, so you may prefer to keep such recordings off social media and other publicly accessible platforms, especially if you use voice authentication for your bank accounts.
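To see why cloned audio is dangerous for voice authentication, consider how such systems commonly work: a neural network converts a speech sample into a numeric embedding, and verification accepts the caller if the embedding is close enough (for example, by cosine similarity) to the one enrolled for the account holder. The sketch below is a simplified illustration of that comparison step; the function names, the tiny 4-dimensional embeddings, and the threshold are all hypothetical, not taken from any real banking system.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify_speaker(enrolled, sample, threshold=0.85):
    """Accept the caller if the sample embedding is close to the enrolled one."""
    return cosine_similarity(enrolled, sample) >= threshold

# Hypothetical embeddings for illustration only; real systems use vectors
# with hundreds of dimensions produced by a trained speaker-encoder model.
enrolled = [0.90, 0.10, 0.30, 0.20]   # stored at enrollment
genuine  = [0.88, 0.12, 0.31, 0.19]   # the real account holder calling in
cloned   = [0.85, 0.15, 0.28, 0.22]   # a high-quality synthetic clone

print(verify_speaker(enrolled, genuine))  # the legitimate caller passes
print(verify_speaker(enrolled, cloned))   # a good clone can also pass
```

The point of the sketch is that the system only measures acoustic similarity: if a cloned voice produces an embedding inside the acceptance threshold, the check passes just as it would for the real speaker, which is exactly the weakness the UAE attackers exploited.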

To learn more about information security risks, malware variants, vulnerabilities and information technologies, feel free to access the International Institute of Cyber Security (IICS) websites.