Cybercriminals Use AI To Impersonate Chief Exec’s Voice

Scammers leveraged artificial-intelligence software to mimic the voice of a chief executive and successfully requested a $243,000 transfer.

  • In an incident in March, an attacker called the CEO of a UK-based energy business pretending to be the head of its German parent company. Analysts believe AI-based software was used to impersonate the chief executive's voice.
  • The caller issued an "urgent" request, demanding the CEO transfer $243,000 to a Hungarian supplier within the hour.
  • The transfer went through, and the money was later moved to other countries.

Expert Comments
Jake Moore, Cybersecurity Specialist at ESET
September 4, 2019 12:43 pm
“I predict that we will see a huge rise in machine-learned cyber-crimes in the near future. We have already seen DeepFakes imitate celebrities and public figures in video format, but these have taken around 17 hours of footage to create convincingly. Being able to fake voices takes fewer recordings to produce. As computing power increases, we are starting to see these become even easier to create, which paints a scary picture ahead.

“To reduce risks it is imperative not only to make people aware that such imitations are possible now, but also to include verification techniques before any money is transferred. Two-factor authentication is another powerful, inexpensive and simple technique that adds an extra layer of security to protect your money going into a rogue account. When being called about a money transfer, particularly of large sums, check the number calling and ask to call back. Do so using a number in your address book, rather than hitting the “call back” option in your call history.”

Javvad Malik, Security Awareness Advocate
September 4, 2019 11:50 am

Experts have been predicting that cybercriminals will leverage AI to assist in scams. The use of technology to impersonate a chief executive has some scary implications, especially since it is not inconceivable that, coupled with video, the same attack could be carried out over a video call.

While technologies can help to identify fraudulent emails, voices, or videos, ultimately this boils down to a process issue. One employee should not have the ability to independently create a new payee and transfer large sums of money.

It is in these scenarios that robust processes and segregation of duties help immensely: more than one employee is required to create a new payment, and its legitimacy is validated through pre-approved channels.
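The dual-approval control described above can be sketched in a few lines. This is a minimal illustration, not a reference to any real payment system: the class, field names, and two-approver threshold are all assumptions chosen for clarity.

```python
from dataclasses import dataclass, field

@dataclass
class PaymentRequest:
    """A pending transfer that cannot execute until enough distinct employees approve it.

    Hypothetical sketch of segregation of duties: no single employee can
    both create and release a payment on their own.
    """
    payee: str
    amount: float
    approvals: set = field(default_factory=set)

    REQUIRED_APPROVALS = 2  # illustrative policy: two distinct approvers

    def approve(self, employee_id: str) -> None:
        # A set ignores duplicates, so the same employee approving twice
        # still counts as only one approval.
        self.approvals.add(employee_id)

    def can_execute(self) -> bool:
        return len(self.approvals) >= self.REQUIRED_APPROVALS

req = PaymentRequest(payee="New supplier", amount=243_000)
req.approve("alice")
req.approve("alice")          # duplicate approval changes nothing
print(req.can_execute())      # False: only one distinct approver so far
req.approve("bob")
print(req.can_execute())      # True: two distinct employees signed off
```

A real implementation would also verify the payee through a pre-approved channel before the first approval is even possible; the point here is simply that execution is gated on more than one person.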
