Comments On Deepfake Security Issues

Cybercriminals are now perfecting deepfakes to impersonate people in order to steal money and anything else of value. The technology has improved to the point where it is difficult to tell the difference between a fraudster and a friend.

https://twitter.com/gabrielwilder/status/1202500978785705984

Expert Comments

December 07, 2019
Robert Capps
VP
NuData Security
While an attacker can use deepfake techniques to convincingly emulate the likeness of an individual, it is still difficult to digitally impersonate a person's voice without fairly obvious imperfections. Deepfake audio or video cannot currently be rendered in real time without the attacker having a large volume of computing resources and a great deal of high-quality audio and video source material to train the machine learning algorithms. While deepfakes can be convincing to other humans, they are unable to pass physical or passive biometric verification, so coupling strong liveness detection with the collection of passive and physical biometric signals to verify a user's identity largely mitigates the current risks in banking transactions.
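The layered defense Capps describes can be sketched as a simple decision rule: require a strong liveness check first, then demand agreement between physical and passive biometric signals. The names, weights, and thresholds below are illustrative assumptions for the sketch, not NuData's actual system or API.

```python
from dataclasses import dataclass

@dataclass
class VerificationSignals:
    # All scores are hypothetical confidences in the range 0..1.
    liveness_score: float      # active liveness check (e.g. challenge-response)
    physical_biometric: float  # e.g. face or voice match confidence
    passive_biometric: float   # e.g. typing cadence, device-handling patterns

def verify_user(signals: VerificationSignals,
                liveness_threshold: float = 0.9,
                combined_threshold: float = 0.8) -> bool:
    """Require strong liveness AND agreement across biometric layers.

    A deepfake may fool the physical-biometric match on its own, but it is
    much harder to simultaneously pass a liveness challenge and mimic the
    passive behavioral signals of the real user.
    """
    if signals.liveness_score < liveness_threshold:
        return False  # fails liveness outright, regardless of biometric match
    combined = 0.5 * signals.physical_biometric + 0.5 * signals.passive_biometric
    return combined >= combined_threshold
```

For example, a convincing deepfake with a near-perfect face match but a failed liveness challenge would be rejected, while a genuine user with strong scores across all three layers would pass.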