Deepfake video and audio technologies could become a major threat to businesses over the next two years, leading to substantial financial losses, according to a report by CyberCube entitled "Social Engineering: Blurring reality and fake."
The cyber insurance analytics firm said that cyber-criminals have become increasingly adept at creating realistic audio and video fakes using AI and machine learning technology in recent years. Advancements in this field have accelerated further as a result of the shift to remote working during the COVID-19 pandemic, as organizations have become more reliant on video- and audio-based methods of communication.
The study observed that the growing number of video and audio samples of business people available online provides further opportunities to simulate individuals in order to influence and manipulate others. This includes building photo-realistic representations of influential people, and the use of mouth mapping technology, which enables the movement of the human mouth during speech to be mimicked with high accuracy.
These methods can put organizations at risk of severe financial losses. For instance, the report highlighted a case where cyber-criminals used AI-based software to impersonate a chief executive’s voice to demand the fraudulent transfer of $243,000.
The analysis also highlighted how traditional social engineering techniques have been ramped up since the start of COVID-19. This includes gathering information available online or from stolen physical records to create a fake identity for a particular target, a practice known as social profiling. Methods such as these have become easier for cyber-criminals because of the greater use of online platforms, in addition to the blurring of domestic and business IT systems during the pandemic.
The report’s author Darren Thomson, head of cybersecurity strategy at CyberCube, commented: “As the availability of personal information increases online, criminals are investing in technology to exploit this trend. New and emerging social engineering techniques like deepfake video and audio will fundamentally change the cyber-threat landscape and are becoming both technically feasible and economically viable for criminal organizations of all sizes.
“Imagine a scenario in which a video of Elon Musk giving insider trading tips goes viral – only it’s not the real Elon Musk. Or a politician announces a new policy in a video clip, but once again, it’s not real. We’ve already seen these deepfake videos used in political campaigns; it’s only a matter of time before criminals apply the same technique to businesses and wealthy private individuals. It could be as simple as a faked voicemail from a senior manager instructing staff to make a fraudulent payment or move funds to an account set up by a hacker.”