Deepfake 'Face Swap' Attacks Trending
Deepfake attacks using “face swap” technology to bypass remote identity verification increased by 704% in 2023, according to a recent report published by identity verification company iProov.
Remote identity verification technology is necessary to ensure that organisations interact with the right people online, but in this era of Generative AI, not all technologies offer the same levels of assurance.
Now, free and low-cost face swap tools, virtual cameras and mobile emulators are fuelling a growing number of deepfake-focused threat actors, iProov found in its 2024 report, The Impact of Generative AI on Remote Identity Verification.
“Generative AI has provided a huge boost to threat actors’ productivity levels: these tools are relatively low cost, easily accessed, and can be used to create highly convincing synthesised media such as face swaps or other forms of deepfakes that can easily fool the human eye as well as less advanced biometric solutions,” iProov Chief Scientific Officer Andrew Newell said in a public statement.
In addition to identifying face swaps as the “deepfake of choice amongst persistent threat actors,” iProov’s Security Operations Centre (iSOC) found that injection attacks targeting mobile identity verification platforms increased by 255%, while the use of emulators in these attacks rose by 353% between the first and second halves of 2023.
Furthermore, the number of threat groups exchanging information online about attacks on biometric and video identification systems nearly doubled between 2022 and 2023, with 47% of these groups surfacing within the last year.
Deepfake videos are most commonly combined with digital injection attacks, which use a virtual camera feed to replace the webcam or other device camera feed that would normally display one’s face for verification. For example, OBS Studio, a legitimate open-source streaming tool, includes a virtual camera feature that could potentially be used to display deepfake video.
Digital injection attacks are more technically advanced than presentation attacks, in which a mask or a video on a screen is held up to the camera. While many facial biometric systems are equipped with presentation attack detection (PAD), injection attacks are more difficult to detect and doubled in frequency in 2023, according to a Gartner press release.
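One naive first line of defence against the injection technique described above is to check the name of the capture device the user's browser or app reports, since virtual cameras typically advertise themselves under product names like “OBS Virtual Camera”. The sketch below illustrates the idea in Python; the blocklist entries and the string-matching heuristic are illustrative assumptions, not an iProov or Gartner method, and real verification platforms enumerate devices via OS APIs (e.g. DirectShow or V4L2) and combine this with far stronger liveness signals, because device names are trivially spoofable.

```python
# Illustrative sketch: flag camera device names that match known
# virtual-camera products. The names below are examples of real
# products, but the list and matching rule are assumptions for
# demonstration only; a determined attacker can rename a device.

KNOWN_VIRTUAL_CAMERAS = {
    "obs virtual camera",
    "manycam virtual webcam",
    "snap camera",
}

def looks_like_virtual_camera(device_name: str) -> bool:
    """Return True if the reported device name contains a known
    virtual-camera product name (case-insensitive substring match)."""
    name = device_name.strip().lower()
    return any(vcam in name for vcam in KNOWN_VIRTUAL_CAMERAS)
```

Because this check is so easily defeated, it is only useful as one weak signal among many; this is why the report stresses liveness detection and attack monitoring rather than device heuristics alone.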
By 2026, attacks using AI-generated deepfakes on face biometrics will lead 30% of enterprises to no longer consider such identity verification and authentication solutions reliable in isolation, according to Gartner.
Deepfake threat actor groups frequently target manual or hybrid identity verification systems where a human operator has the last say, according to iProov. These groups consider humans to be easier to fool using deepfake injection attacks compared with computerised facial recognition systems.
In 2023, the FBI also warned about a rising number of scammers using deepfake technology to impersonate job candidates during interviews for remote positions.
To help organisations protect themselves against AI-generated deepfakes beyond face biometrics, CISOs and risk managers should select service providers that can demonstrate capabilities and a plan going beyond current standards, and that are monitoring, classifying and quantifying these new types of attack.
SC Media | iproov | Gartner | PetaPixel | The Next Web | Twitter | Fintech
Image: Dasha Yukhymyuk