Tech/Science

Deepfake Voice Attacks Pose Challenge for Biometric Software Companies and Researchers

Deepfake voice attacks are posing a real-world challenge for biometric software companies and researchers striving to detect these deceptive impersonations. Recent incidents, such as robocalls impersonating President Joe Biden, have raised concerns about the efficacy of current detection methods.

Against this backdrop, software maker ID R&D, a unit of Mitek, has released a video demonstrating its voice biometric liveness software's ability to distinguish real recordings from digital impersonations. The demonstration follows an earlier high-profile voice cloning scandal involving pop star Taylor Swift.

However, the recent election interference attempt involving deepfake audio of Biden presents a distinct challenge. Although vendors such as ElevenLabs and Clarity have weighed in on the incident, uncertainty remains about whether deepfake voices can be detected reliably.

ElevenLabs, which focuses on voice generation, recently achieved unicorn status after raising an $80 million Series B funding round. Clarity, for its part, assessed the robocall as 80 percent likely to be a deepfake. The lack of consensus among industry players underscores the complexity of the issue.

Amid the uncertainty, a team of students and alumni from the University of California, Berkeley claims to have developed a detection method that operates with minimal error. Their approach feeds raw audio into deep-learning models that extract multi-dimensional representations, known as embeddings, which are then used to discern real speech from fake.
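The article does not detail the Berkeley team's architecture, but the general embedding-based pipeline can be sketched. The following PyTorch snippet is a minimal, hypothetical illustration, not the team's actual model: a small 1-D convolutional encoder maps raw audio to an embedding, and a linear head scores the clip as real or fake. All class names, layer sizes, and parameters here are invented for illustration.

import torch
import torch.nn as nn

class AudioEmbeddingDetector(nn.Module):
    """Toy sketch of embedding-based deepfake audio detection
    (hypothetical; not the Berkeley team's model)."""
    def __init__(self, embedding_dim: int = 128):
        super().__init__()
        # Stacked 1-D convolutions downsample the raw waveform
        # into a sequence of feature frames.
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=10, stride=5), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv1d(64, embedding_dim, kernel_size=4, stride=2), nn.ReLU(),
        )
        # Binary head over the pooled embedding:
        # logit > 0 means "fake", logit < 0 means "real".
        self.classifier = nn.Linear(embedding_dim, 1)

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        # waveform: (batch, samples) of raw PCM audio.
        frames = self.encoder(waveform.unsqueeze(1))   # (batch, dim, frames)
        embedding = frames.mean(dim=-1)                # pool to (batch, dim)
        return self.classifier(embedding).squeeze(-1)  # (batch,) logits

# Usage: score one second of 16 kHz audio (random noise as a stand-in).
model = AudioEmbeddingDetector()
clip = torch.randn(1, 16000)
prob_fake = torch.sigmoid(model(clip))
print(f"P(fake) = {prob_fake.item():.2f}")

A real system would train the encoder and head on labeled corpora of genuine and synthesized speech; the key idea the sketch captures is that classification happens in the learned embedding space rather than on hand-crafted audio features.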

While this research offers promise, the method has so far been tested only in a lab setting and will require further validation in real-world scenarios. As the debate around deepfake voice attacks continues, the industry is grappling with the urgent need for more effective detection methods to counter the growing threat of deceptive audio impersonations.
