Mon 07 August 2023:
The world has been dominated by generative AI in recent years, and cybercriminals have discovered ways to use the technology to clone voices and deceive individuals and organisations into handing over passwords or large sums of money, reports Business Tech.
AI has become a potent instrument in the business toolkit: an intelligent, highly capable technology that offers rapid analysis and assistance.
Stephen Osler, co-founder and Business Development Director at cybersecurity firm Nclose, said cybercriminals have taken advantage of AI’s deceptive powers to build convincing deepfakes and carry out unnervingly realistic voice scams.
In 2019, the technique was used to extort $243,000 (R4.3 million) from a UK energy firm by impersonating the voice of its CEO, and in 2021 a Hong Kong business was robbed of $35 million (R631 million).
Voice-cloning scams take many forms, including kidnapping hoaxes, fake requests for money from friends or family, and bogus emergency calls, and they are proving difficult to detect.
The use of artificial intelligence to clone voices has opened up a whole new world of risk for both businesses and individuals, according to Osler.
According to Osler, WhatsApp voice notes could become a significant vulnerability for people, particularly high-level executives.
“Using readily available tools online, scammers can create realistic conversations that mimic the voice of a specific individual using just a few seconds of recorded audio.
“While they have already targeted individuals making purchases on platforms like Gumtree or Bob Shop, as well as engaged in fake kidnapping scams, they are now expanding their operations to target high-level executives with C-Suite scams,” he said.
“All they do is create a WhatsApp voice note with these automated, voice generative tools and send it to the finance manager and request them to approve the transaction,” Osler told News24.
According to the cybersecurity expert, targeting companies where such transactions are standard business practice is much easier.
“An IT administrator might receive a voice note from their manager requesting a password reset for their access to O365,” noted Osler.
“Unaware of the malicious intent, the administrator complies, thinking it’s a legitimate instruction. However, in reality, they unintentionally provide privileged credentials to a threat actor. This information can then be exploited to gain unauthorised access to critical business infrastructure and potentially deploy ransomware.”
What you can do to avoid being a victim
To guard against this, Osler advises organisations to implement robust systems and procedures that require multiple levels of verification, particularly for financial or authentication-based activities.
Companies should also establish a clearly defined formal process for all transactions.
“Relying solely on a voice note from the CIO or CISO should not be sufficient to change a password, authenticate a monetary transaction, or grant hackers access to the business,” he said.
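As a purely illustrative sketch of that principle (the names, channels, and threshold below are hypothetical, not from Nclose or any real payment system), a simple rule in Python could refuse any transaction that has not been confirmed through at least two channels independent of the original message:

from dataclasses import dataclass, field

# Channels that count as independent verification; a voice note never qualifies.
INDEPENDENT_CHANNELS = {"in_person", "phone_callback", "signed_email", "approval_portal"}

@dataclass
class TransactionRequest:
    requester: str
    amount: float
    confirmations: set = field(default_factory=set)  # channels that confirmed it

    def confirm(self, channel: str) -> None:
        self.confirmations.add(channel)

    def is_approved(self, required: int = 2) -> bool:
        # Count only channels independent of the original message, so a
        # WhatsApp voice note alone can never authorise a payment.
        verified = self.confirmations & INDEPENDENT_CHANNELS
        return len(verified) >= required

request = TransactionRequest(requester="finance.manager", amount=250_000.0)
request.confirm("voice_note")        # the suspicious voice note itself
print(request.is_approved())         # False: not an independent channel

request.confirm("phone_callback")    # call the requester back on a known number
request.confirm("approval_portal")   # second sign-off in the payments system
print(request.is_approved())         # True: two independent confirmations

The point of such a design is that the voice note itself never counts as verification; approval only accumulates through channels a scammer cannot easily spoof, such as a callback to a known number.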
“It is crucial to educate employees and end-users about the evolving risks associated with these threats. If they are aware of this type of scam, they are more likely to take a moment to verify the information before making a costly mistake.”
Moreover, Osler said people must always ensure that any voice note or instruction they receive is from a trusted source, adding that it’s important to double-check and confirm that the communication is indeed from the intended person.
“Cultivate an inquisitive mindset and question the source, whether it is a call, email, or message. By doing so, both organisations and individuals can be better prepared to identify and protect themselves against potential voice cloning scams,” he said.
SOURCE: INDEPENDENT PRESS AND NEWS AGENCIES