Cloning the person’s entire speaking style

“What’s really important here is that it’s not only the individual words that sound authentic, but cloning the person’s entire speaking style,” Hamerstone said. “It not only sounds like the person on a word-for-word basis, it also sounds like the person when they are speaking in longer sentences. It picks up patterns of speech too, such as pauses, mouth noises, etc., and is very convincing.”

The scammers are looking for money, especially in the form of gift cards, since they are difficult to trace. Fraudsters will also try to gain access to a computer, confirm bank information or passwords, or take a bolder step and request access to funds via wire, Zelle, or another instantaneous payment method, Pierson said.

The number of consumer scams will increase: grandparent scams are likely to proliferate, and donation and charity scams are likely to benefit as well, he said.

“Voice cloning, ChatGPT, image creators and deep fakes are all incredibly powerful tools in their own right, but when used in combination, they are likely to overwhelm even the most security-conscious person,” Hamerstone said.
“This technology can be used for malicious purposes, and we are starting to see this happening,” said Chris Pierson, CEO of Orlando-based BlackCloak. “Right now it looks more like a luxury attack method, but in the coming months and years it will most likely be applied en masse and really create a cyber cyclone of fraud.”

AI is making it possible to clone a person’s voice and produce “authentic-sounding statements and conversations from them,” according to TheStreet, a financial news and financial literacy website based in New York.


Source: https://en.vocofy.ai/scammers-use-ai-technology-to-trick-victims/