As digital doppelganger technology advances, making it harder for viewers to separate fact from fiction, crypto investors are being urged to watch out for “deepfake” cryptocurrency scams.
David Schwed, COO of blockchain security firm Halborn, told Cointelegraph that the crypto industry is more “susceptible” to deepfakes than ever, because “time is of the essence when it comes to making decisions,” which leaves users less time to verify whether a video is genuine.
According to Vlad Estoup, a technical writer at OpenZeppelin, deepfakes use deep-learning artificial intelligence (AI) to manipulate and modify original media, such as swapping faces in videos, photos and audio, to create highly convincing digital content.
Estoup noted that cryptocurrency scammers often use deepfake technology to create fake videos of famous people to carry out their scams.
One example of such a scam was a deepfake video of former FTX CEO Sam Bankman-Fried that circulated in November 2022. Scammers used old interview footage of Bankman-Fried and an audio emulator to lure users to a malicious website that promised to “double your cryptocurrency.”
Over the weekend, a verified account posing as FTX founder SBF posted dozens of copies of this deepfake video, offering “compensation for the loss” in a phishing scam aimed at draining FTX users’ cryptocurrency wallets. pic.twitter.com/3KoAPRJsya
— Jason Koebler (@jason_koebler) November 21, 2022
Schwed said the volatile nature of cryptocurrencies can cause people to panic and take a “better safe than sorry” approach, which makes them more likely to fall for deepfake scams. He said:
“If a video of CZ comes out claiming that withdrawals will be halted within the hour, are you going to withdraw your funds immediately, or spend hours trying to determine whether the message is genuine?”
However, Estoup believes that while deepfake technology is advancing rapidly, it is not yet “indistinguishable from reality.”
How To Spot A Deepfake: Look In The Eyes
Schwed suggested that one useful way to quickly spot a deepfake is to watch the subject’s eyes and check whether the blinking looks natural; if it doesn’t, the video is most likely a deepfake.
This is because deepfakes are generated from image files sourced from the internet, Schwed explained, and in those images the subject’s eyes are usually open, so the deepfake has to simulate blinking.
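For readers who want a concrete sense of what “unnatural blinking” means, the rough idea can be expressed with the eye aspect ratio (EAR), a simple geometric measure used in blink-detection research. The sketch below is only an illustration of that general idea, not a tool Schwed or Halborn described: the landmark-extraction step, the threshold values and the expected blink rate are all assumptions, and a face-landmark library (for example dlib or MediaPipe) would be needed to produce the per-frame eye points.

```python
# Minimal sketch: flag clips with implausibly low blink rates using the
# eye aspect ratio (EAR). Landmark extraction is out of scope here; each
# frame is assumed to provide six (x, y) eye landmarks. Threshold values
# are illustrative assumptions, not a calibrated detector.
from math import dist

def eye_aspect_ratio(eye):
    """eye: six (x, y) points ordered corner, top1, top2, corner, bottom2, bottom1."""
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = 2.0 * dist(eye[0], eye[3])
    return vertical / horizontal

def count_blinks(ear_series, closed_thresh=0.2, min_closed_frames=2):
    """Count blink events: runs of consecutive frames with EAR below the threshold."""
    blinks, closed_run = 0, 0
    for ear in ear_series:
        if ear < closed_thresh:
            closed_run += 1
        else:
            if closed_run >= min_closed_frames:
                blinks += 1
            closed_run = 0
    if closed_run >= min_closed_frames:
        blinks += 1
    return blinks

def looks_suspicious(ear_series, fps=30, min_blinks_per_minute=6):
    """Humans blink roughly 15-20 times per minute; far fewer is a red flag."""
    minutes = len(ear_series) / (fps * 60)
    if minutes == 0:
        return False
    rate = count_blinks(ear_series) / minutes
    return rate < min_blinks_per_minute
```

If the computed blink rate over a clip falls far below the roughly 15 to 20 blinks per minute typical of humans, that is the kind of red flag Schwed is pointing to.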
Hey @elonmusk & @TuckerCarlson have you seen the #deepfake paid ad featuring the two of you? @Youtube how is this allowed? This is getting out of hand, it’s not #freedomofspeech it is straight #scam: Musk reveals reasons for financial aid to Canadians https://t.co/IgoTbbl4fL pic.twitter.com/PRMfiyG3Pe
— Matt Dupuis (@MatthewDupuis) January 4, 2023
Of course, according to Schwed, the best check of all is to ask the person questions that only the real individual could answer.
AI software that can detect deepfakes is also available, Estoup said, and he suggested keeping an eye on major technological advances in the field.
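Estoup did not name a specific tool, and none is implied here; the sketch below only shows the general shape of how such software tends to work, scoring individual frames with a model and rolling the scores up into a video-level verdict. The score_frame function is a hypothetical stand-in for a real classifier, and the thresholds are illustrative assumptions.

```python
# Illustrative only: how per-frame deepfake scores might be aggregated
# into a video-level verdict. score_frame() is a hypothetical stand-in
# for a real model; no specific detection product is implied.
from typing import Callable, Iterable

def video_verdict(frames: Iterable,
                  score_frame: Callable[[object], float],
                  fake_threshold: float = 0.7,
                  min_flagged_ratio: float = 0.3) -> str:
    """Label a video 'likely fake' if enough frames score above the threshold."""
    scores = [score_frame(f) for f in frames]
    if not scores:
        return "no frames"
    flagged = sum(s >= fake_threshold for s in scores) / len(scores)
    return "likely fake" if flagged >= min_flagged_ratio else "no strong signal"
```

The per-frame model is where the real difficulty lies, which is why advances in the field are worth watching.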
He also offered some age-old advice.
Related: “Wow!” Elon Musk warns users against latest deepfake crypto scam
Last year, Binance Chief Communications Officer Patrick Hillman revealed in an August 2022 blog post that scammers had used a deepfake of him to carry out a sophisticated scam.
Hillman said the scammers used his past news interviews and TV appearances over the years to create the deepfake and “spoof some very intelligent cryptocurrency members.”
He only realized what had happened when he began receiving online messages thanking him for meeting with project teams about the possibility of listing their assets on Binance.com.
Earlier this week, blockchain security firm SlowMist noted that there were 303 blockchain security incidents in 2022, 31.6% of which were caused by phishing, rug pulls and other scams.