Fun new deepfake consequence: more convincing crypto scams

And the deepfakes are getting better.

The video is almost convincing: it features Solana co-founder Anatoly Yakovenko announcing a “historic day” for Solana. He thanks the “S-O-L” community and offers a giveaway through a QR code and a website. Sure, he sounds a little robotic — his voice is a monotone, unusual for him — and he hardly makes eye contact with the camera, but it’s a video, and seeing is believing, right?

It’s a fake, of course. And it had been up on YouTube for a day. Not only that, at least one internet user says they saw it as an ad. It’s not just YouTube, either. The fake video is appearing in ads on the platform formerly known as Twitter, which Elon Musk would prefer you call X.

“There has been a substantial increase in deepfakes and other AI-generated content recently,” says Austin Federa, head of strategy at the Solana Foundation. (He notes it is not just a crypto problem.) He told me that Solana takes these fakes seriously, reporting them as quickly as possible. But Solana isn’t in charge of actually taking down the fakes. That’s up to platforms like YouTube and X. And they’re poky about it — Solana reported that video to YouTube last night.

After the publication of this article, YouTube terminated the account associated with the video. YouTube spokeswoman Nicole Bell confirmed the account had been removed.

One user posted: “Oh my god what is going on at twitter, I just got a deepfaked @aeyakovenko ad. @Austin_Federa you have contacts at Twitter, no?”

Crypto industry participants have complained to me for years that Big Tech platforms don’t act quickly enough to remove scams. But as deepfakes get better, the scams become more convincing, and quick removal becomes increasingly important.

This is a moderation problem, of course, but it’s one with real consequences, particularly with the possibility of a Bitcoin ETF looming on the horizon. One of the perks of that kind of financialized product is that it’s handled by finance professionals, which makes it seem relatively safe. Those professionals, after all, are less likely to be taken in by this kind of faked video.

Author: DPN
