
Is it Evan?

There are some problems with 2025. Admittedly, things like Covid-19 aren’t the problem they were in your time, but I’m sorry to tell you that other crises have usurped them. I can’t get into it all right now, but this is a message both for the 2021 version of myself (who I’ll refer to as Evan-21) and for the rest of you, who can learn from his mistake.

Around the summer of 2021, Evan-21 published something very stupid on Substack:


Deepfake Me!

This is so funny and such a good idea

As you know, I’ll be releasing a book soon, and I briefly touch on the subject of “deepfakes”. It’s a new type of artificial-intelligence-powered tool which can manipulate things like images and videos. It’s a problem because if it gets out of control, we won’t know what’s real. Which is to say, we’ll be even less sure of reality than we are now. So I think the subject deserves some attention.
I have a vested interest in wanting there to be more discussion about solving or at least combatting the problem, because if I become any kind of public political figure, I’ll be vulnerable to this as a kind of attack on my character, credibility, and trust with my readers. Can you imagine seeing convincing videos of me doing or saying the most horrible things, but not knowing if it was genuine or “synthetic” media? In my book, I reference Nina Schick’s book about deepfake technology, which I recommend along with this interview she did with Andrew Yang.
A short excerpt from my book where I reference Schick’s work:
“She points out that anyone with videos of themselves online will be susceptible to completely realistic deepfake manipulation of that content. And that means there will be convincing video and audio fabrication. The person in the deepfake will look and sound just like you, and the deepfaker can pull the puppet-strings however they want.
As of 2020, approximately 95% of this content is deepfake pornography, in which the people (mostly women) who have been face-swapped did not originate or consent to its production. But some people like Schick think that, by 2030, a high percentage of all online content could be synthetic.”
But rather than rambling on about the underlying technology, this post is just meant to put a spotlight on the issue and to put my own skin in the game. That’s right — below are a number of links to audio and video files I’ve uploaded which are ideal for deepfake manipulation. I’m going to be uploading hours of myself talking on YouTube in the upcoming months anyway, at which point I will be easily deepfaked. So my thought is: why not rip the bandage off now?
I’m encouraging you, my readers, to deepfake me, for the greater good! Share your creations with people you know, and help bring about the public awareness which will lead to action on this issue.

I’ll end the excerpt there. You can see where it’s going. Evan-21 posted a high-resolution video of himself speaking in a clear voice, reciting the alphabet, turning around 360 degrees, and generally creating the ideal conditions for fakery. You and I can see that this would get out of hand quickly, but somehow Evan-21 thought it would be a jovial, well-intentioned public-education campaign.


He got what (he thought) he wanted. The video file spread well beyond what was, at the time, his small readership. Endless hours of synthetic Evan drowned out the limited number of authentic videos and podcasts he had released at that point. The deepfake videos were indescribably grotesque, so I shall not describe them. Worse yet, the whole thing developed a life of its own, detached from the original intention of drawing attention to an important issue.


The thing is, Evan-21 was right that deepfakes were a problem, and that the problem was going to get worse. There is more synthetic content online in 2025 than there was in your time — there’s no getting around that, and I don’t want to lie to you. But it’s not as bad as Evan-21 and Nina Schick thought it would be, even though, at that point, there were already several easy-to-use deepfake tools available online, such as Deepfakes Web, Aphrodite, and Avatarify.


It turned out, though, that alongside AI manipulation of photos and videos, some other new technologies were just around the corner. That’s what I wish I had written about instead of the post above. Honestly, the whole debacle stemming from it could have been avoided.


So let me see if I can explain why deepfakes are a problem, but not a catastrophe.


Some of the conversation in your time centered around an “AI arms race”. The thought was that deepfakes are powered by AI, and that we were also creating deepfake-detection tools powered by AI. The two were seen as predator and prey, co-evolving: deep learning means that both sides get stronger from their interaction. There is truth to this, especially beyond the conversation around deepfakes, but it is not my focus today.

What this line of thinking failed to factor in was soon-to-be-released “invisible watermarking” technology. Imagine that you are wearing contact lenses, and they are so thin and clear that you forget you’re wearing them. A microscopic “watermark” is imprinted on the lens, which is detectable with a computer, but not by our own eyes. Every time you “record a video” with your eyes, you file it in the large metal cabinets in your brain.


Most everyone who knows you thinks you are great! But perhaps there is someone who wants to damage your career or relationship by deepfaking a scandalous video of you. That would not be nice of them, but these things happen. They reach into their own cranial filing system, and pull out what appears to be a video of you doing something improper with a peach. It’s obvious to you that the video is synthetic, but would someone who doesn’t know you be able to say for sure? The thing is, it looks just like the videos that your very own, authentic eyeballs record — except for what you did to that peach.


But wait! You transfer the file to your desktop computer and fire up Windows 95 or whatever was cutting-edge in your time. You run a program that “sees” the video in a different way — between the lines, in a sense. It can tell that, although it looks authentic to humans, it is missing the unique watermark that can only be produced by your personal lenses.
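If it helps to make that concrete, here is a toy sketch in Python of the general idea: a watermark hidden in the least significant bit of each pixel, invisible to the eye but trivial for a program to check. The functions and the embedding scheme are my own illustration, not the lens technology I’m describing.

```python
import numpy as np

def embed_watermark(pixels: np.ndarray, mark: np.ndarray) -> np.ndarray:
    """Hide a binary watermark in the least significant bit of each pixel.

    `pixels` is a uint8 image array; `mark` is a 0/1 array of the same shape.
    The change is at most 1 out of 255 per pixel: invisible to the eye.
    """
    return (pixels & 0xFE) | mark  # clear the lowest bit, then write the mark bit

def has_watermark(pixels: np.ndarray, mark: np.ndarray) -> bool:
    """Check whether the hidden bits match the expected watermark."""
    return bool(np.array_equal(pixels & 1, mark))

# Toy usage: a 4x4 grayscale "photo" and a fixed watermark pattern.
rng = np.random.default_rng(seed=0)
photo = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
mark = rng.integers(0, 2, size=(4, 4), dtype=np.uint8)

stamped = embed_watermark(photo, mark)
print(has_watermark(stamped, mark))  # True: the authentic frame carries the mark
print(has_watermark(photo, mark))    # Almost certainly False: nothing embedded
```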


That might seem like a solution in itself: Put these invisible watermarks on every camera lens. Maybe even make glasses or contact lenses which record watermarked video. We can even imagine an audio equivalent — a superimposition of sounds beyond the human register, instead of a visual watermark too small for us to see. But it would be difficult to defend the authenticity of any of these without cryptography and blockchain.
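The audio version can be sketched the same way. Below is a toy illustration, under my own assumptions, of superimposing a faint tone at the edge of human hearing and detecting it with a Fourier transform; a real system would have to be far more robust than this.

```python
import numpy as np

SAMPLE_RATE = 48_000   # samples per second; must exceed twice the tone frequency
TONE_HZ = 20_000       # at the edge of human hearing
TONE_AMPLITUDE = 0.01  # faint relative to the recording itself

def add_audio_watermark(signal: np.ndarray) -> np.ndarray:
    """Superimpose a quiet near-ultrasonic tone on an audio signal."""
    t = np.arange(len(signal)) / SAMPLE_RATE
    return signal + TONE_AMPLITUDE * np.sin(2 * np.pi * TONE_HZ * t)

def detect_audio_watermark(signal: np.ndarray) -> bool:
    """Look for energy at the watermark frequency in the spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / SAMPLE_RATE)
    band = spectrum[np.abs(freqs - TONE_HZ) < 50]  # bins within 50 Hz of the tone
    return band.max() > len(signal) * TONE_AMPLITUDE / 4

# One second of "speech" (random noise stands in for a real recording).
speech = np.random.default_rng(1).normal(0, 0.1, SAMPLE_RATE)
print(detect_audio_watermark(add_audio_watermark(speech)))  # True
print(detect_audio_watermark(speech))                       # False
```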


Around that summer when Evan-21 wrote “Deepfake Me!”, people were starting to hear about cryptocurrencies, and some were even enticed to get in on what seemed like it could be a new gold rush or a new Silicon Valley. Most of this latter group still didn’t know what they were buying, though. But let’s get right into how all of this overlaps with deepfakes and invisible watermarks.


The problem was that AI could detect these invisible-to-human watermarks too. However intricate or teeny-tiny the patterns were, the machines were one step ahead. When people applied these watermark filters over their phones’ camera lenses, they thought they were safe. Instead, deepfake tools evolved ways of detecting and forging the watermarks — authentic media was still susceptible to being turned into synthetic media.


So we went in another direction, and made the watermark more like a QR code: a unique, scannable pattern linked to some other piece of data. If you’ve scanned one with your phone before, your phone sees it as an instruction to go to a certain web address. In 2025, we scan media with our phones, and they pick up these microscopic codes, which tell our phones to show us the “supply chain” of the image or video in question.
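In code, that “pattern linked to some other piece of data” boils down to something like the hypothetical registry below: a fingerprint of the media bytes acts as the code’s payload, and looking it up returns the supply-chain record. The plain dictionary here stands in for the public ledger I’ll get to in a moment, and all the names are my own.

```python
import hashlib

# A hypothetical provenance registry: fingerprint -> "supply chain" records.
REGISTRY: dict[str, list[str]] = {}

def fingerprint(media_bytes: bytes) -> str:
    """A unique identifier for a piece of media (like a QR code's payload)."""
    return hashlib.sha256(media_bytes).hexdigest()

def register(media_bytes: bytes, event: str) -> None:
    """Append a supply-chain event (capture, edit, copy) to the media's record."""
    REGISTRY.setdefault(fingerprint(media_bytes), []).append(event)

def supply_chain(media_bytes: bytes) -> list[str]:
    """What your phone would show you after scanning the embedded code."""
    return REGISTRY.get(fingerprint(media_bytes), ["no record: treat as unverified"])

video = b"raw bytes of an authentic video"
register(video, "2025-06-01 captured on owner's phone")
register(video, "2025-06-02 copied to Substack with owner's consent")

print(supply_chain(video))             # the full chain of custody
print(supply_chain(b"tampered bytes"))  # ['no record: treat as unverified']
```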


To understand why you would want this, you need to see things from my perspective. I know, in 2021, it seems normal to you that you can prove who you are. You just need to show ID. You probably have some combination of a birth certificate, social security number, voter registration, financial documents, driver’s license, and passport. What may not be obvious is the degree to which your identity is not self-sovereign. Your identity is created and verified by some kind of external authority, usually the government of your country.


This is a problem especially for refugees, who are increasingly being driven out of their homes by climate change, as well as for people whose governments are authoritarian and unaccountable. But really, as a world, we never looked back once we realized that blockchain technology allowed us to create and verify our own identities. We can take our “self” anywhere in the world, and do not rely on governments or corporations to prove who we are.


Your debates about things like voter ID laws went away, because our blockchain-based identities are, themselves, IDs. When you vote, your blockchain identity’s unique “key” does a kind of secret handshake with our blockchain-based voting system. So there is really no way to argue anymore that votes can be forged or that someone could vote more than once.
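The “secret handshake” resembles what your cryptographers already call challenge-response authentication. Here is a minimal sketch using Ed25519 signatures from Python’s `cryptography` package; the voting framing is my own illustration, not a real voting protocol.

```python
import os
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# The voter's self-sovereign identity: a keypair they alone control.
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()  # this half sits on the public ledger

# Voting system: issue a fresh random challenge so signatures can't be replayed.
challenge = os.urandom(32)

# Voter's device: prove identity by signing the challenge with the private key.
signature = private_key.sign(challenge)

# Voting system: verify against the public key registered on the blockchain.
try:
    public_key.verify(signature, challenge)
    print("Handshake succeeded: one verified vote recorded.")
except InvalidSignature:
    print("Handshake failed: vote rejected.")
```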


That’s why this whole self-sovereign identity system was needed as the back end for any tool, camera or otherwise, which generates or replicates media. It’s a public, incorruptible ledger with “receipts” showing every time authentic media is faithfully copied with the consent of its owner/originator. That’s why I said earlier: there are more deepfakes than ever, but we can quickly cross-check public records of authentic media to see what’s what.
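If you want intuition for why such a ledger is incorruptible, here is a toy hash-chained version: each receipt contains the hash of the one before it, so editing any old receipt breaks every link after it. Real blockchains add consensus and replication on top; the structure below is only my own illustration of the tamper-evidence.

```python
import hashlib
import json
import time

ledger: list[dict] = []  # an append-only list of media "receipts"

def add_receipt(media_hash: str, action: str) -> None:
    """Append a receipt that commits to the previous receipt's hash."""
    prev = ledger[-1]["receipt_hash"] if ledger else "genesis"
    body = {"media_hash": media_hash, "action": action,
            "timestamp": time.time(), "prev": prev}
    body["receipt_hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    ledger.append(body)

def ledger_is_intact() -> bool:
    """Recompute every link; any edited receipt breaks the chain after it."""
    prev = "genesis"
    for entry in ledger:
        body = {k: v for k, v in entry.items() if k != "receipt_hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or entry["receipt_hash"] != expected:
            return False
        prev = entry["receipt_hash"]
    return True

h = hashlib.sha256(b"raw video bytes").hexdigest()
add_receipt(h, "original capture")
add_receipt(h, "consented copy to YouTube")
print(ledger_is_intact())       # True
ledger[0]["action"] = "forged"  # an attacker tries to rewrite history
print(ledger_is_intact())       # False: the tampering is detectable
```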

“Sovereign technology, able to operate in peer-to-peer networks, validating identity, preserving anonymity, encrypting data, decentralizing infrastructure, with free (as in freedom) open source code, can completely disrupt the described landscape.” - DemocracyEarth

Perhaps it’s easier if you imagine two parts: the user end, which involves you and your smartphone, and the blockchain-powered database and verification system behind the scenes. The two “talk” to each other, same as with your blockchain identity and the blockchain voting system mentioned above. The core feature is that this conversation is done securely — nobody else can be you. And once the conversation is complete, the version of the media “discussed” between the two will be the only verifiably non-synthetic content of you online. The person who made a deepfake video of you and that peach would not be able to verify it on this public database without your private key.


Smartphone identity system (SIS): I have recorded a video, identified by a unique QR code. Please make a record of this file on your decentralized computer server, paired with a timestamp.

Blockchain identity system (BIS): Hello, SIS! I’d be happy to help you with that. What is this video? Your owner and a dog on a trampoline? Excellent!

SIS: Yeah, I don’t want someone to deepfake it so it looks like my owner is bouncing with some other kind of animal, like a turtle or something.

BIS: I wouldn’t want that either. You’re all set! The video in its current state has been saved in my files. Any other SISes who try to send me unauthorized duplicates or deepfakes won’t know our secret handshake, so they won’t be able to associate the file with your public blockchain address. Everyone will know it’s fake!

SIS: You’re the best.

BIS: No, you.

SIS: I love having this conversation with you every time I send you a file.

BIS: <3
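For the technically inclined, the exchange above maps onto something like the following sketch: the phone signs the video with the owner’s private key, and the registry accepts the record only if the signature verifies against the owner’s public address. Every name in it is hypothetical, and it again leans on Python’s `cryptography` package; the point is simply that a deepfaker without the key can’t complete the handshake.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

owner_key = ed25519.Ed25519PrivateKey.generate()  # lives only on your device
owner_id = owner_key.public_key()                 # your public blockchain address

records: dict[str, str] = {}  # media hash -> verified owner note

def bis_submit(video: bytes, signature: bytes, note: str) -> str:
    """BIS side: accept the record only if the secret handshake checks out."""
    media_hash = hashlib.sha256(video).hexdigest()
    try:
        owner_id.verify(signature, video)
    except InvalidSignature:
        return "Rejected: whoever sent this doesn't know our secret handshake."
    records[media_hash] = note
    return f"Recorded {media_hash[:12]}: {note}"

video = b"owner and a dog on a trampoline"
print(bis_submit(video, owner_key.sign(video), "dog video, original"))

# A deepfaker swaps in a turtle but has no access to the owner's private key.
fake = b"owner and a turtle on a trampoline"
forged_sig = ed25519.Ed25519PrivateKey.generate().sign(fake)
print(bis_submit(fake, forged_sig, "totally real turtle video"))
```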

Now you see, your pictures can show ID. I lament that this was not the case for Evan-21, and that I can’t even stop him from posting “Deepfake Me!” this summer. But I hope this message has helped put his folly in a larger context. You probably don’t need advice like “don’t post high-resolution video of yourself speaking in a clear voice, reciting the alphabet, and turning around 360 degrees”. But I hope it’s constructive for me to say that you should be excited about the deepfake-fighting technology I spoke about, and the broader implications of blockchain-based everything! That’s all I can say, though. You have exciting times ahead of you.


