You’re here to learn about deepfakes, right? I’ve spent the past year of my life discovering exactly how AI can be used in real-life scams. This is my friend, Hannah.
Let’s take this video of her. “First room is done, in the first house I’ve ever owned. And we went with ‘Moss Green’!”
New mom. Just the sweetest person. But we’ll get back to her later.
You’ve probably been seeing a lot of crazy headlines about AI, how it can now be used to synthesize just about any type of media. These are examples of deepfakes, where someone manipulates another person’s appearance and voice and uses them for their own purposes.
“What’s up, TikTok?!” For example, to steal money or votes, or even to create explicit content.
This technology is capable of shaping our perception of reality, and even though it might be completely fake, the consequences are far from it. It can bring down companies of all sizes. Small businesses all over the world are getting deepfake phone calls from someone they believe to be a trusted vendor, asking for immediate payment on an “overlooked account.”
And once they pay it, that money’s not coming back. Maybe you’ve heard about the worker at that one company who made a transfer of tens of millions of dollars at the request of the company’s CEO. Except it wasn’t the company’s CEO; it was an advanced AI voice clone of them.
The damage adds up real fast. Deepfakes even have the potential to derail elections. Just think: what if a damaging video of what looks like a candidate emerges in the final hours of an election?
“Deepfaked audio of President Biden told some not to vote.” The consequences can be very real. [Speaking Ukrainian, falsely telling his forces to surrender] Because these days, it’s become ridiculously easy.
“First room is done, in the first house I’ve ever owned.” You remember my friend Hannah… “Hey, please don’t tell anyone. But they’ve got me in jail, and I need you to wire me bail money right now or I’ll have to spend the night here. Please?”
You used to need hours of audio, or thousands of photos, to pull off a good deepfake. Now, with just a single clip, or even a single image, a bad actor can make a play for your money.
Your vote. Your reputation. Your identity.
You. Luckily, I don’t have any bad intentions. But imagine what could happen if I did.
I can make it look like Hannah’s here. Or here. Or even here.
[Hannah arguing loudly] This is just what I was able to do while you were watching this. Think about what I could do if I had more time. It’s not complicated to do anymore, which means it can be done BY anyone, TO anyone.
Even you. And this is why we created a dedicated digital platform, the First Ai-iD Kit, with everything you need to know. But to get you started, I can give you some quick tips: If you get an email, text, call or video that seems suspicious, just stop.
That’s your biggest line of defense. All you have to do is stop, and check that this person is actually who they say they are by contacting them through another channel. What scammers are trying to do is get you to act quickly and without thinking.
So all you gotta do is stop. And ask yourself: Do I actually know this person? Does this seem legit?
Are they asking for something unusual, urgent, or weirdly emotional? Another great idea is to decide on a safe word with those closest to you. Then, in any situation where something seems kinda off, go ahead and ask for it.
My personal safe word is “space penguin.” Well, it was, but now I have to change it because I told you, so thanks a lot. If you’re still worried, all you have to do is end the conversation.
Unfortunately, this is the world we live in now. Get familiar with the First Ai-iD Kit now, and get to know deepfakes before they know you.