Hi everybody, it's me, Jennifer Lopez. I'm so excited to be on Start Here, my absolute favorite show. Let's talk about my new album. Or we could talk about the dangers of deep fakes. They've gone from look-alikes that seem a bit off ("this is a dangerous time... trust in what is genuine and what is not") to scary-good videos that have gone viral. "We have now reached the state where the impersonations are extremely well done, extremely hard to detect." The technology behind deep fakes is getting
smarter, while new apps are making it easier. The better deep fakes become, the harder it becomes to assess what is real and what is fake. So what are deep fakes? How can you spot them? And why could something as fun as a face swap actually be the start of something way more sinister? [Music] Let's break down the word itself. The "fake" in deep fake is pretty self-explanatory, while "deep" refers to deep learning by a machine, which is a type of artificial intelligence. And this is used to impersonate people: making them appear to say things they've
never said, or act in ways they've never acted before. Probably the best way to explain it is to show you how we did it. So I'm here with Haldoon, Start Here's graphics genius, and he's taken a video of me and a video of Jennifer Lopez. "So I used a free piece of software online called DeepFaceLab. The first step is to turn your video into hundreds of images, basically the frames. And the second step is to separate the faces from the images. The software is pinning my eyes and my mouth to the center of the frame
so that it's easier to align with J.Lo's face. And then we go to step three, which is the machine learning. I need to keep pressing P to update, but it takes so much time, so I just put a die on the P key, put some weight on it, and go grab a coffee. [Music] And then we have the final step, which is basically adjusting the skin tone and merging the two faces in the final result." The whole process actually took a few days, but making deep fakes is getting easier as the technology improves, and it's
improving fast. So what can we do with it? Well, there are plenty of useful, creative and harmless applications to all this deep fakery. This anti-malaria campaign had David Beckham speaking nine languages: "Malaria isn't just any disease." This documentary used deep fakes to hide the real faces of LGBTQ people in Chechnya who were afraid to be identified. There are slightly weirder uses as well: the website MyHeritage reanimates photos of your dearly departed relatives, and some people find that comforting. What all of those examples have in common is that they're not about trying to deceive people, and
the people involved are all in on it. The problem is when deep fakes are made of people without their consent, and so often that means women. There are people out there using pictures of celebrities and ordinary women and deep faking them onto pornography actors. A study in the Netherlands found that a staggering 96 percent of the deep fakes online were non-consensual porn. "It's women's bodies, identities and rights that are being transgressed. It's humiliating, it's embarrassing, and particularly with deep fakes becoming so good, it's very difficult to convince people that that isn't you. Because if
it looks like you, it might as well be you. We see big impacts on people's mental health: depression, anxiety." And it goes beyond the world of porn. A mother in Pennsylvania has been accused of trying to discredit three of her daughter's rivals on a cheerleading squad, where police said she made deep fakes of them naked, drinking and smoking. Financial scammers are using deep fake technology too. In 2019, criminals used AI software to impersonate the voice of a businessman's boss on the phone. They convinced him to transfer more than $240,000 to a bogus
Hungarian bank account. So the dangers of deep fakes are already real, and they're adding to a whole world of misinformation: a world where it can already be hard to know what's true and what's not, where actual facts are dismissed as false, conspiracy theories thrive, and powerful states run sophisticated disinformation campaigns. "Most of the misinformation we see, and most of what people get affected by, is much lower-tech things: photos taken out of context, things that are simple Photoshop jobs." There are much simpler and cheaper ways of making misinformation go viral, like this video of Nancy Pelosi, a senior U.S. Democrat, who was made to look drunk just by slowing down the video. "It's really sad, and here's the thing, and I told this to the room... it's really sad, here's the thing, and I told this to the room..." But the power of deep fake technology takes it all to another level. "In the old days, if you wanted to threaten the United States, you needed 10 aircraft carriers and nuclear weapons and long-range missiles. Increasingly, all you need is the ability to produce a very realistic fake video." What deep fakes do is create a climate
of doubt, to the point where what's actually real can be mistaken for something fake. That's what happened in Gabon. [Music] Back in 2019, President Ali Bongo hadn't been seen in public for months, and when the government released this video, he didn't really look like himself. It turns out the president had suffered a stroke that changed how he looked, but some people were convinced he was dead and that the video was a deep fake. A week later, a group of soldiers attempted a coup. Whoa. So how can we spot deep fakes? Well, there are some signs we
can look for. "There might be differences in resolution, or if you see ghosting around the face or blurring around the ears or hairline, chances are a computer made it. What I tell people is: if something makes you feel a strong emotion, either really good or really mad, that's the time to take an extra second and check to see if it's real." But the reality is that as deep fakes get better, they'll get harder and harder to spot. Researchers at universities and companies like Microsoft and Facebook are working on automated software to find and flag them.
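One of those telltale signs, unnatural blurring, is something software can put a number on. As a toy illustration only (this is not how Microsoft's or Facebook's detectors work, and the patches below are made up), here is a tiny pure-Python sharpness score based on the variance of the image Laplacian: a region that is suspiciously smooth, say around the hairline, scores near zero, while a detailed region scores high.

```python
def laplacian_variance(img):
    """Variance of the 4-neighbour Laplacian over a grayscale patch,
    a crude sharpness score: smooth (blurry) regions score near zero,
    detailed regions score high."""
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # discrete Laplacian at (x, y)
            vals.append(img[y - 1][x] + img[y + 1][x]
                        + img[y][x - 1] + img[y][x + 1]
                        - 4 * img[y][x])
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

# two hypothetical 8x8 grayscale patches
flat = [[128] * 8 for _ in range(8)]            # featureless, "blurry"
edge = [[0] * 4 + [255] * 4 for _ in range(8)]  # a hard vertical edge
print(laplacian_variance(flat))                 # -> 0.0
print(laplacian_variance(edge) > laplacian_variance(flat))  # -> True
```

Real detectors compare scores like this across regions of the same frame, since a face that is consistently blurrier than the background it sits on is one hint that it was pasted in.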
Organizations like the UN, Europol and the FBI are all actively looking into how to counter deep fakes as a threat. "We're always in this arms race of: a new technology exists, people start using it for bad things, and then we kind of adjust our understanding and move forward." There's nothing inherently bad about the technology, but we know the harm it can do. So what you and I can do is be more aware. Don't be fooled by the rocks that I got, or the face that I got, either. [Music] If you want to see
more Start Here episodes, subscribe to the Al Jazeera YouTube channel and our Facebook, Instagram and Twitter pages, where we release new shows every week. See you there.