Under-18s are the biggest audience on YouTube. My favourite YouTuber is a gamer called DanTDM. I like, like, animal videos.
MrBeast. And parents and schools now rely on it as a tool to access great science content. But there's a new type of content creator, using Artificial Intelligence to create videos full of false information.
That never existed, this never happened to him, this woman doesn’t exist, those conversations didn't happen. Our research shows that these creators take advantage of kids’ clicks to cash in, while spreading misinformation to classrooms around the world. And the kids seem to be buying it.
I found it really interesting that pyramids can make electricity. ‘Bad science’ videos are flooding YouTube, optimised for the algorithm with catchy titles and controversial topics. [AI Voice]: ‘Imagine being told that the world you live in is not real.’
Full of pseudoscience and false information. [AI Voice]: ‘Unpredictable patterns of highs and lows that might not be directly correlated with the effects of human activities and greenhouse gas emissions.’ Creators are tagging these videos as ‘educational content’, and they’re often beating legitimate science videos in the race to be recommended.
These videos do well because they are potentially, in some way, conspiratorial. You know, we’re all fascinated by things that run counter to what we’re officially told. And children, obviously, are perhaps more susceptible to this than adults.
We wanted to see if these videos were reaching children, so we ran an experiment. We set up four children’s accounts on YouTube. Each account watched 50 science videos from legitimate creators.
After only four days, one of the ‘bad science’ channels cropped up in the recommended videos. Once we clicked these videos, they flooded our recommended feed. And this content is reaching every corner of the globe, with channels translating the videos into more than 20 languages.
But would kids in the real world believe what they were seeing? We showed two examples of ‘bad science’ videos to two groups of children: one in the UK and one in Thailand. [AI Voice]: ‘No-one knows when they were built, how they were built, who built them, and most importantly, why were they built?’
[AI Voice]: ‘With the right amount of pressure, the Great Pyramid could generate a tremendous amount of electricity.’ I find it really interesting that pyramids can make electricity. [AI Voice]: ‘Pyramid power-plants were and are possible.’
I was quite surprised to find out just a pile of rocks can form electricity. [AI Voice, in Thai]: ‘Terrifying images of UFOs are depicted in many paintings from the Renaissance period.’ I believe in it more. Previously I was sceptical, but after watching it I believe in alien stories more.
I thought it was really cool because I, like, love aliens and stuff like that. [AI Voice]: ‘The only thing missing for the Great Pyramid of Giza to function as a power plant was a source of energy.’ I didn’t know that people so long ago would be able to make electricity and use modern technology.
[AI Voice]: ‘Due to the recent surge in sighting reports from all around the world, the UFO community endured a period of extreme heat.’ At the beginning I wasn’t sure aliens existed, but as I continued watching, I think they do. That’s why I enjoyed watching it, because there was proof.
[AI Voice]: ‘The objects are said to be of exotic origin or non-human intelligence, whether alien or ancient in origin.’ The person who was talking sounded very professional and knew what he was talking about. We found more than 50 channels creating these ‘bad science’ videos, and they are getting hundreds of thousands, sometimes millions of views.
But how are they multiplying so fast? We found out that these channels are being created using Artificial Intelligence. A video needs a script.
With AI, it can be generated in seconds. Then it needs a voice. [AI Voice]: ‘It no longer needs to be human.’ It’s not quite there yet, but eventually, we won’t be able to tell the difference.
Then AI can find footage from across the internet, pulling from different sources and piecing together the final film. Some footage and graphics have been stolen from legitimate educational creators and repurposed into false information. Kyle Hill, a science communication specialist, educator and YouTuber, began to notice these videos cropping up in his feed a couple of months ago.
So being a YouTube creator, I always try to have my ear to the ground for what other science and technology related channels are doing. But it wasn’t until one of my viewers pointed out that a lot of the channels they were getting recommended after watching my videos started looking very much the same. And these videos do all look really similar.
The logos look alike, with the same subjects and near-identical thumbnails. And they’re full of false information, like this: [AI Voice]: ‘Weather patterns have seen some remarkable changes in the past decades, something which many might attribute to climate change.’ [AI Voice]: ‘But these changes might not be caused by climate change at all.’
Once the footage is taken, the AI channels change or even ignore the original meaning. Here, they took old footage from a NASA expert’s video. They took out his voice and replaced it with AI narration saying climate change isn’t caused by humans, which isn’t in the original.
Here, they’ve taken a James Webb telescope animation from a legitimate science creator. The AI video used it to say scientists are covering up that the telescope disproved the Big Bang theory, which it never did.
These channels seem to have identified exactly the right thing, and how to do that thing to maximise views with the least amount of actual effort. And more views equals more money through advertising revenue, with channels often getting thousands of pounds per video. With new AI tools, anyone can create channels in a matter of hours, and there are hundreds of tutorials on YouTube.
[AI Voices]: ‘So, you want to make money with AI and YouTube.’ ‘I created this faceless YouTube channel using only AI.’ ‘To script, edit, and create a faceless YouTube channel.’
With each video getting tens of thousands of views, these channels can mean massive payouts for creators. And creators aren’t the only ones profiting. YouTube takes nearly half of the advertising revenue from every video.
The idea that YouTube and Google are making money off the back of adverts being served against pseudoscience, AI-generated news, that, you know, seems really unethical to me. That video was actually all fake. I’m actually really confused. I thought that was 100% real.
I would’ve probably believed it if you hadn’t told us it was fake. I think I did believe it until a few minutes ago. I’m just shocked.
I think children will often take what they’ve seen as fact, first and foremost, and then maybe, when they’re a little older, start to question it. But it’s not your starting point. If you’re watching something educational, you’re watching it so that you learn, and we don’t question, do we?
It’s just not in our wiring to do that, so that’s why it’s such a concern. I have seen some students sharing disinformation at school; if I spot it, I would tell them to think harder and not believe it all. YouTube told us that they recommend YouTube Kids for under-13s, which has a “higher bar” for videos shown.
They said they’re committed to removing misinformation from their platforms. They also directed us to information panels that show “additional context” on conspiracy-related content. We found these panels were present on only a few of the videos across the 50 channels.
They didn’t comment on the advertising revenue they may receive from these videos. We reached out to some of the channels for comment. One responded, saying their videos were intended for “entertainment purposes” and that they didn’t target children.
They also said the majority of their scripts were not written using AI. Good information is probably going to be pushed out. We will have so much AI-generated content that you will not want to spend the time or the effort ever sifting through it.
I think this is an emerging threat. I think that we don't have a really clear understanding yet of how AI and AI-generated content is really impacting children's understanding. But some of the kids were able to spot that there was something not quite right about the videos.
Maybe because of the voice, the choice of voice they had; they used an AI voice. I thought it was fake because you could tell that it was not edited properly. As teachers, we need to have conversations with our children about what they’re watching and the media that they’re absorbing, so that we understand it.
AI is evolving fast. As these videos continue to multiply, bad science could drown out good content.