The nightmare videos of children’s YouTube — and what’s wrong with the internet today | James Bridle

I’m James. I’m a writer and artist, and I make work about technology. I do things like draw life-size outlines
of military drones in city streets around the world, so that people can start to think
and get their heads around these really quite hard-to-see
and hard-to-think-about technologies. I make things like neural networks
that predict the results of elections based on weather reports, because I’m intrigued about what the actual possibilities
of these weird new technologies are. Last year, I built
my own self-driving car. But because I don’t
really trust technology, I also designed a trap for it. (Laughter) And I do these things mostly because
I find them completely fascinating, but also because I think
when we talk about technology, we’re largely talking about ourselves and the way that we understand the world. So here’s a story about technology. This is a “surprise egg” video. It’s basically a video of someone
opening up loads of chocolate eggs and showing the toys inside to the viewer. That’s it. That’s all it does
for seven long minutes. And I want you to notice
two things about this. First of all, this video
has 30 million views. (Laughter) And the other thing is, it comes from a channel
that has 6.3 million subscribers, that has a total of eight billion views, and it’s all just more videos like this — 30 million people watching a guy
opening up these eggs. It sounds pretty weird, but if you search
for “surprise eggs” on YouTube, it’ll tell you there’s
10 million of these videos, and I think that’s an undercount. I think there’s way, way more of these. If you keep searching, they’re endless. There’s millions and millions
of these videos in increasingly baroque combinations
of brands and materials, and there’s more and more of them
being uploaded every single day. Like, this is a strange world. Right? But the thing is, it’s not adults
who are watching these videos. It’s kids, small children. These videos are
like crack for little kids. There’s something about the repetition, the constant little
dopamine hit of the reveal, that completely hooks them in. And little kids watch these videos
over and over and over again, and they do it for hours
and hours and hours. And if you try and take
the screen away from them, they’ll scream and scream and scream. If you don’t believe me — and I’ve already seen people
in the audience nodding — if you don’t believe me, find someone
with small children and ask them, and they’ll know about
the surprise egg videos. So this is where we start. It’s 2018, and someone, or lots of people, are using the same mechanism that, like,
Facebook and Instagram are using to get you to keep checking that app, and they’re using it on YouTube
to hack the brains of very small children in return for advertising revenue. At least, I hope
that’s what they’re doing. I hope that’s what they’re doing it for, because there’s easier ways
of making ad revenue on YouTube. You can just make stuff up or steal stuff. So if you search for really
popular kids’ cartoons like “Peppa Pig” or “Paw Patrol,” you’ll find there’s millions and millions
of these online as well. Of course, most of them aren’t posted
by the original content creators. They come from loads and loads
of different random accounts, and it’s impossible to know
who’s posting them or what their motives might be. Does that sound kind of familiar? Because it’s exactly the same mechanism that’s happening across most
of our digital services, where it’s impossible to know
where this information is coming from. It’s basically fake news for kids, and we’re training them from birth to click on the very first link
that comes along, regardless of what the source is. That doesn’t seem like
a terribly good idea. Here’s another thing
that’s really big on kids’ YouTube. This is called the “Finger Family Song.” I just heard someone groan
in the audience. This is the “Finger Family Song.” This is the very first one I could find. It’s from 2007, and it only has
200,000 views, which is, like, nothing in this game. But it has this insanely earwormy tune, which I’m not going to play to you, because it will sear itself
into your brain in the same way that
it seared itself into mine, and I’m not going to do that to you. But like the surprise eggs, it’s got inside kids’ heads and addicted them to it. So within a few years,
these finger family videos start appearing everywhere, and you get versions
in different languages with popular kids’ cartoons using food or, frankly, using whatever kind
of animation elements you seem to have lying around. And once again, there are millions
and millions and millions of these videos available online in all of these
kind of insane combinations. And the more time
you start to spend with them, the crazier and crazier
you start to feel that you might be. And that’s where I
kind of launched into this, that feeling of deep strangeness
and deep lack of understanding of how this thing was constructed
that seems to be presented around me. Because it’s impossible to know
where these things are coming from. Like, who is making them? Some of them appear to be made
by teams of professional animators. Some of them are just randomly
assembled by software. Some of them are quite wholesome-looking
young kids’ entertainers. And some of them are from people who really clearly
shouldn’t be around children at all. (Laughter) And once again, this impossibility
of figuring out who’s making this stuff — like, is this a bot? Is this a person? Is this a troll? What does it mean
that we can’t tell the difference between these things anymore? And again, doesn’t that uncertainty
feel kind of familiar right now? So the main way people get views
on their videos — and remember, views mean money — is that they stuff the titles
of these videos with these popular terms. So you take, like, “surprise eggs” and then you add
“Paw Patrol,” “Easter egg,” or whatever these things are, all of these words from other
popular videos into your title, until you end up with this kind of
meaningless mash of language that doesn’t make sense to humans at all. Because of course it’s only really
tiny kids who are watching your video, and what the hell do they know? Your real audience
for this stuff is software. It’s the algorithms. It’s the software that YouTube uses to select which videos
are like other videos, to make them popular,
to make them recommended. And that’s why you end up with this
kind of completely meaningless mash, both of title and of content. But the thing is, you have to remember, there really are still people within
this algorithmically optimized system, people who are kind
of increasingly forced to act out these increasingly bizarre
combinations of words, like a desperate improvisation artist
responding to the combined screams of a million toddlers at once. There are real people
trapped within these systems, and that’s the other deeply strange thing
about this algorithmically driven culture, because even if you’re human, you have to end up behaving like a machine just to survive. And also, on the other side of the screen, there still are these little kids
watching this stuff, stuck, their full attention grabbed
by these weird mechanisms. And most of these kids are too small
to even use a website. They’re just kind of hammering
on the screen with their little hands. And so there’s autoplay, where it just keeps playing these videos
over and over and over in a loop, endlessly for hours and hours at a time. And there’s so much weirdness
in the system now that autoplay takes you
to some pretty strange places. This is how, within a dozen steps, you can go from a cute video
of a counting train to masturbating Mickey Mouse. Yeah. I’m sorry about that. This does get worse. This is what happens when all of these different keywords, all these different pieces of attention, this desperate generation of content, all comes together into a single place. This is where all those deeply weird
keywords come home to roost. You cross-breed the finger family video with some live-action superhero stuff, you add in some weird,
trollish in-jokes or something, and suddenly, you come
to a very weird place indeed. The stuff that tends to upset parents is the stuff that has kind of violent
or sexual content, right? Children’s cartoons getting assaulted, getting killed, weird pranks that actually
genuinely terrify children. What you have is software pulling in
all of these different influences to automatically generate
kids’ worst nightmares. And this stuff really, really
does affect small children. Parents report their children
being traumatized, becoming afraid of the dark, becoming afraid of their favorite
cartoon characters. If you take one thing away from this,
it’s that if you have small children, keep them the hell away from YouTube. (Applause) But the other thing, the thing
that really gets to me about this, is that I’m not sure we even really
understand how we got to this point. We’ve taken all of this influence,
all of these things, and munged them together in a way
that no one really intended. And yet, this is also the way
that we’re building the entire world. We’re taking all of this data, a lot of it bad data, a lot of historical data
full of prejudice, full of all of our worst
impulses of history, and we’re building that
into huge data sets and then we’re automating it. And we’re munging it together
into things like credit reports, into insurance premiums, into things like predictive
policing systems, into sentencing guidelines. This is the way we’re actually
constructing the world today out of this data. And I don’t know what’s worse, that we built a system
that seems to be entirely optimized for the absolute worst aspects
of human behavior, or that we seem
to have done it by accident, without even realizing
that we were doing it, because we didn’t really understand
the systems that we were building, and we didn’t really understand
how to do anything differently with it. There’s a couple of things I think
that really seem to be driving this most fully on YouTube, and the first of those is advertising, which is the monetization of attention without any real other variables at work, any care for the people who are
actually developing this content, the centralization of the power,
the separation of those things. And I think however you feel
about the use of advertising to kind of support stuff, the sight of grown men in diapers
rolling around in the sand in the hope that an algorithm
that they don’t really understand will give them money for it suggests that this
probably isn’t the thing that we should be basing
our society and culture upon, and the way in which
we should be funding it. And the other thing that’s kind of
the major driver of this is automation, which is the deployment
of all of this technology as soon as it arrives,
without any kind of oversight, and then once it’s out there, kind of throwing up our hands and going,
“Hey, it’s not us, it’s the technology.” Like, “We’re not involved in it.” That’s not really good enough, because this stuff isn’t
just algorithmically governed, it’s also algorithmically policed. When YouTube first started
to pay attention to this, the first thing they said
they’d do about it was that they’d deploy
better machine learning algorithms to moderate the content. Well, machine learning,
as any expert in it will tell you, is basically what we’ve started to call software that we don’t really
understand how it works. And I think we have
enough of that already. We shouldn’t be leaving
this stuff up to AI to decide what’s appropriate or not, because we know what happens. It’ll start censoring other things. It’ll start censoring queer content. It’ll start censoring
legitimate public speech. What’s allowed in these discourses, it shouldn’t be something
that’s left up to unaccountable systems. It’s part of a discussion
all of us should be having. But I’d leave a reminder that the alternative isn’t
very pleasant, either. YouTube also announced recently that they’re going to release
a version of their kids’ app that would be entirely
moderated by humans. Facebook — Zuckerberg said
much the same thing at Congress, when pressed about how they
were going to moderate their stuff. He said they’d have humans doing it. And what that really means is, instead of having toddlers being
the first person to see this stuff, you’re going to have underpaid,
precarious contract workers without proper mental health support being damaged by it as well. (Laughter) And I think we can all do
quite a lot better than that. (Applause) The thought, I think, that brings those
two things together, really, for me, is agency. It’s like, how much do we really
understand — by agency, I mean: how we know how to act
in our own best interests. Which — it’s almost impossible to do in these systems that we don’t
really fully understand. Inequality of power
always leads to violence. And we can see inside these systems that inequality of understanding
does the same thing. If there’s one thing that we can do
to start to improve these systems, it’s to make them more legible
to the people who use them, so that all of us have
a common understanding of what’s actually going on here. The thing, though, I think
most about these systems is that this isn’t, as I hope
I’ve explained, really about YouTube. It’s about everything. These issues of accountability and agency, of opacity and complexity, of the violence and exploitation
that inherently results from the concentration
of power in a few hands — these are much, much larger issues. And they’re issues not just of YouTube
and not just of technology in general, and they’re not even new. They’ve been with us for ages. But we finally built this system,
this global system, the internet, that’s actually showing them to us
in this extraordinary way, making them undeniable. Technology has this extraordinary capacity to both instantiate and continue all of our most extraordinary,
often hidden desires and biases, encoding them into the world, but it also writes them down
so that we can see them, so that we can’t pretend
they don’t exist anymore. We need to stop thinking about technology
as a solution to all of our problems, but think of it as a guide
to what those problems actually are, so we can start thinking
about them properly and start to address them. Thank you very much. (Applause) Thank you. (Applause) Helen Walters: James, thank you
for coming and giving us that talk. So it’s interesting: when you think about the films where
the robotic overlords take over, it’s all a bit more glamorous
than what you’re describing. But I wonder — in those films,
you have the resistance mounting. Is there a resistance mounting
towards this stuff? Do you see any positive signs,
green shoots of resistance? James Bridle: I don’t know
about direct resistance, because I think this stuff
is super long-term. I think it’s baked into culture
in really deep ways. A friend of mine,
Eleanor Saitta, always says that any technological problems
of sufficient scale and scope are political problems first of all. So all of these things we’re working
to address within this are not going to be addressed
just by building the technology better, but actually by changing the society
that’s producing these technologies. So no, right now, I think we’ve got
a hell of a long way to go. But as I said, I think by unpacking them, by explaining them, by talking
about them super honestly, we can actually start
to at least begin that process. HW: And so when you talk about
legibility and digital literacy, I find it difficult to imagine that we need to place the burden
of digital literacy on users themselves. But whose responsibility
is education in this new world? JB: Again, I think this responsibility
is kind of up to all of us, that everything we do,
everything we build, everything we make, needs to be made
in a consensual discussion with everyone who’s using it; that we’re not building systems
intended to trick and surprise people into doing the right thing, but that they’re actually involved
in every step in educating them, because each of these systems
is educational. That’s what I’m hopeful about,
about even this really grim stuff, that if you can take it
and look at it properly, it’s actually in itself
a piece of education that allows you to start seeing
how complex systems come together and work and maybe be able to apply
that knowledge elsewhere in the world. HW: James, it’s such
an important discussion, and I know many people here
are really open and prepared to have it, so thanks for starting off our morning. JB: Thanks very much. Cheers. (Applause)

100 thoughts on “The nightmare videos of children’s YouTube — and what’s wrong with the internet today | James Bridle”

  1. So around when my brother was 4–5, my mother let him watch YouTube. He watched these kinds of videos, and now if I try to get him off the TV or his phone, he gets aggressive. Physically aggressive. He is 10 years old and has not changed. I fear for when he is an adult.

    Edit: another example of these videos brainwashing kids: my Ex’s mom used to let her little brother watch these videos for hours and hours on end. I tried telling her about this, but her mom didn’t listen.

  2. What I find curious is the fact that none of the media corporations who own the copyright to Peppa Pig, Paw Patrol, Super Wings, Frozen, etc. seem to go after all those channels that steal their IPs. Not even Disney strikes copyright claims against them.

    Most of them are made in India BTW. You can tell by the accent.

  3. They need to make a "YouTube Kids" login so they can monitor all those videos and make it kid-safe

  4. I did get a tablet at I think 6, but I watched like, Minecraft mod reviews and stuff. I'm really glad that I never watched THIS

  5. Not only this, the ads that play on these videos, any videos for that matter, are not appropriate for children to see. If YouTube wants to keep the content family friendly, they should at least care about what ads are playing in front of these kids. I know at least one person knows what I'm talking about.

  6. One thing some parents think is that YouTube can babysit their child. They need to learn what it's like to play outside and see the REAL world. Kids that only know technology will be tech savvy, but they will lack other important learning experiences.

  7. Anyone remember when "Happy Tree Friends" was the only thing that could traumatize you on YouTube, besides this weird crap?

  8. It is also adding to animal cruelty. People are making fake animal rescue vids for views from animal lovers (to make money through donations and advertising). In doing so, they are intentionally causing great harm and cruelty to animals.

  9. Instead of using their money, time, and animation/game development skills to make educational content for kids, they make these freaking weird things

  10. we
    are
    fuuuuuuuuuuuuuuuuuuuuuuuuùuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuucked.

  11. The most disturbing thing to me is the audience laughter at 12:20. That kind of YouTube content he is speaking of can seriously damage people's mental health in the long run.

  12. I am an adult and this video was recommended to me while I was watching a surprise egg video.. You wish I was trolling. 🥵

  13. I think the way that children's brains are still developing etc. makes the vivid colors and the animation and the sense of travel to the world of the video content more exciting than adults can conceive, especially since we grew up in a much simpler, more natural world. My nephew watches these videos and I see him so engaged in what I see as the most boring content… he's actually in a trance, probably thinking about something entirely different, perhaps triggered by something in a video, but the imagery is hypnotic if only because it's a constant sound and motion… the video/audio is edited in such a way as to never get boring… unlike natural experience there are no pauses in conversation or action…

    It makes me think of how a person who is being shocked by electricity is locked in place until the current is turned off.

  14. This is like a plague. My kid got into this, so I had to remove YouTube. The problem is the bad quality and zero value.
    The other big problem is the adverts in kids’ apps and games – this is pure evil. I’m more than happy to pay for a good game, but often this is not an option – kids are being bombarded with stupid ads they need to watch – it drives them mad.

  15. I am glad I grew up with video games on the Sony PS2 console when I was a kid… nowadays, I am hoping for a Nintendo Switch.

  16. You know, we really need to have our uploaded videos tested by real experts — actual people who are experts — so that they can be safe to watch.

  17. My dad got on my case for watching TV shows all the time; I'm just grateful these weren't a huge thing back then

  18. The arrogance of the comments… Just self-congratulatory for growing up with whatever special and nostalgic "real" entertainment we had, compared to the poor kids being targeted by these unfeeling algorithms. Understand this dang video isn't about YOU, it's about the kids.

  19. Good talk, with no exaggerated cheesy intention. This shows just how disturbing humankind can get. I'm not a little kid, but I feel authentically disturbed and scared by just glancing at those kinds of videos. Like I actually get the darn creeps

  20. I see small kids everywhere, no more than 4 or 5 years old, holding smartphones and tablets in their hands and watching endless crap on YT. Good job on allowing them to waste hours on some of the most addictive technologies ever created!

  21. If you were born somewhere between 2004 and 2008, you would most likely have had Minecraft mod reviews and minigames as your early shield to block weird stuff like this.
    Thanks, Minecraft

  22. ttttttttttttttttttttttttttttttttttttttttttooooooooooooooo llllllllooooooooooooooonnnnnnnnnnggggggggggggg

  23. I’m happy that when I was little (and I mean like 4-year-old little; I’m just a preteen so I’m still a kid) my parents didn’t let me watch this mess. They monitored what I was watching and put on shows with lessons.

  24. I don't know why, but these days parents do not seem to bother about their children anymore. Every time I am dining in a restaurant or shopping at the mall, kids around 3-4 years old are watching YouTube and playing online games. When their kids begin to whine, parents will just stuff their kids with a mobile phone. I was really shocked by this scene. Come on parents, they are your children; raise them the right way. Don't spoil them with phones, tablets, YouTube or any other things. Just spend your time with them, play family games, go on a picnic or whatever that is fine and healthy. They are too young to be exposed to the YouTube environment. And next time, when they grow up, when they enter primary or high school, they will be too obsessed with YouTube and phones and will forget and ignore their academics and curriculum results. And these irresponsible parents will just blame their kids for not paying attention and not putting in enough effort, without acknowledging the fact that they are actually the culprit behind all of this.

  25. This is beyond sickening. We need to call a spade a spade here, folks. Namely, that parents have been relying on technology to raise their children for them. Parents need to do their job and actually PARENT their children, instead of plopping them in front of a device and relying on that. How fucked is this whole thing? Can we really be shocked that this type of content is out there, when there are ppl enabling it? And I’m not talking about kids here. I’m talking about adults. The ones with fully formed brains. Simply bizarre.

  26. It is not by accident! That's what's really frightening. Look at other children's programming. Then check out the games. We must protect their hearts and minds, for they are the future.

  27. Oh no, things change! Why don't kids play with rocks anymore? Neither the medium nor the guy creating low-quality content is to blame. It's all on the parents.

  28. The only pro of Article 13 is that these nightmares will go away, but I would definitely want an alternative, since memes would be banned

  29. Take some time to be face to face with your kids! Teach them to count, memorize countries/continents, multiplication/division! Don’t just stick an iPad in front of them and expect them to not come across this stupid stuff.

  30. YouTube idea: an AI should be programmed with a list of non-kid-friendly content and should watch each video (with audio) before it goes live, checking for things that are not kid friendly. Anything not kid friendly would be blocked from being watched when kid-friendly mode is on for an account. You could turn kid-friendly mode on or off, but you'd have to change the password once a week or so, so your kids can't remember the password and watch non-kid-friendly content on the account, and you'd need an account or a guest account to use YouTube. Now, security to protect safety mode: if someone, like a hacker, tries changing YouTube's script or code and it is not a planned time for an update, YouTube's system should disconnect from the device for a certain number of days or hours before reconnecting. In that time, a message would be sent to the device asking about the activity and asking for the YouTube code, which changes whenever YouTube changes it. There, I solved the problem! I think.

  31. We could all go back in time and watch Jimmy Savile again. That's what I watched as a kid. Which is better?

  32. This is horrifying… what are the children raised on these videos going to end up like? Why are parents so insanely irresponsible?

  33. There's no conscience, nor personality, in any of these videos. The view counts (the eyeballs) are outstandingly large, and the viewers are children. At this early stage, a human being learns about who they are and the world around them. They also begin to learn about their thoughts and their emotions and how to think. Because these videos are based on algorithms and policed by robots, that means the source of information is being taught by A.I.s — artificial intelligence. Hence, going back to why these videos have no conscience, nor personality. So what effect will all of this have? What kind of thinking and behavioral system will this generation reveal to us in the future?

  34. 8:03 the one in the top left corner is probably just a meme. Please let it be just a meme. Please.

  35. I watched YouTube as a kid, but only from when I was around 7; before then I watched PBS. My parents monitored me pretty well and I had a limit. I remember watching stampylongnose and iballisticsquid and playing Minecraft; those were the best years of my life

  36. I didn't start watching YouTube until I was around 9 or 10 and even then all I was watching was fucking SSundee

  37. I’m happy I watched Sesame Street on the television when I was a kid. On the computer I played educational games 😂. Happy to have lived without internet until the last grades of preschool.

  38. omg this is fucking terrifying… extremely disturbed. This guy is really impressive. "A guide to what our problems actually are." One of the few TED talks where I actually learned something/got an entirely new perspective.
