TheirTube – How do the recommended videos look on their YouTube home page?
This whole project started when I was in a heated discussion with a person who thought climate change was a hoax and 9/11 was a conspiracy. Through conversations with him, I was surprised to learn that he thought everyone’s YouTube feed had the same information as his own feed. When we showed each other our YouTube homepages, we were both shocked. They were radically different. And it got me thinking about the need for a tool to step outside of information bubbles.
Looking for something to watch when you’ve got too much time on your hands?
The top 10 best 10-hour long videos on YouTube – Lifewire
It’s pretty unlikely that most viewers actually sit there to watch one of these excruciatingly long videos in full, but that’s not really the point. The point is that a 10-hour version of a popular video or meme simply exists, and that’s what makes it at least ten times funnier than the original.
The recent wildfire in California has been devastating for the towns and communities involved. This video gives us a glimpse of what some people have had to face. It’s worth pointing out that this wasn’t filmed at night.
A video on YouTube, shared via the Guardian News channel. But it’s YouTube that’s again in the news, over the blatantly false videos it hosts and the mechanisms for publicising them.
YouTube lets California fire conspiracy theories run wild
The Camp Fire in California has killed at least 79 people, left 699 people unaccounted for, and displaced more than a thousand residents of Butte County, California. In these circumstances, reliable information can literally be a matter of life [and] death. But on YouTube, conspiracy theories are thriving.
I don’t want to go into the theories themselves; the specifics aren’t important.
But the point isn’t that these conspiracy theorists are wrong. They obviously are. The point is that vloggers have realized that they can amass hundreds of thousands of views by advancing false narratives, and YouTube has not adequately stopped these conspiracy theories from blossoming on its platform, despite the fact that many people use it for news. A Pew Research survey found that 38% of adults consider YouTube a source of news, and 53% consider the site important for helping them understand what’s happening in the world.
Combine those statistics with these, from the Guardian, and you can see the problem.
Study shows 60% of Britons believe in conspiracy theories
“Conspiracy theories are, and as far as we can tell always have been, a pretty important part of life in many societies, and most of the time that has gone beneath the radar of the established media,” said Naughton. “Insofar as people thought of conspiracy theories at all, we thought of them as crazy things that crazy people believed, [and that] didn’t seem to have much impact on democracy.”
That dismissive attitude changed after the Brexit vote and the election of Trump in 2016, he said.
And that’s why these accounts of conspiracy theory vloggers manipulating YouTube to get millions of views for their blatantly false videos are so important.
The issue of conspiracy theorists in a society far predates platforms like YouTube. However, platforms such as YouTube provide new fuel and routes through which these conspiracy theories can spread. In other words, conspiracy theories are the disease, but YouTube is a whole new breed of carrier.
How on (the round) earth can we fix this?
James Bridle on why technology is creating a new dark age
Bridle is already well-known for his creative critiques of modern technology, including the 2012 drone-tracking project Dronestagram, a salt circle that traps self-driving cars, and last year’s influential essay about creepy YouTube kids’ videos. New Dark Age integrates these critiques into a larger argument about the dangers of trusting computers to explain (and, increasingly, run) the world. As Bridle writes, “We know more and more about the world, while being less and less able to do anything about it.”
And, as he explains in this piece for The Observer, the problem with those YouTube videos has not gone away.
How Peppa Pig became a video nightmare for children
As a result, while many videos have since been removed from the website, uncountable numbers still remain. In March, Wired catalogued a slew of violent accounts and demonstrated that it was possible to go from a popular children’s alphabet video to a Minnie Mouse snuff film in 14 steps, just by following YouTube’s own recommendations. As of last week, Googling the title of one of the now-removed videos mentioned in the New York Times article (“PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized”) results in a link to a near-identical video still hosted on the site (“PAW PATROL Babies Pretend To Die MONSTER HANDS From MIRROR! Paw Patrol Animation Pups Save For Kids”), in which the adorable pups don a freakish clip-art monster mask to terrify one another before being lured off a rooftop by a haunted doll. Is “Save For Kids” supposed to read “Safe For Kids”? Either way, it is not, and it’s obvious that just playing whack-a-mole with search terms and banned accounts is never going to solve entangled problems of copyright infringement, algorithmic recommendation, and ad-driven monetary incentives on a billion-view platform with no meaningful human oversight.
The weirdness of YouTube videos, the extremism of Facebook and Twitter mobs, the latent biases of algorithmic systems: all of these have one thing in common with the internet itself, which is that – with a few dirty exceptions – nobody intentionally designed them this way. This is perhaps the strangest and most salutary lesson we can learn from these examples, if we choose to learn at all. The weirdness and violence they produce seems to be in direct correlation to how little we understand their workings – and how much is hidden from us, deliberately or otherwise, by the demands of efficiency and ease of use, corporate and national secrecy, and sheer, planet-spanning scale. We live in an age characterised by the violence and breakdown of such systems, from global capitalism to the balance of the climate. If there is any hope for those exposed to its excesses from the cradle, it might be that they will be the first generation capable of thinking about global complexity in ways that increase, rather than reduce, the agency of all of us.
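The Wired finding quoted above — going from a children’s alphabet video to disturbing content “in 14 steps, just by following YouTube’s own recommendations” — is, in effect, a shortest-path measurement over the recommendation graph. As a minimal sketch of that idea (the video IDs and the `recommendations` mapping below are entirely invented for illustration, not real YouTube data or any real API), a breadth-first search counts the minimum number of recommendation clicks between two videos:

```python
from collections import deque

# Toy recommendation graph: each video ID maps to the videos the
# platform might surface next. All IDs and edges are hypothetical.
recommendations = {
    "alphabet_song": ["nursery_rhymes", "counting_video"],
    "nursery_rhymes": ["counting_video", "parody_clip"],
    "counting_video": ["parody_clip"],
    "parody_clip": ["disturbing_video"],
    "disturbing_video": [],
}

def steps_between(graph, start, target):
    """Return the minimum number of recommendation clicks needed to
    reach `target` from `start`, or None if it is unreachable."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        video, depth = queue.popleft()
        if video == target:
            return depth
        for nxt in graph.get(video, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None

print(steps_between(recommendations, "alphabet_song", "disturbing_video"))  # 3
```

The point of the sketch is only that such paths are short and mechanically discoverable: a crawler doing nothing smarter than this can reproduce the kind of audit Wired performed.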
From James Bridle, a very disturbing look at the absolute minefield that YouTube has become, thanks to a combination of trolls, automation and algorithms. I’m finding it hard to choose a paragraph to quote, here — you really need to read the whole thing.
Something is wrong on the internet
What we’re talking about is very young children, effectively from birth, being deliberately targeted with content which will traumatise and disturb them, via networks which are extremely vulnerable to exactly this form of abuse. It’s not about trolls, but about a kind of violence inherent in the combination of digital systems and capitalist incentives. It’s down to that level of the metal.
This, I think, is my point: The system is complicit in the abuse. […]
What concerns me is that this is just one aspect of a kind of infrastructural violence being done to all of us, all of the time, and we’re still struggling to find a way to even talk about it, to describe its mechanisms and its actions and its effects.