#098: Hockey-Stick Problem Growth

A recent video by Matt Watson details a soft-core pedophilia ring using the platform not only for hosting, but also to connect, trade, and link to actual child pornography in the comments (Video). And it’s not just that you can find these kinds of videos on YouTube: It’s not even that hard to find them. YouTube’s recommendation engine will do the work for you, needing only a few clicks to get you from an unobjectionable video to a very objectionable one.

It’s not the first time YouTube has had to deal with issues like this. In 2017, “ElsaGate” revolved around YouTube’s recommendation algorithm offering sexualized, disturbing content to children. And this time is not really any different: “The Verge” has been able to recreate the situation Watson’s video describes, going from innocuous videos to ones with predatory comments with only a few clicks on recommended videos. YouTube has since removed several of the videos and comments featured in the video. However, it’s clear that there’s a lot more lurking on the platform, including content hidden behind private channels[1]. YouTube is even showing ads on those videos, meaning whoever uploaded them can make money off them.

YouTube’s current efforts to combat these problems rely on users flagging problematic content, as well as machine learning algorithms that are supposed to find and remove offending content automatically. Obviously, those algorithms aren’t doing a good job, and sometimes even target innocent channels. Recently, a number of Pokémon GO and Club Penguin channels were banned by YouTube because they used the term “CP” in their titles. YouTube’s machine learning algorithms took this to indicate “child porn”[2], and took down the videos as well as the channels. All of this happened automatically; if any human had actually watched the videos in question, they would not have found any offending material.

And even if a human flags a video that is subsequently removed, the uploaders are rarely banned, leaving them free to upload more. Watson further questions YouTube’s efforts, as most of these videos were uploaded by suspicious accounts and had clearly problematic comments below them. It should have been easy for YouTube to identify and remove them.

Now that advertisers are pulling their ads from YouTube, it has started to aggressively remove offending videos and channels. It is unclear why YouTube waited for advertiser pressure to do so. Additionally, this hurts creators on its platform who rely on ad revenue for income. No advertisers means no income for them, and unlike YouTube, most do not have a wealthy parent company that can keep them funded if need be.

YouTube’s continuing failure to address the many problems with its recommendation and review algorithms makes this a home-grown problem. YouTube is quick to react when it faces significant financial pressure, doing everything needed to pacify its advertising partners. But until it faces up to the core problems its technology creates, not only for itself but for our society, and actually starts fixing them, things like this will keep happening.

Misinformation Black Hole

Pinterest, like Facebook, has had their own troubles with Anti-Vaxxers and other misinformation groups using them to spread their propaganda. But unlike Facebook, Pinterest has now taken an unusual step to combat this problem: It no longer displays search results for “polluted” terms, and no longer allows pins for URLs that are known sources of misinformation:

“We are doing our best to remove bad content, but we know that there is bad content that we haven’t gotten to yet,” explained Ifeoma Ozoma, a public policy and social impact manager at Pinterest. “We don’t want to surface that with search terms like ‘cancer cure’ or ‘suicide’. We’re hoping that we can move from breaking the site to surfacing only good content. Until then, this is preferable.”

While it’s still possible to find misinformation on Pinterest via other search terms, this is a novel approach to combating these kinds of “data voids”.

Interplanetary Shooting

If you meet someone new, it is somewhat inconsiderate to also shoot them. Yet that’s exactly what JAXA’s Hayabusa2 probe did to asteroid Ryugu. It’s all in the name of science, though: By shooting a small bullet into the asteroid’s surface, the probe was able to collect and store several samples. It will repeat this procedure twice more, with the third round even intended to take a sub-surface sample, before starting the long trek back to Earth. It will arrive in December 2020, when scientists will be able to examine the collected samples.

Space Peanut

In the meantime, on the edge of the solar system, photos sent back by New Horizons reveal Ultima Thule to look like, well, a flattened peanut. No, really, look for yourself:

📖 Weekly Longreads 📚

“A shocking number of shootings go unsolved. In some police departments, hundreds of cases aren’t investigated at all”: Shoot Someone In A Major US City, And Odds Are You’ll Get Away With It.

🦄 Unicorn Chaser 🦄

SNUB, SNUQ, and KERMA: Bizarre Units used by Scientists

  1. Videos on those accounts aren’t publicly available; you need to follow (and be approved by) the channel owner first. 

  2. “CP”, in Pokémon parlance, refers to a Pokémon’s “combat power”.