Why does YouTube recommend these videos?


In February 2018, a 19-year-old man shot dead 17 people at his former school in Parkland, Florida. In the days that followed, survivors called for stricter gun laws in interviews. David Hogg was among them. The then-17-year-old became one of the faces of the youth movement and appeared on television every day, debating conservative TV presenters and the gun lobby. His sharp criticism of the gun laws earned him a lot of sympathy.

At the same time, YouTube painted a completely different picture of the young activist. Anyone who searched for David Hogg on the video platform in those days came across a sensational video claiming that Hogg was a crisis actor being "coached" for his TV interviews. The accusation quickly turned out to be a conspiracy theory. Nevertheless, YouTube contributed to the video's spread: anyone who watched a video about Parkland was often automatically recommended it, and after a short time it was number one in the trending section. The false accusation spread rapidly, and Hogg became even more of a target: strangers threatened him and his family with death.

"One of the most powerful instruments for radicalization in the 21st century"

It's not the first time YouTube has been criticized for its recommendations. The platform has long been accused of dragging its users into a vortex of politically radical or emotionally shocking videos. YouTube is “perhaps one of the most powerful instruments for radicalization in the 21st century,” writes sociologist Zeynep Tufekçi. The focus of the criticism is the YouTube algorithm.

How exactly the algorithm works is not easy to answer, because Google, YouTube's parent company, generally does not disclose its algorithms. The criteria by which videos are evaluated therefore remain unknown. Why YouTube picks one video out of the crowd is supposed to remain a secret.

Scientists and journalists are trying to shed light on the algorithm. Guillaume Chaslot, a former Google employee, has been researching YouTube's recommendations since 2016. The IT expert wrote a program that simulates the behavior of an ordinary user: it starts with a video, follows the chain of recommendations and saves the results. The result after thousands of runs: YouTube increasingly recommends videos that are divisive, sensational and conspiratorial.
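Chaslot has described his method only in outline, but the core idea fits in a few lines. The sketch below imitates it in Python with an entirely invented recommendation graph standing in for the data his real crawler scraped from the live site; all video names and links are hypothetical.

```python
import random
from collections import Counter

# Toy stand-in for YouTube's "Up next" list: an invented graph in which
# every video links to a few others. Chaslot's real crawler collected
# this data from the live site; the IDs and links here are made up.
TOY_RECOMMENDATIONS = {
    "news_report":  ["analysis", "talk_show", "conspiracy_1"],
    "analysis":     ["talk_show", "conspiracy_1"],
    "talk_show":    ["conspiracy_1", "conspiracy_2"],
    "conspiracy_1": ["conspiracy_2", "conspiracy_3"],
    "conspiracy_2": ["conspiracy_1", "conspiracy_3"],
    "conspiracy_3": ["conspiracy_1", "conspiracy_2"],
}

def follow_chain(seed, depth=20):
    """Simulate a user who starts at `seed` and keeps clicking one of
    the recommendations, `depth` times in a row."""
    path, current = [seed], seed
    for _ in range(depth):
        recs = TOY_RECOMMENDATIONS.get(current, [])
        if not recs:
            break
        current = random.choice(recs)
        path.append(current)
    return path

def run_study(seed, runs=1000):
    """Repeat the walk many times and count which videos users are
    funneled toward most often."""
    counter = Counter()
    for _ in range(runs):
        counter.update(follow_chain(seed))
    return counter.most_common()

print(run_study("news_report"))
```

Because the invented conspiracy cluster links only back to itself, almost every simulated walk ends up circling inside it. That funneling effect, measured at scale on real recommendation data, is what Chaslot's thousands of runs made visible.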

For YouTube, all that matters is that users keep watching

The British newspaper "The Guardian" reviewed Chaslot's data for the search terms "Clinton" and "Trump" in the period before the 2016 US election. The journalists confirmed his findings and found a "surprising amount of conspiratorial content and fake news" directed against Clinton. In addition, YouTube recommended pro-Trump videos six times more often than those of his opponent. An investigation by the "Wall Street Journal" came to a similar conclusion. YouTube rejected the findings, arguing that the studies' methods were flawed.

Why would YouTube have promoted these videos in particular? There is no evidence that YouTube intentionally programmed its algorithms to benefit Trump. YouTube's business model plays a much more important role.

How to keep YouTube from talking you into conspiracy videos

1) Check who is behind the video. Is it a reputable account that, for example, discloses its sources?

2) Don't blindly trust the algorithm. YouTube often recommends videos that add even more fuel to the fire.

3) Don't let anyone talk you into anything. Compare the information with other websites.

4) The longer you watch, the more ads YouTube sells. The corporation is therefore doing everything it can to keep you on screen. Make a conscious decision whether you want to keep watching or switch off.

5) Don't follow the crowd. Just because a video gets a lot of clicks doesn't mean it's true. Ask yourself why a video is so popular.

YouTube makes money from the advertising it plays before and during videos and displays on the website. Tufekçi explains that Google is selling our attention to companies that pay for it. “The longer people stay on YouTube, the more money Google makes.” That's why YouTube geared its algorithm toward a video's engagement: it amplifies those videos that get people to keep watching, write comments or post likes. The main thing is that users stay on the site.
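YouTube's actual ranking formula is not public. The toy score below, with invented weights and field names, only illustrates the principle the paragraph describes: of the engagement signals, watch time weighs heaviest, because time on site is what sells ads.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    avg_watch_minutes: float  # how long viewers typically stay
    comments: int
    likes: int

def engagement_score(v: Video) -> float:
    # Invented weights for illustration: watch time dominates,
    # comments and likes count for less.
    return 10.0 * v.avg_watch_minutes + 0.5 * v.comments + 0.1 * v.likes

videos = [
    Video("calm explainer", avg_watch_minutes=3.0, comments=40, likes=900),
    Video("outrage clip", avg_watch_minutes=9.0, comments=800, likes=700),
]

# Recommend whatever keeps people engaged the longest.
for v in sorted(videos, key=engagement_score, reverse=True):
    print(f"{engagement_score(v):7.1f}  {v.title}")
```

Under these made-up numbers, the outrage clip beats the calm explainer comfortably, even though the explainer has more likes: what counts is holding attention.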

Today, YouTube more often blocks channels that stir up hatred

Judging by the algorithm's output, these criteria are met primarily by sensational and conspiratorial videos. Hardly anything is clicked as often as videos that stoke anger and fear; click by click, they provide scapegoats and confirm prejudices. What's more, the consumers of such videos spend a lot of time on YouTube. The algorithm learns from them and recommends similar videos to the rest of its users, on the logic that whatever these viewers are watching must be particularly interesting.
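None of this is YouTube's real code, but the feedback loop the paragraph describes can be imitated in a few lines. In this crude simulation, under invented probabilities, the video that holds attention slightly better ends up dominating the recommendations.

```python
import random

# Two hypothetical videos with different "pull": the probability that
# a viewer keeps watching once the video has been recommended.
PULL = {"sober_report": 0.3, "fear_and_anger": 0.7}
watch_time = {name: 1.0 for name in PULL}  # small seed values

for _ in range(10_000):
    # The recommender favors videos in proportion to the watch time
    # they have already accumulated ("what heavy users watch must be
    # interesting") ...
    names = list(PULL)
    pick = random.choices(names, weights=[watch_time[n] for n in names])[0]
    # ... and a recommended video gains more watch time if the viewer stays.
    if random.random() < PULL[pick]:
        watch_time[pick] += 1.0

total = sum(watch_time.values())
for name, t in watch_time.items():
    print(f"{name}: {t / total:.0%} of accumulated watch time")
```

The gap between the two videos grows with every round: more watch time means more recommendations, which means more watch time. That rich-get-richer dynamic is the vortex the critics describe.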

YouTube has known about all of these problems for a long time, as former employees such as the IT expert Chaslot have reported. Chaslot accuses the company of deliberately accepting the toxic content in order to keep users on the site longer. YouTube denies these allegations and says its algorithms are constantly being developed further and adapted to the latest findings. More often than before, YouTube blocks channels that upload inflammatory or conspiratorial videos. Recently, the algorithm is also supposed to take the quality of videos into account, supported by human moderators. Another update was announced not long ago: viewers who have watched conspiracy theories will increasingly be suggested videos from credible media.

Whether YouTube will prevent experiences like those of Parkland student Hogg in the future is questionable, because the goal of the algorithm is unchanged: to keep users on the platform for as long as possible and thus increase advertising revenue.

Photos: Hahn & Hartung

This text was published under the license CC-BY-NC-ND-4.0-DE. The photos may not be used.