It's important to look at YouTube's role in this, even though most of the focus has been on Twitter and Facebook. The video doesn't get into the main stuff, but:
There's also a great article, though, that you should read in full:
Chaslot was fired by Google in 2013, ostensibly over performance issues. He insists he was let go after agitating for change within the company, using his personal time to team up with like-minded engineers to propose changes that could diversify the content people see.
He was especially worried about the distortions that might result from a simplistic focus on showing people videos they found irresistible, creating filter bubbles, for example, that only show people content that reinforces their existing view of the world. Chaslot said none of his proposed fixes were taken up by his managers. "There are many ways YouTube can change its algorithms to suppress fake news and improve the quality and diversity of videos people see," he says. "I tried to change YouTube from the inside but it didn't work."
The software Chaslot wrote was designed to provide the world's first window into YouTube's opaque recommendation engine. The program simulates the behaviour of a user who starts on one video and then follows the chain of recommended videos – much as I did after watching the Logan Paul video – tracking data along the way.
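(Chaslot's actual tool isn't public here, but the crawl the article describes can be sketched as a simple depth-limited walk over the recommendation graph. In this sketch, `up_next` is a hypothetical stand-in for whatever scraping of YouTube's "up next" panel the real program did, fed with canned data so it runs on its own:)

```python
from collections import Counter

def crawl_recommendations(seed, up_next, depth=3, per_video=2):
    """Follow chains of recommended videos from a seed video,
    counting every recommendation encountered along the way."""
    seen = Counter()
    frontier = [seed]
    for _ in range(depth):
        next_frontier = []
        for video in frontier:
            # Take only the top few "up next" suggestions, as a
            # user clicking through recommendations would.
            for rec in up_next(video)[:per_video]:
                seen[rec] += 1
                next_frontier.append(rec)
        frontier = next_frontier
    return seen

# Canned "up next" data standing in for the real recommendation panel.
FAKE_RECS = {
    "seed": ["a", "b"],
    "a": ["c", "a"],
    "b": ["c", "b"],
    "c": ["c"],
}

counts = crawl_recommendations("seed", lambda v: FAKE_RECS.get(v, []))
# Note how "c" dominates the counts: a video that many chains funnel
# into gets recommended far more often than any starting point.
```

The point of counting rather than just listing is the same one the article makes: what matters is not which videos exist, but which ones the chains keep converging on.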
When his program found a seed video by searching the query "who is Michelle Obama?" and then followed the chain of "up next" suggestions, for example, most of the recommended videos said she "is a man". More than 80% of the YouTube-recommended videos about the pope detected by his program described the Catholic leader as "evil", "satanic", or "the anti-Christ". There were literally millions of videos uploaded to YouTube to satiate the algorithm's appetite for content claiming the earth is flat. "On YouTube, fiction is outperforming reality," Chaslot says.
He believes one of the most shocking examples was detected by his program in the run-up to the 2016 presidential election. As he observed in a short, largely unnoticed blogpost published after Donald Trump was elected, the impact of YouTube's recommendation algorithm was not neutral during the presidential race: it was pushing videos that were, in the main, helpful to Trump and damaging to Hillary Clinton. "It was strange," he explains to me. "Wherever you started, whether it was from a Trump search or a Clinton search, the recommendation algorithm was much more likely to push you in a pro-Trump direction."
Source.
I think most people have absolutely no idea how much influence these algorithms have in shaping society: they are critical parts of the largest social institutions humanity has ever had (more people use Facebook than there are followers of Christianity, after all). History, even recent history tied to the development of the Internet, or the Cambridge Analytica story, tells us that whoever can best manipulate the information out there comes out on top, and we seem to keep failing to learn from that.