enzo_gt

Member
Oct 25, 2017
6,299
Important to look at YouTube's role in this, despite most of the focus being on Twitter and Facebook. The video doesn't get into the main stuff, but:



Great article that you should read in full, by the way:

Chaslot was fired by Google in 2013, ostensibly over performance issues. He insists he was let go after agitating for change within the company, using his personal time to team up with like-minded engineers to propose changes that could diversify the content people see.

He was especially worried about the distortions that might result from a simplistic focus on showing people videos they found irresistible, creating filter bubbles, for example, that only show people content that reinforces their existing view of the world. Chaslot said none of his proposed fixes were taken up by his managers. "There are many ways YouTube can change its algorithms to suppress fake news and improve the quality and diversity of videos people see," he says. "I tried to change YouTube from the inside but it didn't work."
The software Chaslot wrote was designed to provide the world's first window into YouTube's opaque recommendation engine. The program simulates the behaviour of a user who starts on one video and then follows the chain of recommended videos – much as I did after watching the Logan Paul video – tracking data along the way.
When his program found a seed video by searching the query "who is Michelle Obama?" and then followed the chain of "up next" suggestions, for example, most of the recommended videos said she "is a man". More than 80% of the YouTube-recommended videos about the pope detected by his program described the Catholic leader as "evil", "satanic", or "the anti-Christ". There were literally millions of videos uploaded to YouTube to satiate the algorithm's appetite for content claiming the earth is flat. "On YouTube, fiction is outperforming reality," Chaslot says.
He believes one of the most shocking examples was detected by his program in the run-up to the 2016 presidential election. As he observed in a short, largely unnoticed blogpost published after Donald Trump was elected, the impact of YouTube's recommendation algorithm was not neutral during the presidential race: it was pushing videos that were, in the main, helpful to Trump and damaging to Hillary Clinton. "It was strange," he explains to me. "Wherever you started, whether it was from a Trump search or a Clinton search, the recommendation algorithm was much more likely to push you in a pro-Trump direction."

Source.
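For a concrete picture of the methodology, here's a minimal sketch of the kind of crawl the article describes: start from a seed video, follow the top "up next" suggestions a few levels deep, and tally what gets recommended. The fetch_up_next() helper is a hypothetical stand-in; Chaslot's actual tool scrapes YouTube's watch pages, which isn't reproduced here.

```python
from collections import Counter

def fetch_up_next(video_id):
    """Hypothetical helper: return the list of 'up next' video IDs
    YouTube shows next to `video_id`. A real crawler (like Chaslot's)
    would scrape the watch page here; left as a stub."""
    raise NotImplementedError

def crawl(seed_id, depth=4, branch=3):
    """Breadth-first walk over recommendation chains from one seed
    video, tallying how often each video is recommended on the way."""
    counts = Counter()
    frontier = [seed_id]
    for _ in range(depth):
        nxt = []
        for vid in frontier:
            recs = fetch_up_next(vid)[:branch]  # follow top suggestions only
            counts.update(recs)
            nxt.extend(recs)
        frontier = nxt
    return counts.most_common()  # most frequently pushed videos first
```

Run the same crawl from different seeds ("who is Michelle Obama?", a Trump search, a Clinton search) and compare the top of the tallies; that comparison is what surfaced the skew described above.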

I think most people have absolutely no idea how much influence these algorithms have on shaping society, as they are critical parts of the largest social institutions humanity has ever had; more people use Facebook than there are followers of Christianity, after all. History, even just the recent history tied to the development of the Internet or the Cambridge Analytica stuff, tells us that whoever can best manipulate the information that's out there ends up on top, and we seem to continually fail to learn from that.
 

DigitalOp

Member
Nov 16, 2017
9,369
Just like in the Slavery thread, this wouldn't matter as much if people were properly educated enough to verify information and identify credible sources. But in a capitalist information age that generates revenue based on clicks, you see bubble viewpoints taking hold of the populace. And all it took was one ignorant populist demagogue to scream "Fake News!", and an entire era of misinformation, confusion, and accepted ignorance was born.

It's very much possible that we may never recover from this.

And lastly, to add: the technocrats were complicit in all of this. Facebook, YouTube, and Twitter. Complicit.
 

830920

Member
Oct 29, 2017
781
Not surprising at all. I never get recommended things I would actually like based on what I watch, but as soon as I watch one questionable video I will get recommended alt-right stuff for weeks. Like what the hell is going on there?
 

Deleted member 1041

User requested account closure
Banned
Oct 25, 2017
10,725
So this is the first of many elections influenced by social media, huh. We need the Patriots to contextualize this flow of information, in all honesty.
 

jman2050

Avenger
Oct 25, 2017
5,897
I've been alarmed over the past several years at just how readily both individuals and institutions of the world are placing their trust in algorithms without considering where those algorithms are coming from.

I do worry about a future where our ability to use math to solve problems continues to improve even beyond what it is now. No matter how advanced our technology gets, there will always be human interest behind every tiny piece of processing power.
 

DigitalOp

Member
Nov 16, 2017
9,369
If something isn't done to fund, fix, and support our education system, you're probably right.

I'd like to hope I'm wrong, but the odds are so slim that it's better to prepare for the worst.

Not surprising at all. I never get recommended things I would actually like based on what I watch, but as soon as I watch one questionable video I will get recommended alt-right stuff for weeks. Like what the hell is going on there?

Put it like this:

YouTube, Twitter, and Facebook do not care that a sizable portion of users are using their platforms to spread harmful ideologies and misinformation, as long as it keeps those sizable groups of people on their platforms. It's all about money.

Facebook had intelligence beforehand that Russia was using the platform to influence the population, and the ads were even paid for in rubles... Facebook accepted the payments anyway.
 
Oct 28, 2017
699
Psyops at it again, boys and girls. This is nothing more than a dent created by those responsible for psychological warfare on our minds.
 

LakeEarth

Member
Oct 27, 2017
8,228
Ontario
I believe it. You watch one stand-up video by Bill Burr, and suddenly YouTube starts recommending Red Pill shit. It's awful.

Also, Alex Jones's channel is categorized as "news" on YouTube. Fuck you, Google.
 

Garmonbozia

Member
Oct 27, 2017
592
Things like this almost make me believe that the future won't be that different from a cyberpunk story. Giant technological conglomerates basically selling out human decency and meaningful communication.
 

UltraGunner

Member
Oct 25, 2017
11,213
Los Angeles, CA
I like to think that the radicalization of many young white men towards far right ideologies comes down to laziness on the part of these tech companies to change their algorithm, because the alternative is nightmarish to say the least.
 

Deleted member 862

User requested account closure
Banned
Oct 25, 2017
8,646
"Our search and recommendation systems reflect what people search for, the number of videos available, and he videos people choose to watch on YouTube."

They still don't get (or don't care) that this system is fundamentally broken and a downward spiral. The fact that it's so easy to prove should give them pause, but it doesn't.
 

Occam

Member
Oct 25, 2017
2,510
I noticed the same thing. You watch something factual/scientific, and next you are linked to fake nonsense/clickbait.
That's why I keep deleting the cache/collected data of the third-party app I'm using to play YouTube videos on Android; this keeps the nonsense at bay for a while.

No wonder so many weak-minded people without tech knowledge start believing 9/11 conspiracy theories etc. It's basically all they are shown.
 

Deleted member 29464

Account closed at user request
Banned
Nov 1, 2017
3,121
I have a feeling there's a lack of willpower at the top of these companies to do anything about this, mostly due to money or other agendas. I don't know what could be done. I'm constantly having to tell YouTube these days that I'm not interested in some of the crap it recommends me.
 
Oct 25, 2017
3,789
The algorithm is designed to maximize view time; the interesting part is that it has such strangely broad consequences as to shape society itself. I mean, imagine being the one who implemented it: you just want people to find more Minecraft videos, but due to the scale and amount of user-generated content, you've literally altered how an entire group of people thinks without ever knowing it. Incredible, really. Facebook is the same way; it's only now, after a billion-plus users and a decade in the field, that you're starting to see some of those society-warping effects. Even science fiction didn't see that one coming.
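To put it in code, here's a toy sketch (my own illustration, not YouTube's actual system) of a ranker whose only objective is predicted watch time. Truthfulness exists in the data but never enters the sort key, which is the whole problem:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    predicted_watch_minutes: float  # output of some engagement model
    is_accurate: bool               # known to us, invisible to the ranker

def rank(candidates):
    # Sort purely by predicted engagement; accuracy never enters the key.
    return sorted(candidates, key=lambda c: c.predicted_watch_minutes,
                  reverse=True)

feed = rank([
    Candidate("sober_explainer", 4.0, True),
    Candidate("outrage_conspiracy", 11.5, False),  # wins the top slot
])
print([c.video_id for c in feed])  # ['outrage_conspiracy', 'sober_explainer']
```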
 

SegFault

Member
Oct 25, 2017
1,939
Algorithms are written by people and have implicit biases bestowed upon them by their creators.
 

ashep

Banned
Oct 25, 2017
1,703
Google manipulating results has been a thing forever. Do people not remember when they completely squashed rapgenius.com?
 
Oct 27, 2017
7,756
I'm not surprised by this at all. Shitheel antisocial basement dwellers, literal nazis, and your old racist uncle have found refuge together on these social platforms that utilize recommender algorithms to push people further and further down their own shitty rabbit holes.
 

Nightwing123

Member
Oct 27, 2017
5,435
I definitely believe this, since every time I watch a video that is even a little bit political, I end up getting some alt-right video recommendations.
 
Oct 25, 2017
3,789
Algorithms are written by people and have implicit biases bestowed upon them by their creators.

This isn't really how it works. Most of the time, the biases are informed by behavior. Say I want to use machine learning to predict who's a criminal by looking at them (a sort of precog algorithm). Well, it's likely to discover that being black is a good prediction feature, simply because more black people get arrested. That's not creator bias, that's societal bias. Or remember Microsoft unleashing Tay on Twitter as a blank slate and it repeating bigoted garbage? In fact, part of the problem is that we wish algorithms would correct for biases, and that requires specific intervention to make them do the thing we want to see, not the thing that actually happens.
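To make that concrete, here's a toy sketch on synthetic data (my own assumptions, not any real system): two groups offend at exactly the same rate, but one is arrested twice as often when offending, so a model trained on arrest records "learns" group membership as a predictive feature even though no engineer put that bias in.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

group = rng.integers(0, 2, n)           # protected attribute: 0 or 1
offends = rng.random(n) < 0.10          # identical base rate for both groups
arrest_prob = np.where(group == 1, 0.60, 0.30)      # group 1 policed 2x as hard
labelled = offends & (rng.random(n) < arrest_prob)  # biased training labels

# Train on the biased labels, with group membership as a feature.
X = np.column_stack([group, rng.random(n)])  # second column is pure noise
clf = LogisticRegression().fit(X, labelled)

# The weight on `group` comes out clearly positive: the model has learned
# the societal bias baked into the labels, not any difference in behavior.
print("weight on group membership:", clf.coef_[0][0])
print("weight on noise feature:  ", clf.coef_[0][1])
```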
 

Juan29.Zapata

Member
Oct 25, 2017
2,356
Colombia
Incredible how people inside YouTube wanted to fix this shit and were fired because Google only cares about money. Capitalism at its finest.
 

Dingens

Circumventing ban with an alt account
Banned
Oct 26, 2017
2,018
[...]
I think most people have absolutely no idea how much influence these algorithms have on shaping society, as they are critical parts of the largest social institutions humanity has ever had; more people use Facebook than there are followers of Christianity, after all. History, even just the recent history tied to the development of the Internet or the Cambridge Analytica stuff, tells us that whoever can best manipulate the information that's out there ends up on top, and we seem to continually fail to learn from that.

judging by the low turnout in threads like these, it's more like people are not aware of the extent of this, or possibly don't care because technology can't possibly be bad, and if you're critical of something, you're anti-science or whatever bullshit defence they can come up with.

I like to think that the radicalization of many young white men towards far right ideologies comes down to laziness on the part of these tech companies to change their algorithm, because the alternative is nightmarish to say the least.

I'm pretty sure it's the money... and the pressure inherent in a capitalist system. If you don't exploit an audience, someone else will. And without some oversight (regulation) to solve this prisoner's dilemma, morals will always lose.

Algorithms are written by people and have implicit biases bestowed upon them by their creators.

The algorithms themselves may be... but the results are as far removed from human comprehension as it gets by now.
https://www.resetera.com/threads/te...stopia-just-to-make-people-click-on-ads.4640/
 

astroturfing

Member
Nov 1, 2017
6,584
Suomi Finland
I definitely believe this, since every time I watch a video that is even a little bit political, I end up getting some alt-right video recommendations.
hmm, hasn't been my experience at all, and i'm a massive consumer of YouTube media.

i've even intentionally watched Alex Jones and some other loony garbage (with a "know your enemy" attitude), but it still keeps recommending stuff that i actually like and appreciate (whether it's science, politics, comedy etc).

i dunno, i guess i've been served well by their algorithms. i'm pretty addicted to YT, gotta admit. i've wanted a service like this since the first day i used the internet in 1993 or so, a video service that has pretty much everything, and sometimes it's hard to fathom it's actually here.. i can watch live space rocket launches, hockey highlights, political debates from the 60s, evolutionary biology lectures, doge videos and everything i could possibly be interested in, AND i don't even have to actively search for them, they're just served to me automatically. it's amazing!

i dunno, maybe i'm just hopelessly infatuated and it truly is decadent and Google needs to be done away with, sure, why not. i'm not a fan of them having so much power over humanity.. split them up or something. or force them to hand oversight of their operations to a UN committee or something.
 

Luchashaq

Banned
Nov 4, 2017
4,329
Just like in the Slavery thread, this wouldn't matter as much if people were properly educated enough to verify information and identify credible sources. But in a capitalist information age that generates revenue based on clicks, you see bubble viewpoints taking hold of the populace. And all it took was one ignorant populist demagogue to scream "Fake News!", and an entire era of misinformation, confusion, and accepted ignorance was born.

It's very much possible that we may never recover from this.

And lastly, to add: the technocrats were complicit in all of this. Facebook, YouTube, and Twitter. Complicit.

Mainstream news like CNN and the NYT being trash for decades is a big reason why this fake shit is believable. If even multiple cable news channels and the most famous paper in America are so often full of trash, what sources is a random uninformed person supposed to trust?

Calling CNN fake news only works because they have had a shit track record longer than I've been alive.
 

gozu

Banned
Oct 27, 2017
10,442
America
Is this a case of lies being more entertaining than truth, and the algorithm (or its masters) not giving a fuck about the truth, only about more clicks and more ad money, because Key Performance Metrics > everything else?

Is this what's happening?
 

kiguel182

Member
Oct 31, 2017
9,545
I'm entirely convinced machine learning and AI are destroying society. The effects this stuff has are gigantic, and it's not stopping. You can extend this logic to any digital service; it's something that isn't regulated, and the companies behind it give no shits.

We let Silicon Valley control the information we receive, the stuff we buy and even the people we date, and they are concerned with monetizing all of that instead of thinking about the impact their decisions (even the smaller ones) might have.

I realize how hyperbolic it sounds but yeah, I think this is all very dangerous.
 

kiguel182

Member
Oct 31, 2017
9,545
Is this a case of lies being more entertaining than truth, and the algorithm (or its masters) not giving a fuck about the truth, only more clicks and more ad money?

Is this what's happening?

It is, yes. It's all made to keep you on the platform and get ad money, and the algorithm doesn't care what it's spreading. There's no inherent malice behind it as far as we know, but the algorithm couldn't care less whether what you feed it is true, just whether it gets more clicks. The fact that it's not intentionally "evil" somehow makes this much worse.
 

Jtendo '82

Banned
Nov 18, 2017
642
As long as it's the reality they want you consuming, right?

Calling the church evil isn't fiction.
 

Azraes

Member
Oct 28, 2017
997
London
Actually, what's interesting from this is the AI implication. I don't think we look at how these algorithms learn, and yet we're using the same approach to build AI. That is troubling.

Right now these algorithms shape your thoughts and ideas by learning your interests, your patterns, how you function, what your parameters are, by trying to understand what you like. But this approach fails to account for the human condition: every thought and every opinion that comes from outside influences us. How much influence it has depends on how close we are to the person, what our own ideologies are, how convincing the argument is, and similar factors. Most people have a lot of trust in brands and corporations. People implicitly trusted Google, Facebook, etc. The less people trust these companies, the less likely they are to completely believe these things. And that's for average viewers. For the outliers we have confirmation bias: conspiracy theorists, loonies and the like find it easier to trust the platform when the content shown fits their agenda or beliefs. The power of influence is strong when it comes to content, and you can't change that, because it's intrinsic to human nature. We are all influenced to some degree; how much is the trigger.

Now these same companies work on machine learning and learning algorithms. The way they mimic learning is based on our own understanding of it, and to a good degree on our own anecdotal experiences of learning. We aren't so advanced as to understand how thought is created or formed; we just have some understanding of the way we learn and understand things. It's not absolute. So when we do create machine learning, we will be creating learning patterns based on users' learning patterns. Why do most AI chatbots turn into dregs? Because the way we are wired to learn is to remember negative stimuli more strongly, to protect ourselves from them, and because fear is so readily consumed for the adrenaline it gives us.

One of these days we will create a machine based on these learning algorithms that learns from hate/fear/reward input. When we create that machine, and if we give it power, that's when AI starts to fit one of those AI-and-robots-as-pathos storylines. I think we need to rework the ways we think about and write learning algorithms. We are en route to building some really shitty Turing machines that can kill us.
 

gozu

Banned
Oct 27, 2017
10,442
America
It is, yes. It's all made to keep you on the platform and get ad money, and the algorithm doesn't care what it's spreading. There's no inherent malice behind it as far as we know, but the algorithm couldn't care less whether what you feed it is true, just whether it gets more clicks. The fact that it's not intentionally "evil" somehow makes this much worse.
This is why we can't have nice things.
 

Arebours

Member
Oct 27, 2017
2,656
It's a fun experiment: start on a random legitimate video, enable autoplay, and leave it running for a few hours. It always ends up in some despicable shit.
 
Oct 27, 2017
4,695
not surprised by any of this at all - but I'll be damned if "Fiction is outperforming reality" isn't painful to see.

Was there ever an onus on YouTube to be strictly documentarian?

There's a difference between not being documentarian and actively promoting falsehood, particularly when that falsehood is being spread and taken up as "truth".

However you feel about the editorial freedoms of user-generated content, surely you can see the value in not allowing the active promotion of falsehoods, given the broader effects that has on society.
 
Oct 27, 2017
7,756
Not at all surprised. And education cannot defeat this sort of thing, which taps into and exploits cultural foundations and upbringing. Many people who watch Fox News are highly educated but compartmentalize their vocational discipline from the rest of their life and worldview.
 
Oct 25, 2017
10,376
A while ago, when it was discovered that there were Russian YouTube accounts with black people pushing anti-Hillary videos, people wondered why anyone would watch them. This shows they weren't meant to be watched, but to clog up the YouTube recs.
 

Neo C.

Member
Nov 9, 2017
3,040
The algorithm is really awful; it puts you in a bubble every time. And the more YouTube you watch, the more restrictive the bubble becomes.
 

PoppaBK

Member
Oct 27, 2017
2,165
I'd still rather have a blind but stupid algorithm than a manually tuned system overseen by faceless employees who are beholden to no one.
 

Eidan

AVALANCHE
Avenger
Oct 30, 2017
8,688
It's true, and it's comical seeing YouTube attempt to deflect its culpability in spreading misinformation. If you look at anything even remotely political, YouTube's algorithm instantly suggests you go down its right-wing rabbit hole.