Chairmanchuck (另一个我)

Teyvat Traveler
Member
Oct 25, 2017
9,195
China
I've been thinking about this a lot, since I've been on the web for over 20 years now and saw the late-90s/early-00s webpages, when the internet was mostly just for nerds talking about hobbies, with the crass/edgy jokes of Newgrounds, and it personally felt far more relaxed and not really that political.

And then the rise of YouTube, Facebook, Twitter etc. came, and I feel there was a big shift to the right. Conspiracy theories are everywhere, and the older generations learned how to use the net. In my experience, Facebook went from being used mostly by my generation and younger to being used by the older generations, Boomers and Generation X, and I can't help but notice that most of the right's talking points are being amplified through those social media channels.

What would the world look like if those social media platforms actually banned and moderated everything to a "normal", maybe even "centrist" level, or at least did more than what is being done now? You can write the n-word everywhere there; nothing is really being monitored.
 

krazen

Member
Oct 27, 2017
13,389
Gentrified Brooklyn
GUILTY. Radicalized white supremacy was always around, but thanks to social media it went from fringe groups to an accepted political belief that supposedly "deserves debate".

Depending on how the future goes, for all the 'connecting the world' bullshit, I'm ready to say its existence has set us back as a society.
 

Duffking

Member
Oct 27, 2017
5,788
They're about 99% of the reason for the scale of it, honestly, in large part because of the "don't feed the trolls" attitude. We should have been making these spaces as unwelcome as possible for bigots and kicking them out of the house instead of letting them poison the communities. Most far-right extremists these days will have been radicalised by YouTube, Twitter, etc.
 

iRAWRasaurus

Community Resettler
Member
Oct 25, 2017
4,729
Guilty as fuck. It's easy to make supportive echo chambers where people reinforce each other's ideas... hmm.
 

Tamanon

Member
Oct 25, 2017
19,841
Validation is the most important part of ignorant beliefs becoming deeply rooted. So, extremely guilty.
 

oledome

Member
Oct 25, 2017
2,907
Their inaction has allowed the shit to spread; they aren't fulfilling their responsibilities. They're loath to make any move at all, but I think we are gradually moving to a point where they will have to, as the expectation is greater now.
 

Dehnus

Banned
Oct 30, 2017
1,900
Google? VERY GUILTY. YouTube keeps punishing smaller content creators and pushing "NAZI SWEDE MCNAZI FACE" to make more money, while at the same time exposing toddlers to his ideology.

Seriously, that stuff is heinous. I mostly watch engineering videos and the occasional soldering video, plus some left-wing tubers like Radical Reviewer or Thoughtslime, and BAM, there he is again: some Nazi pushed into my suggestions. No matter how many times I report them or say "I don't like this channel/tuber", they keep being pushed! Meanwhile, content creators who talk about Raspberry Pis, carpentry, music I like, homebrew programming hobby projects, engineering, extremely old games (which often have interesting assembly code) and technology? Those don't get suggested to me, even though that's mostly the content I watch.

YouTube/Alphabet/Google is extremely guilty of pushing them!
 

Deleted member 6949

User requested account closure
Banned
Oct 25, 2017
7,786
They did nothing to address the problems with their platforms after 2016. Zuckerpsycho and Dorsey should be in jail.
 
Oct 27, 2017
5,973
Very. They've allowed horrible small voices to be amplified and to disrupt regular political progress. Trolling with bot farms and manipulative messages designed to make people angry and divide them.

The whole thing has emboldened racists the world over and it feels like the world has taken a massive step backwards.

The internet is a tool but also a weapon. Those companies have the power to make sure it is used responsibly and they have failed to do that.
 

Dyno

AVALANCHE
The Fallen
Oct 25, 2017
13,578
They're a gathering place for ideas that would typically have been unacceptable in most circles. They've grown their base to the point that those ideas are no longer unacceptable and are now commonplace in the world. These people are also now vital to those platforms continuing to pull in the numbers they're used to and, by extension, vital to their business. I'd say that makes them extremely guilty.
 
Oct 27, 2017
4,673
They have provided staging grounds, harboured growth and signal-boosted, while doing as little as possible to clean up unless some kind of major PR incident has occurred.
 

Famassu

Member
Oct 27, 2017
9,186
One of the biggest guilty parties, if not the biggest (other than the right-wing shitgibbons themselves). They have a social responsibility to thwart hate speech, but they don't.
 

swift-darius

Member
May 10, 2018
943
the internet, and more specifically fora like youtube, reddit, 4chan, et al are particularly to blame for the radicalisation of a new generation of disaffected young white men. the demographics may differ, but ultimately the "logged on"/socially detached male white 15-30 year old is a huge new market and target audience for conservative and far-right ideology

you see this across multiple spheres. gamergate of course was a huge path for this, but it crops up in a bunch of smaller or other fields too - e.g. paradox's grand strategy games get inundated with reactionaries, irredentists, crusader memes, 1453 blah blah. the refugee crisis here in europe was another huge push for conservative backlashes that gained ground among younger male majority-demographic individuals - likely the same majority demo for this website, which of course goes to show that ultimately the internet is itself as polarised as society has been lately
 

rsfour

Member
Oct 26, 2017
17,000
Right up there with trading child porn. They're one of the worst things about the Internet.
 

spineduke

Moderator
Oct 25, 2017
8,795
What would the world look like if those social media platforms actually banned and moderated everything to a "normal", maybe even "centrist" level, or at least did more than what is being done now? You can write the n-word everywhere there; nothing is really being monitored.

there isn't a "centrist" position to adopt here, when one side thrives on xenophobia and racism. that's the hill conservatism seems to be willing to die on. *shrug*
 
Oct 25, 2017
5,846
I've been thinking about this a lot, since I've been on the web for over 20 years now and saw the late-90s/early-00s webpages, when the internet was mostly just for nerds talking about hobbies, with the crass/edgy jokes of Newgrounds, and it personally felt far more relaxed and not really that political.

And then the rise of YouTube, Facebook, Twitter etc. came, and I feel there was a big shift to the right. Conspiracy theories are everywhere, and the older generations learned how to use the net. In my experience, Facebook went from being used mostly by my generation and younger to being used by the older generations, Boomers and Generation X, and I can't help but notice that most of the right's talking points are being amplified through those social media channels.

What would the world look like if those social media platforms actually banned and moderated everything to a "normal", maybe even "centrist" level, or at least did more than what is being done now? You can write the n-word everywhere there; nothing is really being monitored.

I don't think you can blame this on "the olds" joining. Basically all the web's big nerdy communities were harboring some pretty regressive people; it's just that the cultural discourse wasn't such that this stuff came up often (and it was often disguised with the aforementioned edgy humor).

Things would definitely be better if the sites actually enforced their TOS, although I think some of the blame is excessive. If it wasn't social media they'd still be in their various communities. Social media came along as the web went mainstream, but the web going mainstream without social media would still have caused an explosion in those communities as people realized their beliefs were acceptable somewhere.
 

crimsonECHIDNA

▲ Legend ▲
Member
Oct 25, 2017
17,754
Gatorland
Absolutely guilty. I'd go as far as to say the current state of GAF is like the prime example of what happens when you don't nip that shit in the bud.

Hell, things like Gamergate would have flatlined if it were not for those alt-right youtubers stoking the fires.
 

Holundrian

Member
Oct 25, 2017
9,451
Very guilty. The YouTube algorithm is just awful. I consider myself progressive as heck, and it used to be very frequent for me to get alt-right/"sceptic"/YouTube-scum recommendations. This has gone down drastically since I just abused their shitty system and reported every video like that on my front page. Now it happens extremely rarely. Which is still stupid: how does the algorithm shift so strongly towards pushing those videos in the first place? It's just garbage.
 

chrisPjelly

Avenger
Oct 29, 2017
10,538
Extremely guilty. I STILL get the occasional recommended shithead if I dare watch anything vaguely political or controversial.
 
Feb 24, 2018
5,369
Absolutely guilty. I'd go as far as to say the current state of GAF is like the prime example of what happens when you don't nip that shit in the bud.

Hell, things like Gamergate would have flatlined if it were not for those alt-right youtubers stoking the fires.
Same with GameFAQs. Why did GameSpot (or whoever owns GameSpot) buy it if they weren't going to do anything to improve it and were just going to leave it to get worse and worse?

It does make me wonder what the internet would be like if proper moderation and the like had been in place way sooner.
 

Pagusas

Banned
Oct 25, 2017
2,876
Frisco, Tx
I agree with you up until you started saying they should be moderating stuff... you want to encourage corporations to moderate political messaging to align with your viewpoint? Dangerous. I can understand fact-checking, removing automated bots posting paid-for messages, anti-racist moderation, child safety, anti-violence moderation and all that, but you are entering dangerous territory when you start saying we need to moderate people's political messages. Free will and free thought have to be a thing, especially in a liberal society. People WILL have different opinions than you, and many of them will be stupid, but if we start being the thought police, then the world you are building will be a truly horrible one.
 

Deleted member 2809

User requested account closure
Banned
Oct 25, 2017
25,478
FB/YT/Twitter and more, extremely guilty as they side with the nazis.
I agree with you up until you started saying they should be moderating stuff... you want to encourage corporations to moderate political messaging to align with your viewpoint? Dangerous.
Hate isn't politics or opinions, get the fuck outta here
 
OP
Chairmanchuck (另一个我)

Teyvat Traveler
Member
Oct 25, 2017
9,195
China
I agree with you up until you started saying they should be moderating stuff... you want to encourage corporations to moderate political messaging to align with your viewpoint? Dangerous. I can understand fact-checking, removing automated bots posting paid-for messages, anti-racist moderation, child safety, anti-violence moderation and all that, but you are entering dangerous territory when you start saying we need to moderate people's political messages. Free will and free thought have to be a thing, especially in a liberal society. People WILL have different opinions than you, and many of them will be stupid, but if we start being the thought police, then the world you are building will be a truly horrible one.

I wasn't talking about thought policing, but evidently things like "all n-words must die", saying the n-word, posting fake news, calling for genocide, clear racism, defense of pedo-/hebephilia etc. should go directly into a ban.

Instead you have channels, Facebook groups, Twitter groups etc. dedicated only to that.
 

Deleted member 2474

Account closed at user request
Banned
Oct 25, 2017
4,318
I agree with you up until you started saying they should be moderating stuff... you want to encourage corporations to moderate political messaging to align with your viewpoint? Dangerous.

they should, at the very least, be moderating provably false and dangerous conspiracy shit. facebook has been a breeding ground for antivaxxers, ludicrous covid conspiracies, people threatening to bomb and destroy 5g towers for spreading dangerous radiation, chemtrail nonsense, "crisis actor" conspiracies, etc. overt racism and hate speech should be pretty clear grounds for moderation as well. there's no political belief system grounded in reality that makes any of this shit okay.
 

Kthulhu

Member
Oct 25, 2017
14,670
Extremely. Sure, bigotry and far-right views existed prior to Facebook and YouTube, but no one had profited from or signal-boosted these beliefs to this extent before Facebook and Google.
 

shoptroll

Member
May 29, 2018
3,680
Guilty as all hell. Shitposts, controversial posts, drama posts, etc. all fuel "engagement", which is their lifeblood, given these platforms' actual purpose as an advertising vehicle aimed at their users. When you start removing those elements, the whole house of cards comes toppling down.

These companies want to act like they're "public squares" but they're not. They never have been. The Internet itself is the "public square", social media sites are just one of the taverns adjacent to it.

If someone wants to spout their racist bullshit, it's not that hard to stand up a server somewhere and get a domain name. FB/YT/Twitter don't have to do their job for them by giving them a platform.
 

8byte

Attempted to circumvent ban with alt-account
Banned
Oct 28, 2017
9,880
Kansas
I think it speaks volumes to say we'd be in a considerably different place without them, and I'd like to think that place would have fewer Nazis.