What caused them to reinstate this guy anyway? Was there some kind of statement? Why is he allowed to continue making offensive videos?
Explain to me what value there is in respecting this video's supposed right to exist?
It could be an interesting examination of the game's NPC AI.
Because Keemstar contacted YouTube's head of gaming to bring the guy back.
But why did they fold to Keemstar?
Yeah, but when you bring a white guy to the KKK, nobody makes a big deal about it, double standards, something something ANTIFA, [insert other dumb bullshit here]
The same reason Keemstar is banned from YouTube but allowed to post content on a new channel: he supposedly has nothing to do with the ownership of that channel.
It is wrong and dangerous. I just don't think banning is the best way to approach speech. I'll call that type of speech wrong and dangerous and let everyone know how wrong and dangerous it is.
I don't think calling someone a dumbfuck for being a racist or misogynist is compromising with them. My view of fighting hate is actively shaming it. It doesn't mean letting it go uncontested or simply saying it's a difference of view.
but getting banned from a platform like YouTube - or any social media platform for that matter - isn't a form of censorship. people banned from the platform are free to talk their shit elsewhere online. if a private company doesn't want to give hate speech a platform on their service, removing it isn't censoring it altogether - it just no longer has that specific platform to spread shit on. it shouldn't be equated to government censorship.
I just don't think it should be up to a content holder or government to censor speech, even if it is hateful. In that way, I'm just more of a social libertarian, as with many of my other social views. There are times when laws are blatantly necessary to supplement civil rights against hate, but simply allowing people to speak is something I don't personally believe rises to that level.
Right, let's say it's that and just ignore the big 50K+ spike that started when this story came out... Are you serious?
So why does YouTube need to allow this speech to exist? Why is it wrong if they said "fuck this" and pulled it from their site?
What about this deserves the right to be seen and heard by YouTube's own standards:
I apologise for repeating my questions over and over, but I am really curious about your approach. Now that we've established you're calling hate speech wrong and dangerous, is that the extent of your activity within the marketplace of ideas, or does it go beyond that? You're arguing against de-platforming and attempting to make a case for fighting hate in the marketplace of ideas, but how do you gauge whether your approach is more effective than de-platforming? Do you have any examples in which you put your approach into practice and gained what you yourself would deem favourable results over de-platforming?
How does your approach work in practice?
We should also stop jailing thieves and murderers, and instead fight crime by letting people know how wrong and dangerous theft and murder are.
The ugly truth is that if you're not willing to advocate for something being actually done to curb hate speech, you probably don't give many fucks about hate speech at all.
Speech and non-violent rhetoric, however extreme, shouldn't be lumped in with the other examples you mentioned, since those are direct crimes and violence.
I don't think you have any grounds to assume that about someone without knowing them.
I think people are under the impression I am defending hate speech in particular. More like, I am defending speech, hate included.
I was recommended his feminist punching video by YouTube before this whole thing blew up. I know many other people here were too.
That is not the type of shit I watch. There's no reason it should have been suggested to me.
That's not the point I'm making. According to you, "telling people it's wrong and dangerous" is enough (optimal, even) to combat hate speech, so why wouldn't it be enough to combat theft and murder as well?
My grounds are your statements above: you don't care about hate speech enough to actually do anything about it.
Because hate speech is speech. It is not direct action by a person or persons that requires direct intervention to stop. Hate speech can be internalized in any fashion, through outright rejection or radicalization, and if violence is not the outcome of that, then...
On the contrary, I care about hate speech; I just don't believe automatic banning is the only right way to combat its influence. Thus my defense of it not being banned outright. You seem to think that is the only way to combat it and that anyone who thinks otherwise "doesn't care". That's completely ridiculous.
Your view that deplatforming works isn't ridiculous; it's a reasonable viewpoint. What's ridiculous is that you've generalized about those who don't share it.
I was where you are, but deplatforming seems to work to disperse hubs of trash. There is research on this that gathered data on how users from toxic subreddits (I think a fat-shaming one, among others) behaved when those subreddits were closed, and whether they just took that toxicity with them; the result was full dispersal, with everyone going about things normally. No doubt more research should be done, because I think that study observed the Reddit ecosystem exclusively, but it's still something of value to know that you can make your own ecosystem healthier by removing toxic elements.
So again, you don't care that much about hate speech. "It's not direct action", "no direct intervention is required to stop it", "if violence is not the outcome..." and so on. All I see from you is justification that "it's not that bad" (as long as it's not directly and explicitly advocating violence) and that "we don't need to actually do anything to stop it."
I've made no generalization at all: I'm addressing your specific statements and point of view. You insist no actual action is required to stop hate speech. This is objectively, demonstrably false. The end.
I say every action short of direct banning should be considered, with banning only in certain cases, because protecting speech should be the priority. You don't seem to have accepted my answer and have already mischaracterized my view, so let's just leave it at that.
We're just talking past each other.
When you allow this on your platform, people, young people in particular, don't realize it is wrong, or slowly begin to think it's OK. Not sure what's so difficult about that.
I think you're severely underestimating people's morals.
And you're ignorant of the results of your own suggestions, as well as of the alternative you're ignorantly fighting. Deplatforming isn't censorship. It's not a violation of anyone's speech. It works.
the problem is that when you don't curb and look into hate speech online, the perpetrators might end up killing people. see some of the terrorist attacks that have recently happened in the US. it's nice to say "oh, that guy who attacked that yoga studio had the right to post his misogynist bullshit online!", but for the women he killed it's already too late.
oh, like shirrako?
I've said it multiple times. I'm not arguing on the merits of success or failure of my personal viewpoint in regards to shutting down voices. If you want to ban someone from saying something whatever it may be, obviously banning them from saying it or deplatforming them will do the job. I'm arguing for protecting speech first and foremost.
In this scenario, he also posted violent rhetoric multiple times and subscribed to things he would advocate for doing. In that case, he should have been looked into by the authorities. Banning him likely would not have stopped his rampage, but it never needed to get that far.
Yes, and as I said previously, I believe that Shirrako may have violated the terms of service by indirectly calling for violence against feminists.
It's speech. Awful, horrible, terrible speech. You could even classify it as hate speech but that doesn't make it illegal.
So... you now agree he should, in fact, be deplatformed? *confused*
This user has been suspended. What was this Tweet and who posted it?
Don't remember the name of the account that tweeted it, but it was a screencap of Shirrako's newest video, titled "what happens if you take a black man to the KKK". The thumbnail was a tied-up black man over his character's shoulder with the KKK in the background.
You'd think almost losing your livelihood would be a wake up call to probably not keep doing that shit, but that dipshit goes and kidnaps a Black NPC to give to the KKK, like WTF? Shame he was given a second chance.
I don't think such a thing is possible. Overestimating them, on the other hand...
Edgiest post tonight.
The tweet was still open in my Twitter app because I forgot to close it, so I screencapped it.