Moderation and content promotion, automatic or not, give those in charge responsibility for the content that gets promoted and for the results of that moderation, so, yes.
That you think moderating extremist positions on a private platform is dangerous perfectly encapsulates why I think they are dangerous and at fault for the spread of extremist positions.

I agree with you up until you started saying they should be moderating stuff... you want to encourage corporations to moderate political messaging to align with your viewpoint? Dangerous. I can understand fact checking, removing automated bots posting paid-for messages, I can understand anti-racist moderation, I can understand child safety, anti-violence, and all of that moderation, but you are entering dangerous territory when you start saying we need to moderate people's political messages. Free will and free thought have to be a thing, especially in a liberal society. People WILL have different opinions than you, and many of them will be stupid, but if we start being the thought police, then the world you are building will be a truly horrible one.
I don't think Twitter and Reddit were conceived with hateful, divisive rhetoric in mind, but they're definitely being sustained - at least in part - by it, and they're definitely turning a blind eye for financial reasons.

The only reason they drag their feet or bury their heads when it comes to being progressive and deplatforming is that they are very aware of the potential loss of money. They are profiteers of hate. They're guilty.
I think a lot of this stuff is just exposing our poor education systems. There's no inquiry. No critical thinking. No examining the other side critically. We're basically sheep to be manipulated. Not only that but media literacy is very low. Many folks believe anything just because it is online, from old school chain emails to persuasive Fox News personalities.
YouTube and FB are only exposing that fault. FB is worse, since they don't believe in moderating political speech even if it is false.
Home run. I posted above about how technology does this every time, and your post is sort of what I mean. We get a little more shocked each time we can easily see how distorted/manipulated our day-to-day reality is.
I think this is what really frustrates me whenever anybody throws out how they shouldn't be moderating their platforms. Making suggestions and pushing content to you is disproportionately elevating certain voices. It is a choice they are making. Just because it was an algorithm that pushed it means nothing: the algorithm was made by humans, and it assigns priority based on criteria set by humans.

I recently went to YouTube without being signed in, and one of the first things YouTube recommended to my "fresh" account was a Ben Shapiro video. It's blatant.