
Baji Boxer

Chicken Chaser
Member
Oct 27, 2017
11,380
The content wasn't the problem. It was people freeze-framing a shot that looked sexually suggestive and timestamping it in the comments.
I don't know; I've seen the occasional stretching vid pop up in recommendations with a pretty questionable thumbnail preview and over a million views.
 

oneils

Member
Oct 25, 2017
3,099
Ottawa Canada
The content wasn't the problem. It was people freeze-framing a shot that looked sexually suggestive and timestamping it in the comments.

Ehhh, the people uploading these videos seem to know exactly what they are doing and how to get hundreds of thousands of views for a "popsicle challenge." It does make me wonder what is happening behind the scenes.

Maybe there are channels ripping this stuff from other innocuous channels and curating it together? I'm not sure, but there does seem to be more to this than viewers taking advantage of innocent content.
 

Deleted member 19844

User requested account closure
Banned
Oct 28, 2017
3,500
United States
I don't know; I've seen the occasional stretching vid pop up in recommendations with a pretty questionable thumbnail preview and over a million views.
Yeah it was a combination of non-exploitative videos with creepy comments and actual exploitative videos. From what I can see now, the non-exploitative videos have had comments disabled and the exploitative videos / channels have been shut down.
 

ItIsOkBro

Happy New Year!!
The Fallen
Oct 25, 2017
9,515
YouTube's recommendation algorithm is broken. It's too weighted on what others have watched as opposed to building a profile about the watcher.

This guy's brand new account test demonstrates that. He:
1) Searches for a bikini haul video
2) Watches a bikini haul video
3) Watches a recommended bikini haul video
4) Watches a recommended young girl gymnastics video
5) It's at this point his recommendations are entirely videos of young girls.

I'm sure that due to all the pedophiles out there the algorithm has learned that people who clicked that gymnastics video click other young girl videos.

My problem is at this point YouTube knows very little about this dude and in lieu of that it recommends the pedophile wormhole others have gone down.

First of all, at step 3, why is a young girl gymnastics video even recommended? At this point all it knows is that he likes adult women's bikini haul videos.

OK, maybe the algorithm is trying to seed in some other types of videos to build a profile; otherwise we'd only ever get recommended one type of video.

EXCEPT NOPE! He clicks the gymnastics video, and now apparently YouTube thinks it has all it needs to know to recommend an endless stream of videos of young girls. But all it knows is that he's watched 2 bikini haul videos and bypassed a third bikini haul video to watch the gymnastics video. Apparently those 2 bikini videos and the search query don't even matter anymore. Well, they should.
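The failure mode described above reads like plain item-to-item collaborative filtering: rank candidates by how often other viewers co-watched them with the most recent video, and ignore the rest of this viewer's history. A toy sketch of that idea (all video names and watch histories here are made up, and YouTube's real system is obviously far more complex than this):

```python
from collections import Counter, defaultdict

# Watch histories of *other* users (entirely hypothetical data).
histories = [
    ["bikini_haul_1", "bikini_haul_2", "bikini_haul_3"],
    ["bikini_haul_2", "gymnastics_1", "gymnastics_2"],
    ["gymnastics_1", "gymnastics_2", "gymnastics_3"],
    ["gymnastics_1", "gymnastics_3", "gymnastics_4"],
]

# Count how often each pair of videos was watched by the same user.
co_watch = defaultdict(Counter)
for history in histories:
    for a in history:
        for b in history:
            if a != b:
                co_watch[a][b] += 1

def recommend(last_watched, n=3):
    """Rank candidates purely by co-watch counts with the most recent
    video, ignoring everything else the viewer did before -- the flaw
    the post complains about."""
    return [video for video, _ in co_watch[last_watched].most_common(n)]

# A brand-new account watches two bikini hauls, then one gymnastics
# video. The next recommendations key off that single last click:
after_haul = recommend("bikini_haul_2")  # still mostly haul videos
after_gym = recommend("gymnastics_1")    # now mostly gymnastics
```

Once one click on the gymnastics video happens, co-watch counts from other users' sessions dominate, and the earlier bikini-haul history and search query contribute nothing to the ranking.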
 

Deleted member 8257

Oct 26, 2017
24,586
I'm still trying to grasp what's going on. There isn't CP on YouTube is there? Just gymnastics of young girls? If so, how could they moderate that?
 

FeD

Member
Oct 25, 2017
4,275
YouTube's recommendation algorithm is broken. It's too weighted on what others have watched as opposed to building a profile about the watcher.

This guy's brand new account test demonstrates that. He:
1) Searches for a bikini haul video
2) Watches a bikini haul video
3) Watches a recommended bikini haul video
4) Watches a recommended young girl gymnastics video
5) It's at this point his recommendations are entirely videos of young girls.

I'm sure that due to all the pedophiles out there the algorithm has learned that people who clicked that gymnastics video click other young girl videos.

My problem is at this point YouTube knows very little about this dude and in lieu of that it recommends the pedophile wormhole others have gone down.

First of all, at step 3, why is a young girl gymnastics video even recommended? At this point all it knows is that he likes adult women's bikini haul videos.

OK, maybe the algorithm is trying to seed in some other types of videos to build a profile; otherwise we'd only ever get recommended one type of video.

EXCEPT NOPE! He clicks the gymnastics video, and now apparently YouTube thinks it has all it needs to know to recommend an endless stream of videos of young girls. But all it knows is that he's watched 2 bikini haul videos and bypassed a third bikini haul video to watch the gymnastics video. Apparently those 2 bikini videos and the search query don't even matter anymore. Well, they should.

Broken or working exactly as intended?
 

ascagnel

Member
Mar 29, 2018
2,212
WHAT IN THE FUCK. Christ, USGOV needs to make regulations for these companies to follow, because they have shown that they will not police themselves.

They did, with FOSTA and SESTA -- laws that did a lot to make it look like they're doing something about child sex trafficking, but don't actually do anything about it.

You can thank Facebook for that one.
 

Barrel Cannon

The Fallen
Oct 25, 2017
9,297
Possible suggestion: user accounts shouldn't be allowed to comment without being linked to some sort of identification system using a government ID. That way users won't say the stupid shit they do without repercussions.

YouTube's value would probably tank, though.
 

Baji Boxer

Chicken Chaser
Member
Oct 27, 2017
11,380
Yeah it was a combination of non-exploitative videos with creepy comments and actual exploitative videos. From what I can see now, the non-exploitative videos have had comments disabled and the exploitative videos / channels have been shut down.
Good to know, though what I was talking about was from just before this topic was made. Of course it'll take time and more reports to track down a lot of this stuff.
 
Oct 26, 2017
12,125
Possible suggestion: user accounts shouldn't be allowed to comment without being linked to some sort of identification system using a government ID. That way users won't say the stupid shit they do without repercussions.

YouTube's value would probably tank, though.
Years ago, I thought it was a crazy idea. John Kerry and Hillary Clinton were suggesting a "real ID" to use the net, but I'm all for that shit now.
There's too much shit happening online that people are getting away with. The net isn't what it was in the '90s or early 2000s; it's way different, way more dangerous.
 

Deleted member 1476

User requested account closure
Banned
Oct 25, 2017
10,449
Twitter and Facebook pretty much proved that people will still say and do shit even if their real name is there.
 

Barrel Cannon

The Fallen
Oct 25, 2017
9,297
Years ago, I thought it was a crazy idea. John Kerry and Hillary Clinton were suggesting a "real ID" to use the net, but I'm all for that shit now.
There's too much shit happening online that people are getting away with. The net isn't what it was in the '90s or early 2000s; it's way different, way more dangerous.
Same boat here. We practically give away the majority of our information now, purely for advertising reasons. We might as well give information to identify and sack as much scum as we can
 
Oct 26, 2017
12,125
Same boat here. We practically give away the majority of our information now, purely for advertising reasons. We might as well give information to identify and sack as much scum as we can
Also, with a "real ID" law I'd demand a sweeping "online privacy" bill that makes it illegal to sell/hold/use metadata based around someone's online habits.

That would also go a long way in affecting the various hostile countries that are using that metadata for malicious ends.
 

Deadlast

Member
Oct 27, 2017
572
Now is the time for someone else to rise up and take on YouTube. A hub of videos. Maybe someone who could focus on non-adult content.
 

Patapuf

Member
Oct 26, 2017
6,416
Years ago, I thought it was a crazy idea. John Kerry and Hillary Clinton were suggesting a "real ID" to use the net, but I'm all for that shit now.
There's too much shit happening online that people are getting away with. The net isn't what it was in the '90s or early 2000s; it's way different, way more dangerous.

Now imagine all that bad shit still happening, but now hackers and bad actors have an easy way to get at your real ID. Hell, even "good" actors would have access to your personal information whether you want them to or not.

We need solutions for moderation, but forcing everyone onto a real-ID system has many really dystopian consequences.

We need to watch out not to throw the baby out with the bathwater. There are limits to how much monitoring people actually increases security.
 
Last edited:

Deleted member 2809

User requested account closure
Banned
Oct 25, 2017
25,478
Years ago, I thought it was a crazy idea. John Kerry and Hillary Clinton were suggesting a "real ID" to use the net, but I'm all for that shit now.
There's too much shit happening online that people are getting away with. The net isn't what it was in the '90s or early 2000s; it's way different, way more dangerous.
Do you want to be China? Because that's how you become China.
YouTube can do better. They're just being greedy as fuck.
 
Oct 26, 2017
12,125
Now imagine all that bad shit still happening, but now hackers and bad actors have an easy way to get at your real ID. Hell, even "good" actors would have access to your personal information whether you want them to or not.

We need solutions for moderation, but forcing everyone onto a real-ID system has many really dystopian consequences.
I mean, people's identities online aren't difficult to find. Social engineering takes, at most, like 15 minutes.

but I understand your statement.

Perhaps a new amendment: a right to online privacy/personal privacy, dealing with metadata and your info being shared/sold.
 

Patapuf

Member
Oct 26, 2017
6,416
I mean, people's identities online aren't difficult to find. Social engineering takes, at most, like 15 minutes.

but I understand your statement.

Perhaps a new amendment: a right to online privacy/personal privacy, dealing with metadata and your info being shared/sold.

I'm just saying, there are limits to how much monitoring people actually increases security.

If everyone has an "ID" they have to use, it will be hackable, and it can be abused by multiple governments all over the world (it's not like this would be applicable only to the US, and even the US..). Discrimination by institutions also becomes even easier with that type of system.

And the worst part is that knowing everyone's online IDs still won't get rid of perverts. They'll learn and adapt.
 

Rendering...

Member
Oct 30, 2017
19,089
Years ago, I thought it was a crazy idea. John Kerry and Hillary Clinton were suggesting a "real ID" to use the net, but I'm all for that shit now.
There's too much shit happening online that people are getting away with. The net isn't what it was in the '90s or early 2000s; it's way different, way more dangerous.
I too wish to live in a dystopian time when merely admitting to a desire for privacy and anonymity makes you a target for extreme suspicion and distrust.

What is it you privacy lovers have to hide exactly?!
 

Blackage

Banned
Oct 27, 2017
1,182
That original video from Matt showing the method had ads from Disney, Nestle, and Epic Games. If they weren't in the video, I doubt they would have pulled ads.

I guess Disney, Nestle, and Fortnite are things young girls who are looking up gymnastics and bikinis might be interested in, so the algorithm kinda played itself, which makes this do more damage to YouTube.
 

Mockerre

Story Director
Verified
Oct 30, 2017
630
Well, I'm afraid there is only one way to stop this and other worrying Internet trends like the rise of the alt-right, fake media, stupid and harmful theories and conspiracies gaining traction and so on.

As Dawkins put it in The Selfish Gene in the 70s, it's not the best idea that prevails, it's the one that is the most memetic, i.e. is the most likely to replicate. Sensational news, outrageous theories, banned content, all have a better ability to replicate than well researched facts and data, which are slower to acquire and put into the world and are also, generally, more dull than fantasy.

I know none of you will like it. But here it is: content creator real life identification. You want to post something on the internet in places of high traffic like YouTube, Facebook, news sites etc., you own it by signing it with your real credentials - so there is, again, accountability for words (and images). In the case of juggernauts like YouTube, there probably needs to also be a fee to post a video - the proceeds from this should pay for an army of moderators.

I too wish to live in a dystopian time when merely admitting to a desire for privacy and anonymity makes you a target for extreme suspicion and distrust.
What is it you privacy lovers have to hide exactly?!

You got it backwards. The Internet needs to have as much privacy control as real life does. Because as it is, the Internet is just a game of shadows, with total anonymity for those who seek it. And those who seek TOTAL anonymity are more often than not socially disruptive forces.

It's a case of 'this is why we can't have nice things'.
 

OrdinaryPrime

Self-requested ban
Banned
Oct 27, 2017
11,042
Well, I'm afraid there is only one way to stop this and other worrying Internet trends like the rise of the alt-right, fake media, stupid and harmful theories and conspiracies gaining traction and so on.

As Dawkins put it in The Selfish Gene in the 70s, it's not the best idea that prevails, it's the one that is the most memetic, i.e. is the most likely to replicate. Sensational news, outrageous theories, banned content, all have a better ability to replicate than well researched facts and data, which are slower to acquire and put into the world and are also, generally, more dull than fantasy.

I know none of you will like it. But here it is: content creator real life identification. You want to post something on the internet in places of high traffic like YouTube, Facebook, news sites etc., you own it by signing it with your real credentials - so there is, again, accountability for words (and images). In the case of juggernauts like YouTube, there probably needs to also be a fee to post a video - the proceeds from this should pay for an army of moderators.

Are you talking about using an in person ID to validate your account on these sites? I believe Facebook already does this for new accounts: https://www.facebook.com/help/159096464162185?helpref=faq_content. I'm not sure when they started but pretty sure this issue is still prevalent across its platform.
 

Mockerre

Story Director
Verified
Oct 30, 2017
630
Are you talking about using an in person ID to validate your account on these sites? I believe Facebook already does this for new accounts: https://www.facebook.com/help/159096464162185?helpref=faq_content. I'm not sure when they started but pretty sure this issue is still prevalent across its platform.

Facebook enforced this in 2017. There are 10+ years of earlier users on there who were never asked for it. Also, the problem is not individual users; the problems are anonymous sites, groups, and aggregators that propagate false information, false news, misleading theories, and memes without claiming individual authorship. This creates a space where any idea, no matter how stupid or harmful, can be put into circulation with no accountability. If we don't start to attach qualitative value to the information we put out, the information creep will drown us in a boundless sea of lies and false flags.

The Internet at its inception became a tool to propagate freedom, but this unchecked freedom created perfect conditions for radical disruptive elements, like the alt-right, who continue to weaponize this freedom against the principles of freedom itself.
 

Camwi

Banned
Oct 27, 2017
6,375
Well, I'm afraid there is only one way to stop this and other worrying Internet trends like the rise of the alt-right, fake media, stupid and harmful theories and conspiracies gaining traction and so on.

As Dawkins put it in The Selfish Gene in the 70s, it's not the best idea that prevails, it's the one that is the most memetic, i.e. is the most likely to replicate. Sensational news, outrageous theories, banned content, all have a better ability to replicate than well researched facts and data, which are slower to acquire and put into the world and are also, generally, more dull than fantasy.

I know none of you will like it. But here it is: content creator real life identification. You want to post something on the internet in places of high traffic like YouTube, Facebook, news sites etc., you own it by signing it with your real credentials - so there is, again, accountability for words (and images). In the case of juggernauts like YouTube, there probably needs to also be a fee to post a video - the proceeds from this should pay for an army of moderators.



You got it backwards. The Internet needs to have as much privacy control as real life does. Because as it is, the Internet is just a game of shadows, with total anonymity for those who seek it. And those who seek TOTAL anonymity are more often than not socially disruptive forces.

It's a case of 'this is why we can't have nice things'.
I'd say that generally doesn't work on something like Facebook. I remember a story about some shithead flying the Nazi flag in a Wisconsin town, and people (using their actual names) were defending him, including saying shit like "who are you to say that his beliefs are wrong". Blew my fucking mind.

That being said, I guess it's possible that it might stop most people from doing pedo shit, but who knows.
 

Mockerre

Story Director
Verified
Oct 30, 2017
630
I'd say that generally doesn't work on something like Facebook. I remember a story about some shithead flying the Nazi flag in a Wisconsin town, and people (using their actual names) were defending him, including saying shit like "who are you to say that his beliefs are wrong". Blew my fucking mind.

That being said, I guess it's possible that it might stop most people from doing pedo shit, but who knows.

This may be a local issue, in my country displaying Nazi iconography is a crime and doesn't fall under freedom of speech.
 

GameShrink

Banned
Oct 29, 2017
2,680
The videos themselves are fine as long as there isn't anything explicit in them, but the comments section in general has needed to go for a long time.
You shouldn't be able to communicate with people who post on YouTube unless they explicitly invite feedback through a poll.
 

Deleted member 1476

User requested account closure
Banned
Oct 25, 2017
10,449
It will never cease to amuse me how easily people choose the most dystopian ways to solve problems, without batting an eye.
 

ShyMel

Moderator
Oct 31, 2017
3,483
Along with the timestamp comments, the comments on several of the videos shown in that guy's video were creepy to the max.
 

Lulu

Saw the truth behind the copied door
Banned
Oct 25, 2017
26,680
Can someone link the video? I tried searching for it and got a bunch of click bait meltdown videos cause YT is doomed.
 

BlueTsunami

Member
Oct 29, 2017
8,510


From December

The dude made a video talking about an ASMR girl being exploited into doing sexually suggestive things, and YT took down HIS video.

Original video
 

Dreamwriter

Member
Oct 27, 2017
7,461
I'm still trying to grasp what's going on. There isn't CP on YouTube is there? Just gymnastics of young girls? If so, how could they moderate that?
It was a combination of three things. First, pieces of shit were re-uploading those videos with the thumbnail being of a sexually suggestive pose found by pausing the video at just the right moment. Second, pieces of shit were posting in the comments of both those and the original videos with timestamps of other sexually suggestive poses. The videos themselves are fine, but most videos of more than just talking can be paused at bad times (and even talking videos can be too).

The third thing is Google's algorithm. Say you stumble on one of the videos uploaded by a piece of shit - suddenly you are going to be recommended more videos like that. Click on one, you get more. Suddenly your suggested videos are filled with thumbnails of little girls in sexually suggestive poses.

It's horrible.
 

Pomerlaw

Erarboreal
Banned
Feb 25, 2018
8,536
Exactly. I absolutely love being able to drop positive comments to the amateur musicians I follow. Especially words of encouragement to the adult beginners vlogging their experience learning violin.

As an amateur musician...
Positive comments on YouTube are OK, but you can also email the artist. Better yet, you can buy some of their stuff or share it with your friends.

All things considered the world would be a better place without youtube comments. I say kill them with fire.
 

Kreed

The Negro Historian
Member
Oct 25, 2017
5,109
It was a combination of three things. First, pieces of shit were re-uploading those videos with the thumbnail being of a sexually suggestive pose found by pausing the video at just the right moment. Second, pieces of shit were posting in the comments of both those and the original videos with timestamps of other sexually suggestive poses. The videos themselves are fine, but most videos of more than just talking can be paused at bad times (and even talking videos can be too).

The third thing is Google's algorithm. Say you stumble on one of the videos uploaded by a piece of shit - suddenly you are going to be recommended more videos like that. Click on one, you get more. Suddenly your suggested videos are filled with thumbnails of little girls in sexually suggestive poses.

It's horrible.

One fix to this issue would be Google preventing "any" video from being recommended by the algorithm, and only videos "approved" by Google staff being shown in suggested videos. Users who wanted their videos in these suggested lists could submit them to Google for review. This would also help with Russian trolling, conspiracy videos, and other nonsense on YouTube.
 

mute

▲ Legend ▲
Member
Oct 25, 2017
25,097
Nixing the comments wouldn't help (as welcome as it would be). People would just use other forums/platforms to do the same thing and share links. Maybe advertisers would be happier with that though.
 

Deleted member 8257

Oct 26, 2017
24,586
It was a combination of three things. First, pieces of shit were re-uploading those videos with the thumbnail being of a sexually suggestive pose found by pausing the video at just the right moment. Second, pieces of shit were posting in the comments of both those and the original videos with timestamps of other sexually suggestive poses. The videos themselves are fine, but most videos of more than just talking can be paused at bad times (and even talking videos can be too).

The third thing is Google's algorithm. Say you stumble on one of the videos uploaded by a piece of shit - suddenly you are going to be recommended more videos like that. Click on one, you get more. Suddenly your suggested videos are filled with thumbnails of little girls in sexually suggestive poses.

It's horrible.
My goodness gracious. This is abhorrent. But that's the thing with content creators right? How do you moderate something like that? You can't ban the gymnastics videos because some lowlife found a creepshot in it.
 

Kthulhu

Member
Oct 25, 2017
14,670
Years ago, I thought it was a crazy idea. John Kerry and Hillary Clinton were suggesting a "real ID" to use the net, but I'm all for that shit now.
There's too much shit happening online that people are getting away with. The net isn't what it was in the '90s or early 2000s; it's way different, way more dangerous.

Nah. That's just opening the door for Chinese level spying.

Do you think groups like BLM and Everytown would exist if such a system were to be put in place?

Governments are already fully capable of tracking these people down. They aren't exactly trying that hard to hide.
 

TheAbsolution

Member
Oct 25, 2017
6,391
Atlanta, GA
One fix to this issue would be Google preventing "any" video from being recommended by the algorithm, and only videos "approved" by Google staff being shown in suggested videos. Users who wanted their videos in these suggested lists could submit them to Google for review. This would also help with Russian trolling, conspiracy videos, and other nonsense on YouTube.
This suggestion sounds nice, but it's simply not possible at the size and scale of YouTube. On top of doing this for previous content, of which there are petabytes upon petabytes, you'd have to do it for new content, of which more than a week's worth of video is uploaded every minute. It's just not feasible.
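The scale point above holds up against YouTube's own public numbers: the company has cited ballpark figures of roughly 400-500 hours of video uploaded per minute around 2017-2019. A quick back-of-the-envelope (the 2x review speed and 8-hour shift below are illustrative assumptions, not anything YouTube has published):

```python
# Back-of-the-envelope check on human review at YouTube scale.
# Upload rate: YouTube's publicly cited ballpark of roughly
# 400-500 hours of video per minute (circa 2017-2019).
hours_uploaded_per_minute = 400  # conservative end of the range

# How much footage arrives every minute, expressed in days of video.
days_of_video_per_minute = hours_uploaded_per_minute / 24  # ~16.7 days

# Suppose one reviewer screens video at 2x speed for 8 hours a day
# (assumed numbers, for illustration only).
hours_screened_per_reviewer_day = 8 * 2

# Total video arriving per day vs. what one reviewer can clear.
uploaded_hours_per_day = hours_uploaded_per_minute * 60 * 24
reviewers_needed = uploaded_hours_per_day / hours_screened_per_reviewer_day
# On these assumptions: ~36,000 full-time reviewers just to keep up,
# before counting the existing backlog.
```

So even at the conservative end of the range, "more than a week's worth every minute" understates it: it's over two weeks of footage per minute.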
 
Oct 25, 2017
7,510
Nah. That's just opening the door for Chinese level spying.

Do you think groups like BLM and Everytown would exist if such a system were to be put in place?

Governments are already fully capable of tracking these people down. They aren't exactly trying that hard to hide.
Yeah, some are not thinking of the implications of such a system.
Reactionary shit like the ID suggestion is absolutely ridiculous. It wouldn't help.
 

Kreed

The Negro Historian
Member
Oct 25, 2017
5,109
This suggestion sounds nice, but it's simply not possible at the size and scale of YouTube. On top of doing this for previous content, of which there are petabytes upon petabytes, you'd have to do it for new content, of which more than a week's worth of video is uploaded every minute. It's just not feasible.

Why do recommended videos need to be immediate/quickly displayed anyway? It should take time.
 

Aaron

I’m seeing double here!
Member
Oct 25, 2017
18,077
Minneapolis
Nixing the comments wouldn't help (as welcome as it would be). People would just use other forums/platforms to do the same thing and share links. Maybe advertisers would be happier with that though.
YouTube's job isn't to police the whole Internet, but they should at least have their own house in order.

Also I'd argue the notion that it wouldn't help - you're right, this stuff would spread elsewhere, but it's the same as why de-platforming works. When you deny people a convenient place to congregate and spread their views (or in this case, child pornography links) it has a much lower chance of spreading.
 
Oct 27, 2017
2,053
If you can't manage and maintain a giant collection of videos and prevent child exploitation from being hosted on your site, that site should probably no longer exist.