I fail to see how this can be protected under free speech of any sort. This is flat out lying, deceit and abuse.
Now, it goes without saying that most people won't give a hoot.
Photoshop has been available for how long now? You still can't make a perfect doctored image that can beat analysis, and video editing is an order of magnitude harder to do without obvious errors. It'd be painstaking as all hell and it still wouldn't be perfect.
> If it was explicitly made clear it was satire or parody, it might be protected.

Can be duplicated and edited out. The "watermark" would have to be encrypted into the images somehow.
"Let men tremble to win the hand of woman, unless they win along with it the utmost passion of her heart! Else it may be their miserable fortune, when some mightier touch than their own may have awakened all her sensibilities, to be reproached even for the calm content, the marble image of happiness, which they will have imposed upon her as the warm reality."
― Nathaniel Hawthorne, The Scarlet Letter
"It was as if she had been made afresh out of new elements, and must perforce be permitted to live her own life and be a law unto herself without her eccentricities being reckoned to her for a crime."
― Nathaniel Hawthorne, The Scarlet Letter
> I feel like this tech is something we've talked about before. Either here or in the twilight days of GAF.

We did, and we pretty much all said this would be weaponized eventually as soon as it was revealed.
> No, most people won't care - and if you search someone, and the first thing that comes up is someone talking about a smut video of them, or the actual video, and the refutation of it comes later in the search, you're not even getting there. That's the problem. You can get stuff taken down, but it just comes back.

You're right, I'm not dismissing that. I'm just saying: keep informed and let others know; there are ways to combat this.
Honestly this technology should be banned by law. I have never seen software so obviously harmful relative to its merits like this shit. It must not progress any further than it is doing now.
> Thanks for using "some men," but be careful, you might catch heat for not saying "all men."

And you're gonna catch heat for saying "females."
Anyway, that's some crazy stuff, especially when they use it to target females, or any other individual for that matter. Also, no way should it be protected by the First Amendment; claiming someone did something they didn't could fuck up their image.
> While I agree with you about the harmful effects, I don't see how it could be banned by law, at least in the United States.

Maybe the software couldn't, but the usage could be fineable or punishable by jail time.
> This is like trying to stop the internet. You can slow it down, sure, but others will research it instead.

Yeah, you're right, I just wrote that off the cuff. I haven't applied any serious thought to it.
You really tone policing after telling a suicidal poster to kick rocks because you didn't like his post history?
> Maybe the software couldn't, but the usage could be fineable or punishable by jail time.

It's probably going to go hand-in-hand with VR and AR.
I helped my friend deepfake his wife, and yeah, it's scary how good off-the-shelf tools are getting these days.
Wait, don't call the FBI, it's not like that (and yeah, that was clickbaity, sorry): he had a video of her drunk doing the "I'm funny how?" bit from Goodfellas, and I helped him put her face on Joe Pesci as a surprise for her birthday.
I'm a VR pessimist but an AR optimist, and neither will have the potential for social harm that this will. VR and AR have very obvious "good" applications, while every application I can think of for this ranges from "crass commercialism" to "character assassination". VR and AR both require enough hardware that they can simply be legislated and regulated like any other physical product. Deepfakes threaten the veracity of information itself and what "truth" means in society.
> I helped my friend deepfake his wife, and yeah, it's scary how good off-the-shelf tools are getting these days.

Too late, the cops are already on their way.
> That's another thing: there are going to be a lot of reasons for people to WANT to use deepfakes, putting their own faces in movies and whatnot for a laugh, and to share the results.

The optimistic take is that people will just stop caring. I mean, people were photoshopping celebrities onto nudes decades before we even had Photoshop, and on the whole it didn't end up being a huge deal. But considering the general atmosphere of the internet and the proliferation of bad-faith actors, vigilance is merited, even if I don't personally know what should be done about any of this.
I see where you're going. Point taken... for now.
> Maybe the software couldn't, but the usage could be fineable or punishable by jail time.

This. When deepfakes became a thing, there were some pretty interesting things done with the tech unrelated to porn and revenge tactics. While I don't think the "fun" deepfakes are worth the risk of having it around, the cat's out of the bag already. It'd be great if machine learning could ID fake videos on upload and report the IP and user information to the local authorities. You wouldn't be able to stop them from popping up in places, but at least anything meant to defame would be kicked to the corners of the internet. Get YouTube, Pornhub, and a few other sites on board, or fine the sites that don't comply, and you could curb a lot of this, with people caught using it for malicious purposes being charged with a crime.
> This. When deepfakes became a thing, there were some pretty interesting things done with the tech unrelated to porn and revenge tactics. While I don't think the "fun" deepfakes are worth the risk of having it around, the cat's out of the bag already. It'd be great if machine learning could ID fake videos on upload and report the IP and user information to the local authorities. You wouldn't be able to stop them from popping up in places, but at least anything meant to defame would be kicked to the corners of the internet. Get YouTube, Pornhub, and a few other sites on board, or fine the sites that don't comply, and you could curb a lot of this, with people caught using it for malicious purposes being charged with a crime.

IMO, this is one of the only ways forward in terms of deterrents.
> You really tone policing after telling a suicidal poster to kick rocks because you didn't like his post history?

First off, I'd appreciate you not bringing up cross-thread drama; if you have an issue with me or my posts, you are free to use the report feature or the block feature. Secondly, maybe take some time to talk to that user and ask about the conversation we had outside of that thread through PMs before running your mouth without all the facts.
Tone policing (also tone trolling, tone argument, and tone fallacy) is an ad hominem and anti-debate appeal based on the genetic fallacy. It attempts to detract from the validity of a statement by attacking the tone in which it was presented rather than the message itself.
If it's a task that humans don't know how to do, raw computation power can only go so far, especially since you would need examples for training (unless you could somehow work backwards from a real image).
> Then the crime of defamation would be in those that are reproducing it for purposes of defamation, not the creator itself? I dunno... Say I write a parody fake-news story on a parody website with a parody disclaimer, and then someone copies it and spreads it as real; I don't think I can be held liable for defamation.

You wouldn't be, yes, if you've encrypted something into the video to prove "I made this, yes, but it was satirical," like one of those "I do not own anything here, all licenses are the property of their blah blah" disclaimers on YouTube. But it has to be done in the images themselves rather than the metadata, because of video capture technology.
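The "encrypted into the images themselves" idea can be sketched with least-significant-bit embedding, the simplest form of image steganography. This is a toy illustration only (the function names and the bare pixel list are invented for the example); a real forensic watermark would need to survive re-encoding and be cryptographically signed, and plain LSB data is wiped by exactly the kind of duplication and editing mentioned earlier in the thread.

```python
# Toy sketch: hide a short disclaimer in the least-significant bits of
# pixel values, so the mark lives in the pixel data rather than in
# strippable metadata. Not robust: any re-compression destroys it.

def embed_lsb(pixels, message):
    """Write the message's bits into the LSB of successive pixel values."""
    bits = [(byte >> i) & 1 for byte in message.encode() for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for message")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear LSB, then set it to the bit
    return out

def extract_lsb(pixels, length):
    """Read `length` bytes back out of the LSBs."""
    data = bytearray()
    for byte_i in range(length):
        value = 0
        for bit_i in range(8):
            value |= (pixels[byte_i * 8 + bit_i] & 1) << bit_i
        data.append(value)
    return data.decode()

pixels = [128] * 1024          # stand-in for one row of grayscale pixels
marked = embed_lsb(pixels, "SATIRE")
print(extract_lsb(marked, 6))  # -> SATIRE
```

The round trip works on untouched pixels; the catch raised above is that copying the frames through any lossy re-encode clobbers these low bits, which is why a serious scheme needs redundancy and signing on top.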
These dudes are scumbags. Should be just as punishable as posting genuine revenge porn, if not more so.
Grammatically (setting aside the debate around its offensiveness), I don't think "females" is even correct. From my very cursory research, the only standard usage of the word "female" as a noun is to refer to certain animals. So I feel like it should be obvious why it's demeaning.
> I hope this kind of targeted assassination of specific women doesn't end up demonizing porn in general.

Lol, yeah, that's clearly the worst possible result.
> Well, tbh, I never knew calling someone "a female" was demeaning. I just did a quick Google search and saw an article on why it could be demeaning, but I've been saying "women" or "female," "male," etc., depending on which one I think is grammatically correct, and I never meant to offend anyone. I even see the word being used often on Instagram and Twitter by both males and females, but I'll edit my post. Gotta be careful these days; everyone takes things too personally or is always looking for something negative, even when the message wasn't intended to be negative.

You're looking at it in a bit of a weird way, I think.
Well, tbh, I never knew calling someone "a female" was demeaning. I just did a quick Google search and saw an article on why it could be demeaning, but I've been saying "women" or "female," "male," etc., depending on which one I think is grammatically correct, and I never meant to offend anyone.
I even see the word being used often on Instagram and Twitter by both males and females, but I'll edit my post. Gotta be careful these days; everyone takes things too personally or is always looking for something negative, even when the message wasn't intended to be negative.
> You're looking at it in a bit of a weird way, I think.

Like I said, this is the first time I've ever heard the word called sexist, but I guess context matters, and I can see why people who think the word is sexist would have read my post that way, so I went back and edited it.
Look at it this way: if you're gonna use "females" like this in your speech or writing, people will assume you're sexist, and if you don't want people to think that (and why would you?), the easiest thing to do is just not use it like that.
I don't think it's a huge deal.
Dat YouTube-level apology.
> If it's a task that humans don't know how to do, raw computation power can only go so far, especially since you would need examples for training (unless you could somehow work backwards from a real image).

I recommend you read this: https://www.masstlc.org/the-more-machines-learn-the-less-we-understand-them-for-now/
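The point about needing training examples can be sketched with a toy nearest-centroid "detector" over made-up two-number feature vectors. Everything here is invented for illustration (the features and numbers bear no resemblance to real deepfake forensics); it only demonstrates that the decision comes entirely from the labeled examples, not from raw compute.

```python
# Toy nearest-centroid classifier: imagine each sample is a pair of
# hypothetical scores like ("noise level", "edge consistency").
# Without labeled fakes there is nothing to put in the second centroid --
# computation alone doesn't tell us what a fake "looks like".

def centroid(samples):
    """Average a list of 2-element feature vectors."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(2)]

def train(real_examples, fake_examples):
    """'Training' here is just memorizing one centroid per label."""
    return centroid(real_examples), centroid(fake_examples)

def classify(model, sample):
    """Label a sample by whichever centroid is closer (squared distance)."""
    real_c, fake_c = model
    dist = lambda c: sum((sample[i] - c[i]) ** 2 for i in range(2))
    return "real" if dist(real_c) < dist(fake_c) else "fake"

real = [[0.1, 0.9], [0.2, 0.8]]   # hypothetical genuine-video features
fake = [[0.9, 0.2], [0.8, 0.1]]   # hypothetical deepfake features
model = train(real, fake)
print(classify(model, [0.85, 0.15]))  # -> fake
```

Swap in different labeled examples and the same code draws a completely different boundary, which is the crux of the quoted point: the detector is only as good as the examples someone collected for it.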
Does anyone have a link of a side by side comparison of a real video and a deep fake video? Not a porn video, just something that shows off the tech.
Bit of a turnaround from "please don't bring up other posts" like 10 minutes ago.