FriskyCanuck

Member
Oct 25, 2017
4,067
Toronto, Canada
Scary times we live in, the disinformation campaigns we've already seen are just the tip of the iceberg.

https://apnews.com/21fa207a1254401197fd1e0d7ecd14cb
WASHINGTON (AP) — Hey, did my congressman really say that? Is that really President Donald Trump on that video, or am I being duped?

New technology on the internet lets anyone make videos of real people appearing to say things they've never said. Republicans and Democrats predict this high-tech way of putting words in someone's mouth will become the latest weapon in disinformation wars against the United States and other Western democracies.

We're not talking about lip-syncing videos. This technology uses facial mapping and artificial intelligence to produce videos that appear so genuine it's hard to spot the phonies. Lawmakers and intelligence officials worry that the bogus videos — called deepfakes — could be used to threaten national security or interfere in elections.
"I expect that here in the United States we will start to see this content in the upcoming midterms and national election two years from now," said Hany Farid, a digital forensics expert at Dartmouth College in Hanover, New Hampshire. "The technology, of course, knows no borders, so I expect the impact to ripple around the globe."

When an average person can create a realistic fake video of the president saying anything they want, Farid said, "we have entered a new world where it is going to be difficult to know how to believe what we see." The reverse is a concern, too: people may dismiss genuine footage, say of a real atrocity, as fake in order to score political points.

Realizing the implications of the technology, the U.S. Defense Advanced Research Projects Agency is already two years into a four-year program to develop technologies that can detect fake images and videos. Right now, it takes extensive analysis to identify phony videos. It's unclear if new ways to authenticate images or detect fakes will keep pace with deepfake technology.
Deepfakes are so named because they utilize deep learning, a form of artificial intelligence. They are made by feeding a computer an algorithm, or set of instructions, along with lots of images and audio of a certain person. The computer program learns how to mimic the person's facial expressions, mannerisms, voice and inflections. If you have enough video and audio of someone, you can combine a fake video of the person with fake audio and get them to say anything you want.

So far, deepfakes have mostly been used to smear celebrities or as gags, but it's easy to foresee a nation state using them for nefarious activities against the U.S., said Sen. Marco Rubio, R-Fla., one of several members of the Senate intelligence committee who are expressing concern about deepfakes.

A foreign intelligence agency could use the technology to produce a fake video of an American politician using a racial epithet or taking a bribe, Rubio says. They could use a fake video of a U.S. soldier massacring civilians overseas, or one of a U.S. official supposedly admitting a secret plan to carry out a conspiracy. Imagine a fake video of a U.S. leader — or an official from North Korea or Iran — warning the United States of an impending disaster.
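
For anyone wondering what's actually under the hood here: below is a minimal, purely illustrative PyTorch sketch of the shared-encoder/two-decoder autoencoder idea the early face-swap tools were built around. Everything in it (class names, tensor shapes, the random tensors standing in for aligned face crops) is made up for illustration; real tools add face alignment, far bigger networks, and often adversarial losses.

import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a 64x64 face crop to a shared latent code."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face from the latent code; one decoder per identity."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()   # one decoder per identity
params = list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters())
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.L1Loss()

# Random tensors stand in for batches of aligned 64x64 face crops of person A and person B.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(100):
    opt.zero_grad()
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    opt.step()

# The "swap": encode frames of person A, decode them with person B's decoder.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))

The trick is that both identities share one encoder, so a latent code extracted from a frame of person A can be rendered through person B's decoder, keeping A's pose and expression with B's face.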
 

Taki

Attempt to circumvent a ban with an alt account
Banned
Oct 25, 2017
5,308
Get ready for fake videos of politicians in the run-up to the 2018 midterms.
 

The Albatross

Member
Oct 25, 2017
39,361
I'm usually not technophobic, but this is something I am pretty worried about. For people looking to confirm their biases, and for social media morons, it's going to be very easy to get tricked. People are already duping themselves with selectively chopped-up, obviously bogus videos, like that college football coach who reposted an obviously doctored video of Obama admitting to being part of a secret new world order...

If you're vigilant, it'll be hard to be tricked. But the problem is a lot of people aren't vigilant and want to be tricked.
 

samoyed

Banned
Oct 26, 2017
15,191
Sigh, this is why people thinking things will go back to "normal" after Trump are out of their minds.

It should be a criminal offense for a news corporation or advertising agency to air a deepfake of a politician. Dunno what you're even going to do in the online arena.
 

Nude_Tayne

Member
Jan 8, 2018
3,684
earth
Scary times we live in, the disinformation campaigns we've already seen are just the tip of the iceberg.

https://apnews.com/21fa207a1254401197fd1e0d7ecd14cb
I wonder if one day down the line the creators of this technology will have an Oppenheimer moment... "I am become death, destroyer of truth."

You can't stop progress and technological development, that much is a given, but I just don't see much in the way of positive applications for this technology, while it's incredibly easy to imagine how absolutely destructive it will be in the already-present post-truth era. It's not going to be pretty.

In before "era hates new tech!"
 

TheMan

Banned
Oct 25, 2017
3,264
The article is a bit late, but yeah. Some of those deepfake porn videos were very well done. Anyone with half a brain instantly saw that the tech has applications that go way beyond porn. The future is gonna be a weird place...
 

Taki

Attempt to circumvent a ban with an alt account
Banned
Oct 25, 2017
5,308
I'm usually not technophobic, but this is something I am pretty worried about. For people looking to confirm their biases, and for social media morons, it's going to be very easy to get tricked. People are already duping themselves with selectively chopped-up, obviously bogus videos, like that college football coach who reposted an obviously doctored video of Obama admitting to being part of a secret new world order...

If you're vigilant, it'll be hard to be tricked. But the problem is a lot of people aren't vigilant and want to be tricked.

And the Russians will only need to trick a small % of voters across a handful of swing states. They can selectively target and influence just enough voters in those states, or at the very least introduce distrust of candidates among those voters.

I guarantee there are fake videos of Kirsten Gillibrand and Kamala Harris being tested and fine-tuned right now for future deployment.
 

Broken Joystick

The Fallen
Oct 27, 2017
1,932
England
Cool, more ways for Trump to dismiss every single shitty thing he says. "It's deepfake news!"

There should be a focus on somehow watermarking these videos so people can check whether a video is fake or not. Obviously, I know next to nothing about how this works and how these are made, but some of them are ultra convincing at first glance.
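
As a toy illustration of that watermark idea, and nothing more than that: you could imagine hiding a known bit pattern in the least-significant bits of a frame and checking for it later. The tag and function names below are made up, and a real forensic watermark would have to survive compression, cropping and re-encoding, which this does not.

import numpy as np

TAG = np.frombuffer(b"SYNTHETIC", dtype=np.uint8)
TAG_BITS = np.unpackbits(TAG)  # 72 bits to hide

def embed_tag(frame):
    """Write TAG_BITS into the least-significant bits of the first pixels."""
    out = frame.copy()
    flat = out.reshape(-1)
    flat[:TAG_BITS.size] = (flat[:TAG_BITS.size] & 0xFE) | TAG_BITS
    return out

def has_tag(frame):
    """Check whether the least-significant bits of the first pixels spell the tag."""
    flat = frame.reshape(-1)
    return np.array_equal(flat[:TAG_BITS.size] & 1, TAG_BITS)

frame = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)  # stand-in for a video frame
marked = embed_tag(frame)
print(has_tag(frame), has_tag(marked))  # almost certainly False, then True

The obvious catch is that it only flags videos whose creators chose to mark them, so it helps honest tools label their output more than it helps catch bad actors.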
 

samoyed

Banned
Oct 26, 2017
15,191
How does this work with audio?
They can apply the same machine learning to various audio clips of someone to synthesize their speech patterns and accent, I believe. Currently they use existing audio or impersonation.



Because we have so many digitized recordings of major political figures out there, the more prominent someone is, the more easily they can be faked. I fully expect videos of Obama as a Muslim some time in the next 4 years.
 

UnluckyKate

Member
Oct 25, 2017
10,658
I heard about this on the subreddit doing porn fakes with celeb faces. They were talking about having a watermark to clearly distinguish deepfakes from real videos to avoid the problem, but Reddit outright closed the sub and the community went elsewhere, I guess...

As for porn, I guess it's always someone's fetish.

But for political fakes, this is scary yes.
 

Mass Effect

One Winged Slayer
Member
Oct 31, 2017
16,972
I did a bit of research on this for one of my classes, and it made me realize that we might be fucked when this tech becomes mainstream.
 

Fubar

Member
Oct 25, 2017
2,738
I saw GIFs/short videos of some of these in the past and thought they were innocent enough. The Obama video above, some others I don't remember.

But seeing a porn video a few days ago (full video, 20+ minutes long) where the actress's face was entirely replaced with Anna Kendrick's was what pushed me over the edge. There were some angles that were bad, and the face just looked a bit out of focus, but 95% of it looked legitimate enough.

When people put in the effort and perfect this technology, the results will be terrifying.
 

AlsoZ

Member
Oct 29, 2017
3,003
Honestly, this is probably one of the most dangerous technologies ever conceived.
Such a large part of the population is willing to believe a lot of things without a shred of fact-checking as long as it fits their worldview.
 

uncelestial

Banned
Oct 25, 2017
4,060
San Francisco, CA, USA
All Russia will need to do is target voters in swing states with campaigns of false videos like this, and enough of them will be swayed to get the results they want. Social media advertising platforms (and digital advertising platforms in general) make such targeting absolutely push-button easy.
 

Taki

Attempt to circumvent a ban with an alt account
Banned
Oct 25, 2017
5,308
P.S. How will this affect the use of video and audio evidence in court?
 

jph139

One Winged Slayer
Member
Oct 25, 2017
14,491
Most of the reference material we have now is porn, and there's some stuff out there that looks close to real. It's clearly not - you know you're looking at something "off" - but it's close.

The fact that complete amateurs, in their free time, with a brand-new technology, can get close is insane. You put political or criminal money behind this sort of thing and it's gonna change the world.
 

Mona

Banned
Oct 30, 2017
26,151
Most of the reference material we have now is porn, and there's some stuff out there that looks close to real. It's clearly not - you know you're looking at something "off" - but it's close.

The fact that complete amateurs, in their free time, with a brand-new technology, can get close is insane. You put political or criminal money behind this sort of thing and it's gonna change the world.

yep, crazy times ahead
 

Deleted member 14002

User requested account closure
Banned
Oct 27, 2017
5,121
They can apply the same machine learning to various audio clips of someone to synthesize their speech patterns and accent, I believe. Currently they use existing audio or impersonation.



Because we have so many digitized recordings of major political figures out there, the more prominent someone is, the more easily they can be faked. I fully expect videos of Obama as a Muslim some time in the next 4 years.


So they just took the machine learning used to construct the face masks and applied it to audio sampling. That's super cool/interesting.

I wish I had the extra cash for a new video card atm. I'd really like to play with this.

Anyone know if the tools used can work with distributed processing?
 

Chamaeleonx

Banned
Oct 27, 2017
2,348
I wonder if the effect is diminished if you have seen the person in real life and can identify outlier things the video mentions. For example, Obama suddenly becoming a Nazi would seem pretty strange. I would imagine you have to keep a close eye on how consistent someone is.

Obviously, reverse engineering will exist as mentioned in the BBC video.
 

Book One

Member
Oct 25, 2017
4,859
At a time when certain parties push hard on the narrative that the free press and credible journalists are the 'enemy,' conspiracy theory loonies have become an actual source of news for some people, and the predominant social media platforms do nothing to stop the spread of fake news sources...

Yeah, shit's gonna get real bad
 

samoyed

Banned
Oct 26, 2017
15,191
And the real problem isn't just identifying deepfakes (that'll matter for court proceedings and other official settings) but convincing people of what's real and what's fake even when it goes against their preconceptions.

We can barely convince people that someone did or did not say something; this is going to get much worse when people have video "proof".
 

SuperBanana

Member
Oct 28, 2017
3,781
Some of the porn deepfakes that popped up when this first became a thing were so real-looking they made the digital faces in Hollywood movies look like cheap shit. It was really kind of amazing.
 

Deleted member 2507

User requested account closure
Banned
Oct 25, 2017
3,188
I wonder if in the future people will need some kind of authentication token, a digital signature, whenever they appear in video or audio, to confirm it is not a faked vid.
Of course, this leads to a situation where one could leave it out and claim the vid was faked.
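
The building blocks for that already exist in the form of public-key signatures. Here's a minimal sketch of the idea in Python, assuming the third-party cryptography package and a hypothetical video file:

import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # kept secret by the person (or their studio/camera)
public_key = private_key.public_key()       # published so anyone can verify

def file_digest(path):
    """SHA-256 hash of the video file's bytes."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).digest()

def sign_video(path):
    """Signature the speaker releases alongside the video."""
    return private_key.sign(file_digest(path))

def is_authentic(path, signature):
    """True only if the file is bit-for-bit what the keyholder signed."""
    try:
        public_key.verify(signature, file_digest(path))
        return True
    except InvalidSignature:
        return False

# sig = sign_video("speech.mp4")          # done once by the speaker at release time
# print(is_authentic("speech.mp4", sig))  # False if even one frame was altered

A valid signature only proves the file hasn't changed since the keyholder signed it, though; as you say, someone could simply decline to sign real footage and then claim it was faked.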
 

Lwyn

Banned for use of an alt-account
Banned
Jul 2, 2018
168
Some of the porn deepfakes that popped up when this first became a thing were so real-looking they made the digital faces in Hollywood movies look like cheap shit. It was really kind of amazing.

A very skilled person can outdo even the most technologically advanced companies.
 

Otnopolit

Member
Oct 28, 2017
1,604
Oddly enough, Twitter is going to be a big part of politicians and celebrities protecting their credibility by publicly saying "I was not there, that is not me speaking." I feel we will ultimately need young legislators who understand the dangers to help us through this with laws and safeguards, so we can be fully protected from the alt-realities that will be created.
 

Nude_Tayne

Member
Jan 8, 2018
3,684
earth
I wonder if the effect is diminished if you have seen the person in real life and can identify outlier things the video mentions. For example, Obama suddenly becoming a Nazi would seem pretty strange. I would imagine you have to keep a close eye on how consistent someone is.

Obviously, reverse engineering will exist as mentioned in the BBC video.
None of this matters when all it will take is some shitty faked video to spread on Facebook that people will give 10 seconds of attention to and think, "Yep, this Alex Jones video is real, Ocasio-Cortez definitely said she wants to turn America into a communist state. Aww, a cat video!"
 

AlteredBeast

Don't Watch the Tape!
Member
Oct 27, 2017
4,791
This was 24 Season 3 (4? 5? I can't remember), but I knew the technology to map faces and voices would be perfected sometime, and we are almost assuredly less than 5 years away from that at this point.

You think Racist Grandma is busy sending out Facebook posts now? Wait until they have VIDEO PROOF of PizzaGate/conspiracies to cover up crimes/etc. Russia is almost assuredly going to be at the forefront of deploying this technology, and our techno-libertarians are going to lie back and let it happen. They aren't just the biggest enemy of public mental health; they are also going to be complicit in taking away more of our freedoms, as they were in the 2016 election.

Absolutely terrifying stuff.
 
Oct 25, 2017
5,846
Honestly I think people concerned about this are getting freaked out about nothing, because we already live in a world where Trump denies things he said and that are still on his Twitter account with no repercussions. The people who want to believe what they want already do. This just makes it easier to manufacture their fodder, it doesn't materially change the calculus.
 

mael

Avenger
Nov 3, 2017
17,138
How does this not fuck the entire world?
Oh it does, but the US is much more vulnerable to this kind of shit.
They already got an election stolen through social engineering after all.
I can already see ways that could be countered in France, for example. In the US? You're fucked, man.
 

PMS341

Attempted to circumvent ban with alt-account
Banned
Oct 29, 2017
6,634
Honestly I think people concerned about this are getting freaked out about nothing, because we already live in a world where Trump denies things he said and that are still on his Twitter account with no repercussions. The people who want to believe what they want already do. This just makes it easier to manufacture their fodder, it doesn't materially change the calculus.

Making it easier does change things, though. Technological improvements to videos like these, as pointed out in this thread, will only make those who believe what they want feel more empowered, since they'll have "evidence" for the claims they are seeking.
 

Tahnit

Member
Oct 25, 2017
9,965
This is the kind of tech that is going to need regulation. It's just too dangerous.