"To get into his phone..."
"...I will become him."
"If you're Sean Archer, then I must be," *click* "Login successful," "Castor Troy"
"To get into his phone..."
If you have your target in a position where you can get an image of their face with any of these sorts of cameras, you can probably just grab their phone from them and hold it up to their face to unlock it anyway.
You know, at that point, anyone willing to go through that effort presumably knows a lot about you lmao
Neat, that's the stuff I got, so I'm safe, I think.

Fingerprint scanners don't work off of light and darkness; they are capacitive touch sensors with a resolution fine enough to register the ridges of your fingerprint while missing the valleys. The difference in capacitance between ridges (skin touching the sensor) and valleys (an air gap) forms a 2D image for the scanner to compare against, with no optical capture involved. Thus, a 3D-printed fingerprint would need to conduct like skin for it to work.
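A toy sketch of the capacitive idea described above (the grid size, noise level, and threshold are invented for illustration, not real sensor values):

```python
import numpy as np

def capture(finger_conductive: bool, rng=np.random.default_rng(0)):
    """Return a binary ridge map from simulated per-cell capacitance.

    A capacitive sensor reads a grid of capacitance values. Conductive skin
    ridges touching a cell shift its reading; valleys (air gaps) barely do.
    A non-conductive 3D-printed replica shifts almost nothing, so no print
    image forms at all.
    """
    baseline = 1.0 + 0.01 * rng.standard_normal((8, 8))  # empty-sensor noise
    ridge_pattern = rng.random((8, 8)) > 0.5             # cells where ridges touch
    if finger_conductive:
        reading = baseline + 0.5 * ridge_pattern         # ridges shift capacitance
    else:
        reading = baseline                               # plastic: no shift at all
    return reading > 1.25                                # threshold into ridge/valley

real = capture(finger_conductive=True)
fake = capture(finger_conductive=False)
print(real.any())   # True  - a ridge pattern is detected
print(fake.any())   # False - the printed replica reads as an empty sensor
```

Real sensors are far denser and noisier than this 8x8 grid, but the core point stands: the image comes from conductivity, not appearance.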
This kind of stuff gets ridiculous. I feel pretty reassured knowing my sexts are safe until someone makes a 3D model of my face.
I mean, people aren't keeping top secret documents or w/e on their iPhones are they?
It plays into situations like this: https://en.wikipedia.org/wiki/FBI–Apple_encryption_dispute
Face ID scanning is inherently less secure than a 4-digit PIN, for all the reasons explained in this topic. They don't even need you present to 3D scan your face to gain this type of unlock; an AI using multiple pictures of a subject could likely do the same, i.e. reconstruct the shape of your face, then 3D print it to be depth sensed.
This isn't a problem of security regarding your friends drinking and grabbing your phone at the bar.
But you can train FaceID to only recognize you when you stick out your tongue or give a weird face.
That's true, machine learning can recreate a face more easily than a fingerprint because of the pictures available online. But overall, my view is that Apple isn't responsible for making the data that secure. If they can protect your phone from snooping coworkers or family members, and if your phone gets pickpocketed it's secure enough that some amateur can't use your credit card through Apple Pay, those things are good enough. Anyone can skim your debit card at a gas station, or pinch your wallet out of your pocket and empty your cards. With features like Face ID, Touch ID, Find My iPhone, and remotely locking and sounding an alarm, the phone is already a thousand times more secure than a wallet.
If I wanted to keep my phone secure because I'm a spy or something, I would use a custom 16-character passcode with numbers and special characters and call it a day. Try a brute-force attack on a 16-character anything-goes password lol. It's going to take thousands of years even in parallel. Face ID and Touch ID are enough for 99.9999% of the population.
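The back-of-envelope math behind that claim takes a few lines to check; the 95-character alphabet and the guess rate of a trillion per second are my assumptions, not figures from the thread:

```python
# Rough brute-force estimate for a 16-character anything-goes password.
# Assumptions (mine): ~95 printable ASCII characters, and an attacker
# testing 1e12 guesses per second offline in parallel.

ALPHABET = 95           # printable ASCII characters
LENGTH = 16             # 16-character passcode
GUESSES_PER_SEC = 1e12  # assumed massively parallel attacker

keyspace = ALPHABET ** LENGTH
avg_seconds = keyspace / 2 / GUESSES_PER_SEC   # on average, half the keyspace
avg_years = avg_seconds / (365.25 * 24 * 3600)

print(f"{keyspace:.2e} combinations, ~{avg_years:.1e} years on average")
```

Even at that generous guess rate, the average comes out to hundreds of billions of years, so "thousands of years" actually undersells it.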
Apple, and their customers, disagree. That's why this is coming to light: one of the claims about the move to Face ID was that it is more secure than Touch ID. Groups reporting how they can break said security systems is important, because by Apple's own claims, the privacy of their customers is their responsibility.
What about when their claims are nonsense and fail under scrutiny?
I assume the iPhone X forces a pin entry after a certain number of failed FaceID attempts right?
That's not really the point though, the point is that it isn't as secure as Apple claims. It's shameful, imo.
"Oh well" seems like a pretty poor response to this, if the video is real. This isn't about someone trying to unlock your phone -- the implications are there for government, celebrities, court cases, etc.
If only there was some sort of more secure method Apple could have also included with the phone... I can't quite put my fingerprint on what it could have been.
You'd think the "hackers" would be willing to answer basic questions about what they did.
If you know how this technology works, then what they did is obvious.
Yeah? I've read the article, and the kinds of questions they are asking betray a complete lack of understanding of how these technologies work. Like when they cite Apple claiming that the phone sources additional training data for the convolutional neural network from repeated scans of the face? Uh, so? Of course it does; that's how a deep learning application like this would work. That in no way would disqualify a mask.
The fact that this is required is what made me not want this phone. Facial recognition is not effective in these times.
This guy knows what's up.
But what if I make a mask of someone else and use the mask as my face ID. Then no one would know who I made the mask of cause they think it's me they're trying to make a mask of
They want to know whether it works in a real situation where learning is a factor, or only in a fake situation where the Face ID is taught the mask over time, or even set up to use the mask to begin with. Dismissing these questions is ridiculous.
They are dismissed because they misunderstand the point of learning in the first place. The reason the system "learns" is that a person's face is not unchanging, nor is it scanned from the same exact angle each time, which is what necessitates convolution in the first place. The training data it receives makes varying captures of a face appear more uniform, from the perspective of the phone, to make identification easier, which also raises the rate of false identifications. Using a mask wouldn't make this process easier.
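A toy sketch of the trade-off described above, assuming a distance-threshold match on face embeddings (the vectors and thresholds here are made up; real Face ID internals are not public):

```python
import numpy as np

def accepts(probe, enrolled, threshold):
    """Accept if the probe embedding is within `threshold` of the enrolled one.

    Training on varied captures of a changing face effectively widens this
    threshold - which also widens the net for impostors (false accepts).
    """
    return float(np.linalg.norm(probe - enrolled)) < threshold

enrolled    = np.array([1.0, 0.0, 0.0])  # embedding captured at enrollment
owner_today = np.array([0.9, 0.1, 0.0])  # same face, slight change (angle, beard)
impostor    = np.array([0.5, 0.5, 0.1])  # a different but similar face

strict, loose = 0.2, 0.8
print(accepts(owner_today, enrolled, strict))  # True: owner still matches
print(accepts(impostor, enrolled, strict))     # False: impostor rejected
print(accepts(impostor, enrolled, loose))      # True: looser match = false accept
```

The loose threshold is what tolerance to a changing face buys you; a mask only needs to land inside that widened net, it doesn't need the learning step at all.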
Which, again, betrays a lack of understanding of how the process works in the very first place. The questions they are asking aren't damning.
They also want to know how they created the mask (did the owner help? Pictures? What?).
No one is arguing about what's theoretically possible. Obviously an accurate copy of a face will work. What people want to know is if they need to worry about this in reality.
Again, it doesn't matter because training without a mask would make the ID system more prone to false ID, not less.
How they made the mask is also not important, especially when you know what these types of systems check for. As time goes on, making these kinds of masks with AI will become more trivial and more accurate.
In reality, yes, it works. That's why people are talking.