Pixel 4 Face Unlock vs. iPhone 11 Face ID Review

By Vector
Aug 14, 2021

Sponsored by Brilliant: get smarter every day and save 20% at brilliant.org/vector.

Back in 2017, Apple introduced Face ID on the iPhone X. It was the first real biometric facial geometry identification scanner. It couldn't do multiple registrations like Touch ID, but what it did do, it did even better, including and especially making authentication feel almost transparent. Now Google has just released face unlock on the Pixel 4. The basic biometric facial geometry identification system is pretty much identical to Face ID. It adds some extra hardware and software for extra convenience, but some of that is region-dependent, and it's also missing a key aspect of operational security, at least for now. So which is better, and why? Let's find out. I'm Rene Ritchie, and this is Vector. Both Apple and now

Google have abandoned fingerprint authentication for facial geometry. Yeah, I know, some people really want both, but full-on depth-sensing camera systems are still relatively expensive components, so having that plus an in-display fingerprint sensor that actually works reliably and securely bumps up the bill of materials and the price along with it. Since the iPhone 11 already starts at $699 and the Pixel 4 at $799, and people, often the same people, are already complaining that's too high, all we can do for now is dance with the biometrics they brought us.

Google hasn't published much about how exactly face unlock works, and based on all the reviews I watched and read, they didn't say much of anything about it either. That's in stark contrast to Apple, who did extensive briefings following their introduction event and published white-paper-level details on Face ID shortly thereafter. For the purposes of this video, since Google is using such similar technology, I'm going to assume they're also using a similar process. If and when they choose to, or are pressured to, elaborate, I'll update. Cool?

So you have to register your facial geometry, in other words scan in the data, in order to set up and start using Face ID or face unlock for authentication. Apple's interface for this is really elegant: tap to get started, turn your head, tap again, turn your head again, and you're done.

On the hardware side, flood illuminators cover your face in infrared light so the system has a canvas to work against even in the dark. Then a projector splashes a grid of over 30,000 contrasting dots on your face, along with a device-specific pattern as well, which makes it harder to spoof the system digitally or physically. Next, an infrared camera captures 2D images and 3D depth data to essentially create a model of your facial geometry. Apple crops the images as tightly as possible so they don't keep any extra information about where you are or what's behind you in the frame. Then they encrypt the data and send it over a locked-down hardware channel to the Secure Enclave on the A-series chipset, originally the A11 Bionic, now the A13 Bionic. There, a secure portion of Apple's Neural Engine block transforms it into math but also retains the original data, so the Face ID neural networks can be updated without requiring you to re-enroll your facial geometry each time. Neither the data nor the math derived from it ever leaves the Secure Enclave. It's never backed up and never hits any servers anywhere, ever.
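To make that data flow easier to picture, here's a purely conceptual sketch in Swift. None of these types exist in any Apple SDK, and the real work happens in dedicated hardware and inside the Secure Enclave, well out of reach of app code; this just restates the architecture described above in code form.

```swift
import Foundation

// Conceptual sketch only: invented types modeling the enrollment flow
// described above. Nothing here is Apple API.

struct DepthCapture {
    var infraredFrames: [Data]   // tightly cropped 2D infrared images
    var depthMap: Data           // 3D depth data read from the dot pattern
}

struct EnrolledFace {
    var capture: DepthCapture    // retained so the model can be re-derived
                                 // when the neural networks are updated
    var embedding: [Float]       // the "math" future unlocks are compared against
}

// Everything below this line would live behind the Secure Enclave boundary:
// the capture is encrypted on the way in, and neither it nor the embedding
// is ever backed up or sent to a server.
func enroll(_ capture: DepthCapture,
            deriveEmbedding: (DepthCapture) -> [Float]) -> EnrolledFace {
    EnrolledFace(capture: capture, embedding: deriveEmbedding(capture))
}
```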

And that's it, you're done. Almost: Apple gives you the option of setting up an alternate appearance at any time as well. You do it by going through the registration process a second time, so, for example, you can still use it even if you make yourself up very differently for work, for fun, for personal reasons, for any reason at all.

Google's setup interface is remarkably similar in design but different in implementation. It's not as elegant, but it is more verbose.

They give you lots of text up front detailing both universal issues with facial geometry scanning, like the inability to distinguish between twins or some close relatives, and issues specific to the Pixel, which we'll get to in a minute. Second, you only have to turn your head once, but it's kind of super fussy about how you do it: center your head better, turn less, turn slower. If you follow the directives and just keep on keeping on, it eventually finishes anyway. The Pixel has two infrared cameras, one on each side, which should make for a more robust reading of the dot patterns. Google also has their own Titan M security chip, which should function similarly to Apple's Secure Enclave, and the Pixel Neural Core, which should function similarly to Apple's Neural Engine block. I don't know enough about the silicon architecture to tell if Apple doing everything in a single SoC and Google doing everything in discrete coprocessors makes for any advantages or disadvantages either way, or if it's all just the same functionality.

Google does say they don't store the original images the way Apple does, only the models, but that neither the original images nor the models are sent to Google or shared with any other Google services or apps, which is good, because Google's handling of face data has been controversial at times, to say the least. Now, I really like how Apple's setup seems to be far less sensitive to small deviations in angle and speed. Theoretically, Google's is simpler, but because it can complain more, it can end up taking just as long or longer and can be much more frustrating to complete, especially the first time you go through the process. I love that Google discloses the issues with facial geometry scanning right up front as part of the process. Apple mentioned things like the evil-twin attack on stage when they first announced Face ID, and Google did not.

But who knows how many people saw or remember that? Because it's part of the setup, everyone using it will see it every time they set it up. Both Google and Apple do let you tap through for additional information, with Apple being more verbose here and Google briefer.

When you want to unlock, you wake up your iPhone either by raising it up or tapping on the screen. The accelerometer then fires up the system, and it goes through a process similar to registration. With Face ID, attention detection makes sure your eyes are open and you're actively and deliberately looking at your iPhone.

You can turn this off for accessibility reasons if you absolutely have to, but otherwise it won't unlock without attention, and that helps prevent surprise or incapacitation attacks, where someone else tries to use Face ID to unlock your phone without your consent. The flood illuminator and dot projector then go to work. This time, though, the infrared camera captures only a randomized sequence of 2D images and depth data, again to help counter spoofing attacks. The Neural Engine then converts that to math and compares it to the math from your initial scan. This isn't the simpler pattern matching of fingerprint scans; it requires neural networks to determine whether it's actually your facial geometry or not, including rejecting attempts to fake your facial geometry.
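As a rough illustration of that match step, here's a minimal sketch, again in Swift with invented names. The real matcher is a neural network running inside the Secure Enclave, not a simple similarity threshold; this only shows the shape of the decision: check attention, compare the fresh math against the enrolled math, and release an unlock only on a strong enough match.

```swift
// Illustrative only: a stand-in for the neural matcher described above.
struct FaceEmbedding {
    var values: [Float]   // the "math" derived from a capture
}

func cosineSimilarity(_ a: FaceEmbedding, _ b: FaceEmbedding) -> Float {
    let dot = zip(a.values, b.values).map { $0.0 * $0.1 }.reduce(0, +)
    let magA = a.values.map { $0 * $0 }.reduce(0, +).squareRoot()
    let magB = b.values.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (magA * magB)
}

/// Returns true (the "yes token") only if the user is paying attention and
/// the fresh capture is close enough to the enrolled model. The 0.95
/// threshold is arbitrary, purely for illustration.
func attemptUnlock(enrolled: FaceEmbedding,
                   fresh: FaceEmbedding,
                   attentionDetected: Bool) -> Bool {
    guard attentionDetected else { return false }   // attention requirement
    return cosineSimilarity(enrolled, fresh) >= 0.95
}
```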

If you're not familiar with how machine learning and neural networks work, imagine Tinder for computers: yes, no, no, no, yes, yes, no, not hot dog. Something like that. They're not coded like traditional programs; they're trained, more like pets, and once you let them loose, they can carry on learning without you. They're also adversarial. So imagine a Batman network trying to let you, and only you, into your phone, and a Joker network continuously trying to find new ways to get past the Batman network, continuously making the Batman network better. If the math matches, a yes token is released and you're on your way. If it doesn't, you need to try again, fall back to your passcode, or stay locked out of your phone.
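Here's a deliberately cartoonish sketch of that Batman-versus-Joker feedback loop, with made-up numbers. Real face-matching networks are trained very differently, but the structure, an attacker probing the matcher and every successful probe being used to tighten it, is the adversarial idea.

```swift
import Foundation

// Toy only: a matcher with a tolerance, an attacker guessing candidates,
// and a feedback loop that tightens the matcher whenever a fake gets through.
let enrolled: [Double] = [0.2, 0.7, 0.4]   // pretend "math" for the real face
var tolerance = 0.5

func matcherAccepts(_ candidate: [Double]) -> Bool {
    let distance = zip(enrolled, candidate)
        .map { ($0.0 - $0.1) * ($0.0 - $0.1) }
        .reduce(0, +)
        .squareRoot()
    return distance < tolerance
}

// The "Joker": keep throwing random candidates at the "Batman" matcher.
for round in 1...1_000 {
    let fake = (0..<3).map { _ in Double.random(in: 0...1) }
    if matcherAccepts(fake) {
        tolerance *= 0.9   // a fake got through, so make the matcher stricter
        print("Round \(round): fake accepted, tolerance tightened to \(tolerance)")
    }
}
```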

Face ID may also store the math from successful unlock attempts, and even from unsuccessful attempts when you immediately follow up by entering the passcode. That's to help the system learn and grow with changes to your face or look that might happen over time, even the more dramatic ones like shaves, haircuts, even injuries. After it's used that data to augment a limited number of subsequent unlocks, Face ID discards it, and the augmentation cycle potentially repeats again and again.
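Here's a small sketch of what that adaptive cycle could look like, with invented names and an arbitrary use limit; Apple hasn't published the actual bookkeeping, so treat this purely as a model of the behavior described above.

```swift
// Hypothetical bookkeeping for the adaptive augmentation cycle.
struct AugmentedTemplate {
    var embedding: [Float]
    var remainingUses: Int
}

struct FaceTemplateStore {
    var enrolled: [Float]                       // math from the original enrollment
    var augmentations: [AugmentedTemplate] = []

    /// Keep a fresh capture only if it matched, or if the user immediately
    /// vouched for it by entering the correct passcode.
    mutating func record(embedding: [Float], matched: Bool, passcodeFollowed: Bool) {
        guard matched || passcodeFollowed else { return }
        augmentations.append(AugmentedTemplate(embedding: embedding, remainingUses: 8))
    }

    /// Each unlock burns down the temporary templates; expired ones are discarded.
    mutating func consumeAfterUnlock() {
        for index in augmentations.indices {
            augmentations[index].remainingUses -= 1
        }
        augmentations.removeAll { $0.remainingUses <= 0 }
    }
}
```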

Because the technology was so new at the time, Apple focused on making it as consistent and reliable as possible from the right-side-up portrait orientation and about 45 degrees off-axis either way; that includes the physical angling of the TrueDepth camera system itself. They've since made it work 360 degrees on the iPad Pro, but sadly haven't seen fit to bring that functionality to the iPhone yet, leaving it far more frustrating to unlock while lying down, for example. Face ID also literally only unlocks the phone; to open it, you have to take the second step of swiping up from the very bottom of the lock screen. Swipe up too high and you get notifications instead, which is bewilderingly inconsistent with the swipe down from the top-left corner that reveals notifications when the iPhone is open and unlocked.

Face unlock on the Pixel is again very similar in the broad strokes but different in the details, thanks in large part to Motion Sense, originally called Project Soli.

It's an actual Daredevil-style radar-sensing chip that can detect when you're reaching for your Pixel and fire up the face unlock system, so it's ready to go even before you start to lift or tap. It also works from any angle, like the iPad's: you can unlock it even if you pick it up upside down or you're lying down at the time. Unfortunately, Google either couldn't or wouldn't enforce attention for face unlock at launch.

So it currently works even if your eyes are closed, and that means it is susceptible to surprise or incapacitation attacks, in other words if you're asleep, restrained, unconscious, or merely surprised. Google has said they'll add the feature in a future update, but it might take a while. Again, Google hasn't elaborated on their specific process, but it's safe to assume the flood illuminator and dot projectors fire, the dual infrared cameras capture all or some of your facial geometry, and then it's sent to the Titan M security chip for comparison to the model stored there. At that point, if they match, the Pixel unlocks and opens by default. If you'd rather see your lock screen instead of going back to what you were previously doing, you can choose that option in settings, and I really, really like that that's an option.

There are two different kinds of workflows. One is all about notifications: you just want to see your lock screen and everything that might be important, but you don't want to dive into, and maybe get distracted by, all the apps and other stuff on your phone. The iPhone is good at that because Face ID, well, it doesn't open the phone.

It does, though, expand recent notifications. The second type of workflow is when you don't care about notifications at all and just need to get into your phone and get something done as fast as possible. The Pixel is again great for this, because you can choose to go right into your phone. It's not perfect, because it can't read your mind, determine which workflow you want at any given moment, and just let you do either one; you have to pick the one you use more often and stick with it until you change it. But at least it lets you make that change.

The iPhone doesn't have that option at all. Not requiring open eyes and attention, though, just feels irresponsible on Google's part. Yes, biometrics are more username than password, and yes, fingerprints are subject to the same kinds of attacks, although you have ten potential fingers and only one potential face. But any security expert worth their credentials will tell you defense is done in depth: you throw as many roadblocks and bumps into the attack path as possible. That's your job; you only have one job. For now, Google is pointing anyone concerned towards their lockdown option.

You have to enable it in Settings > Display > Advanced > Lock screen display, then tap Show lockdown option. Once you've done that, you can hold down the power button and then tap Lockdown to temporarily disable biometrics. Even here, though, Apple is more elegant: to temporarily disable biometrics at any time on the iPhone, you don't have to flip any settings, you just squeeze the power and volume buttons at the same time and boom, you're locked down.

Theoretically, Motion Sense should allow you to unlock your Pixel without having to even touch it, and it does. Practically speaking, though, the radar field around the Pixel is so short-range that it doesn't make a huge difference right now, unless your hands are covered in gravy or whatever. That's still a legit difference, but it depends entirely on where you live. Motion Sense operates in the 60GHz band, and that hasn't been approved in a lot of geographies, including India. Live in or travel to one of those places and Motion Sense, and all the advantages that come with it, simply turn off.

On both the iPhone and Pixel, you can also trigger unlock from a distance by invoking Siri or Google Assistant, which I personally like better, and that even gets around the iPhone's lack of simultaneous unlock and open. Both the iPhone's Face ID and the Pixel's face unlock are available to developers, so they can use them to secure apps, from password managers to banking clients to everything in between. Apple was really clever in how they implemented this: when they initially rolled out the Touch ID application programming interface, or API, they made it less specifically about fingerprints and more generally about biometrics, for developers and users. They abstracted most of the differences away into a single LocalAuthentication framework, so aside from adjusting text strings to properly label Face ID versus Touch ID functionality, it just worked for many apps, if not most apps, right away.
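For a sense of how small that surface is, here's roughly what a LocalAuthentication call looks like from an app's side. LAContext, canEvaluatePolicy, and evaluatePolicy are the framework's real API; the wrapper function and its completion handler are just mine for illustration.

```swift
import Foundation
import LocalAuthentication

// The same code path covers Touch ID and Face ID; the framework decides
// which biometric the device actually has.
func authenticateUser(reason: String, completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Make sure some form of biometry is available and enrolled.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)
        return
    }

    // context.biometryType tells you whether to label the UI "Face ID"
    // or "Touch ID"; the authentication call itself is identical.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: reason) { success, _ in
        DispatchQueue.main.async {
            completion(success)
        }
    }
}
```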

With face unlock there's a greater degree of complexity: for apps to work, they have to adopt Android's BiometricPrompt API. If an app is using the older API, it will only look for fingerprint scans, not facial geometry scans at all, and so it just drops you back to password mode, yes, like an animal. So currently only a handful of Android apps support face unlock, although Google is going to make it a requirement, so that should change over time, hopefully rapidly.

It's tempting to call facial biometric identity scanning a draw between Apple and Google, iPhone and Pixel, and the truth is both do some things I really wish the other would adopt as well.

Apple's setup is slicker but requires two steps; Google's complains so much, though, that it can make its one step feel three steps long. Apple explained Face ID better in its introduction and has since detailed it to a high degree in white papers, where Google's remains something of a black box, but one that does disclose its limitations every time you set it up.

I'd love to see a white paper from Google and a more-info button from Apple during setup; that would handle disclosure without mucking up the experience. Neither the lack of 360-degree scanning on the iPhone nor the missing attention requirement on the Pixel will be a problem for most people most of the time, but they shouldn't be problems for anyone any of the time. In an ideal world, the iPhone would work like the iPad and Pixel and just unlock regardless of the orientation, and the Pixel would work like the iPhone and iPad and require you to be looking at it before it will unlock. Same for the iPhone and having the option to unlock and open all at once. Again, Google has a theoretical advantage thanks to the Motion Sense radar chip where available, but their overall process hasn't been disclosed or tested to the extent that Apple's has.

We just don't know how secure, private, and adaptive their neural networks really are, ethical issues over the way they were trained aside. Google being Google, we can assume the very best, but no one is hammering on it the way Face ID was hammered on at launch, you know, every blogger, every YouTuber and their hired VFX team, at least not yet. And they really should. Go hard on Apple, absolutely, pretty please; it's great for Apple customers. But go hard on everyone else as well. That's better for all customers.

Now, if you want to get into the technology behind Face ID and face unlock, check out Brilliant. Brilliant is a problem-solving website that uses a hands-on approach, with storytelling, code writing, interactive challenges, and so much more.

There are over 50 interactive courses for you to dive into, including brand new interactive content that makes solving puzzles and challenges even more fun and more hands-on. For example, from the logic course: if Gusto is not in the center, Jimmy is not on the left, and Buffy is not on the right, what order are they in? They're all built for ambitious and curious people, people who want to better understand the world and help shape it. To support Vector and get unlimited access to Brilliant's courses and daily challenges, just head on over to brilliant.org/vector and get 20% off their annual premium subscription. Thanks, Brilliant, and thanks to all of you for supporting the show.

So that's part one of my in-depth iPhone versus Pixel deep-dive comparison. Now I want to hear from you: hit like if you do, hit subscribe if you haven't already, flood-illuminate that bell gizmo so YouTube will actually let you know when the next show goes live, and then hit up the comments and let me know what you think about Face ID versus face unlock. Both good, one better, or neither anywhere close to as good as old-fashioned fingerprints?

Thanks for watching. See you next video.

