So Google has been leveraging a lot of computational photography and machine learning to make its smartphone cameras better. We saw this in last year's Pixel 2 and Pixel 2 XL, and we're seeing even more of it this year with the Pixel 3 and Pixel 3 XL. There are a handful of new features in the Pixel 3 that make this camera better than last year's, so let's talk about what those features are, how they work, and how you can benefit from them. Before I dive in, you're going to notice one very common theme among all of these features: they all capture multiple photos and then leverage machine learning to achieve the desired result. One of the new features Google talked about during its press event is called Top Shot, and Top Shot isn't necessarily new. We actually saw Sony do something very similar with Predictive Capture. Basically, the camera takes a series of photos and then picks the one it thinks looks best. There's a bunch of criteria it looks at when it does this, like whether somebody is blinking, whether they're looking directly at the camera, or whether they're smiling, and that's how it goes about selecting the recommended photo. But you also have the option of picking any of the other photos it took.
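To make that selection idea concrete, here is a minimal, purely illustrative sketch of scoring frames in a burst and picking the best one. The frame attributes and weights below are my own assumptions for illustration; Google's actual Top Shot uses a learned on-device model, not a hand-tuned score like this.

```python
# Illustrative burst-selection sketch (not Google's actual Top Shot model).
# Each frame carries simple flags standing in for what a real face-analysis
# model would detect: open eyes, a smile, and gaze toward the camera.

def score_frame(frame):
    """Score a frame: penalize closed eyes, reward smiles and camera gaze."""
    score = 0.0
    if frame["eyes_open"]:
        score += 1.0
    if frame["smiling"]:
        score += 1.0
    if frame["looking_at_camera"]:
        score += 0.5
    return score

def pick_top_shot(burst):
    """Return the index of the best-scoring frame in the burst."""
    return max(range(len(burst)), key=lambda i: score_frame(burst[i]))

burst = [
    {"eyes_open": False, "smiling": True,  "looking_at_camera": True},   # blink
    {"eyes_open": True,  "smiling": True,  "looking_at_camera": True},   # keeper
    {"eyes_open": True,  "smiling": False, "looking_at_camera": False},
]
best = pick_top_shot(burst)  # index of the recommended photo
```

Because every frame in the burst is kept, the user can always override the recommendation and pick a different index, just as the feature allows.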
The other benefit of Top Shot is that if you don't press the shutter button at exactly the right moment, you may still have captured the moment you wanted anyway, because the camera is taking a series of photos instead of just one. The next feature Google made a pretty big deal about is called Super Res Zoom. Super Res Zoom is Google's way of delivering a 2x zoom without having a secondary telephoto lens, and it takes advantage of the small, tiny movements of your hand when you're taking a photo. The camera captures a series of photos, each ever so slightly different, then merges them and uses machine learning to create an image that is much more detailed and a lot sharper when you're zoomed in than the standard digital zoom you'd get on other smartphone cameras. I think it's going to be pretty interesting to see how Super Res Zoom stacks up against smartphones that actually have a telephoto zoom lens. And the cool thing about Super Res Zoom is that even if the phone is stabilized on something like a tripod, it will still mimic those natural hand movements in order to capture more detail when you're zoomed in. With the Pixel 3 and Pixel 3 XL, Google is also adding a new feature called Night Sight, although at the time of recording this video, Night Sight isn't available on the Pixel 3 just yet.
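Going back to Super Res Zoom for a second, the core idea can be shown with a toy one-dimensional sketch: several low-resolution samplings of the same scene, each offset by a fraction of a pixel (standing in for hand shake), can be interleaved onto a finer grid. This is a deliberately simplified model of my own; the real pipeline aligns and robustly merges full 2D raw bursts with machine learning rather than interleaving samples like this.

```python
# Toy 1D multi-frame super-resolution sketch (not Google's actual pipeline).
# Hand shake means each burst frame samples the scene at a slightly
# different sub-pixel offset, so together they contain more detail
# than any single frame.

def sample(signal, offset, step):
    """Sample a high-res signal at every `step` positions, from `offset`."""
    return [signal[i] for i in range(offset, len(signal), step)]

def merge_shifted_frames(frames):
    """Interleave frames (each offset by one sub-position) back into a
    signal with len(frames) times the resolution of any single frame."""
    merged = []
    for group in zip(*frames):
        merged.extend(group)
    return merged

high_res = list(range(8))                            # "true" scene detail
frames = [sample(high_res, o, 2) for o in range(2)]  # two half-res frames
recovered = merge_shifted_frames(frames)             # full detail is back
```

The point of the sketch is only that slightly shifted captures are complementary rather than redundant, which is why the feature deliberately mimics hand movement when the phone is on a tripod.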
But what Night Sight is going to allow you to do is take better low-light and nighttime photos. The way it works is that it sacrifices the nine-frame buffer and the zero shutter lag, and instead requires you to hold the phone steady while it captures up to a maximum of 15 frames. It then merges those photos together to create an image with the equivalent of up to a five-second exposure. You're probably wondering: if you have to hold the phone steady, will the image come out blurry? Theoretically, the answer is no, because Google's merging algorithm is really smart and is able to discard anything that has motion blur or anything it doesn't need. So even with some unintended movement, you should still get a photo that is very sharp, very detailed, and, most importantly, very well exposed. The next feature I want to talk about is portrait mode, and portrait mode isn't new to the Pixel.
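Before getting into portrait mode, here's a rough sketch of that Night Sight merging idea: capture up to 15 short exposures, drop any frame flagged as motion-blurred, and average the rest, which approximates one long exposure with far less noise. The blur flag and tiny pixel lists are stand-ins of my own; the real pipeline aligns and merges full raw frames with a far more sophisticated rejection step.

```python
# Simplified Night Sight-style merge (illustrative, not Google's algorithm).
# Frames flagged as blurry are discarded; the rest are averaged per pixel.

def merge_night_frames(frames, max_frames=15):
    """Average the usable frames in a burst, skipping blurry ones."""
    usable = [f["pixels"] for f in frames[:max_frames] if not f["blurry"]]
    if not usable:
        raise ValueError("no usable frames in burst")
    return [sum(px) / len(usable) for px in zip(*usable)]

frames = [
    {"pixels": [10, 20, 30], "blurry": False},
    {"pixels": [12, 18, 32], "blurry": False},
    {"pixels": [90, 90, 90], "blurry": True},   # discarded: motion blur
]
merged = merge_night_frames(frames)
```

Averaging many short exposures gathers roughly the same total light as one long exposure, which is why the result can be equivalent to several seconds of exposure while each individual frame stays short enough to avoid blur.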
We saw it on the Pixel 2 and Pixel 2 XL, but Google has made one very significant change on the Pixel 3 and Pixel 3 XL that makes portrait mode even better. Without getting into a super long-winded explanation: the Pixel 2 used a stereo pair of images, leveraging the split pixels of its dual-pixel sensor to create two images that were ever so slightly different. This was how Google mimicked the effect of having two lenses with just one lens, and it then used those two images to create a depth map to properly separate the foreground from the background. With the Pixel 3, Google is using what's called a learning-based algorithm, and the benefit is that you're going to get much more accurate depth mapping, better separation of the foreground from the background, and also better background refocusing. So let's take a look at this example for a second, with the Pixel 2 on the left and the Pixel 3 on the right. You'll notice right away that the Pixel 3 does a much better job of sensing depth and how far away the background is from the foreground, whereas the Pixel 2 sort of aggressively blurs everything out as if the background were much farther away from the subject, and some objects that are relatively on the same plane as the helmet are also being blurred out when they really shouldn't be. And if you zoom in and look at my desk, you'll notice a very aggressive cutoff between in focus and out of focus on the Pixel 2, whereas on the Pixel 3 the transition from in focus to out of focus is much more gradual.
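The stereo-pair idea behind the Pixel 2's approach can be sketched very simply: nearby objects shift more between the two slightly offset views (larger disparity), so disparity acts as a proxy for depth, and blur strength can then scale with distance from the focal plane. The function and its parameters below are my own illustrative assumptions, not Google's implementation; they just show why a more accurate depth estimate yields a more gradual in-focus to out-of-focus transition.

```python
# Toy depth-from-disparity sketch (illustrative assumption, not Google's code).
# Disparity = how far a point shifts between the two dual-pixel views;
# the subject's disparity defines the in-focus plane, and blur radius
# grows smoothly with distance from that plane.

def disparity_to_blur(disparity, focus_disparity, strength=2.0):
    """Blur radius grows with distance from the focal plane."""
    return strength * abs(disparity - focus_disparity)

subject_disparity = 5.0      # near: shifts a lot between the two views
background_disparity = 1.0   # far: shifts very little

subject_blur = disparity_to_blur(subject_disparity, subject_disparity)
background_blur = disparity_to_blur(background_disparity, subject_disparity)
```

A crude or noisy depth map makes this blur jump abruptly at object boundaries, which is exactly the hard cutoff visible in the Pixel 2 example; a better depth estimate smooths that falloff.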
The last feature I want to mention is computational Raw, and I don't think this is a feature Google actually mentioned during its press event, but the Pixel 3 and Pixel 3 XL can now take raw photos. If you've ever edited photos in a program like Lightroom or Photoshop, you'll know how much more beneficial shooting in RAW can be: you can push the colors, the highlights, and the black levels a lot further without the image falling apart the way a compressed JPEG would. The way it works is by combining up to 15 images and merging them together, which has a few benefits: one, you get better low-light performance; two, you get better dynamic range; and three, you get an image that is much closer to what you would get on a DSLR, even though you're working with a really tiny sensor. So this is a very big deal, especially if you take photography, and smartphone photography in particular, very seriously.
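The reason merging up to 15 raw captures helps is that averaging many noisy readings of the same scene cancels out random sensor noise, which is where the extra low-light headroom and dynamic range come from. Here is a tiny deterministic sketch of that averaging step; the pixel values are synthetic, and the real merge works on full aligned raw frames.

```python
# Why merging raw frames helps (illustrative sketch with synthetic values).
# Each frame reads the same scene with some random noise; averaging the
# burst cancels the noise and recovers a cleaner estimate of each pixel.

def merge_raw_frames(frames):
    """Average corresponding pixel values across a burst of raw frames."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

# Three noisy readings of a two-pixel scene whose true values are 100, 100.
frames = [
    [98, 102],
    [102, 98],
    [100, 100],
]
merged = merge_raw_frames(frames)  # noise cancels out in the average
```

Averaging N frames reduces random noise by roughly the square root of N, so a 15-frame merge behaves much more like a larger sensor than any single tiny-sensor capture does, which is what makes the result closer to DSLR output.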
You're going to be able to edit these photos and make them look however you want, and you're also going to be able to access them from pretty much anywhere, because they're available in your Google Photos app. So that's it! Thank you for watching this video; I hope you all found it helpful and enjoyed it. If you did, give it a thumbs up and subscribe to the channel down below, and of course, keep it tuned here to Android Authority for more videos like this, because we are your source for all things Android.
Source : Android Authority