Apple ProRAW Review | iPhone 12 Pro Camera Test!

By James Lavish
Aug 13, 2021

So Apple ProRAW is now released, and guys, we have to talk. First, to all you subscribers, welcome back. We've been having some great conversations in the comments below, and I enjoy hearing from you. And if you're new here, we're all about making photography simple, so hit that subscribe button. You won't be sorry.

So, Apple ProRAW. What is it, and how is it different from a regular RAW image? Well, first, let's talk about what a RAW image is. It's an image ready to be molded into your interpretation of what that image should be, what the scene was, or what you want the scene to look like once you process it. It's basically a binary form of an image, all zeros and ones, with as much data as possible in there for you to mold and adjust to your liking. The difference between a RAW image and a JPEG is that the JPEG is your camera's interpretation of that image. A Sony JPEG will look different from a Fuji JPEG, which will look different from a Canon JPEG, which will look different from a Nikon JPEG. They're all interpretations of that RAW data put into JPEG form. So, as you can imagine, a RAW image gives you much more flexibility to process the image just the way you want it, without the interpretation coming from your camera maker or your software. That's why a RAW image looks so flat and almost lifeless when you first load it into, say, Lightroom or Capture One. It's ready to be tweaked and adjusted to have the contrast and the colors just how you want them, and this is why most professional photographers use RAW images: they can manipulate them to look exactly the way they want.

Okay, then! So what is Apple ProRAW, and how is that different from regular RAW? Well, Apple ProRAW gives you not just the raw data, but the raw data after Apple has already applied its Deep Fusion and Smart HDR 3 technology to create that image.

So it's an image with much more dynamic range than a regular RAW image coming out of a phone; you're getting Apple's pipeline data embedded in that RAW image before you even start. This gives you much more information to work with when you go to process that image. And as we've seen with Apple's technology in the iPhone 12, 12 Pro, and Pro Max, the computational photography element that Apple is overlaying on these images is super powerful, because the phone is taking multiple exposures all in a split second and then combining them to give you that Smart HDR 3 or Deep Fusion version of a computationally photographed image. It's basically the equivalent of a bracketed image, where you take three, five, maybe seven exposures and combine them to get as much of the shadow detail and as much of the highlight detail as you can into a single image, and it all happens in an instant inside the iPhone. That said, sensors on full-frame cameras are much larger than the sensors in these little phones, and they inherently have much greater dynamic range to work with. So the question is: can Apple's Deep Fusion and Smart HDR 3 technology, baked into that ProRAW image, compete with, say, a full-frame Sony a7 camera? That's what we're setting out to find out today.
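Apple's merging pipeline is proprietary, but the bracket-and-merge idea I just described is easy to sketch. Here's a minimal exposure-fusion example in Python using OpenCV's Mertens fusion, one common way to merge a bracket. It's illustrative only, not Apple's actual algorithm, and the filenames are placeholders.

```python
# Minimal exposure-fusion sketch (not Apple's actual pipeline).
# Merges a bracketed set of exposures into one image that keeps
# detail in both shadows and highlights.
import cv2
import numpy as np

# Hypothetical filenames: an under-, normally, and over-exposed bracket.
paths = ["bracket_-2ev.jpg", "bracket_0ev.jpg", "bracket_+2ev.jpg"]
images = [cv2.imread(p) for p in paths]

# Mertens fusion weights each pixel by contrast, saturation, and
# well-exposedness, so it needs no exposure-time metadata at all.
merge = cv2.createMergeMertens()
fused = merge.process(images)  # float output, roughly in [0, 1]

# Scale back to 8-bit and save the merged result.
cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))
```

Mertens fusion is handy for a sketch like this precisely because it doesn't need exposure times, unlike a true HDR merge followed by tone mapping.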

But before we get into the photo comparisons, I want to share a few things I've figured out on the iPhone 12 that may help you set up your photos before we start. First, now that ProRAW has been released: to turn it on, just go into the phone's Settings, then Camera, then Formats, and turn on Apple ProRAW. When you're in the Camera app, you'll then get a button at the top of the screen to decide if you want to take the shot in RAW. A word of caution, though: these RAW files are huge, about 10 times the size of the standard file that comes out of the iPhone.

So I personally keep this turned off until I'm in a situation where there's huge dynamic range in the scene, or it's a photo that I know I'll definitely want to process heavily later; otherwise, I'm usually just taking JPEGs with my iPhone. Also, there's a little trick I learned about HDR control and how to turn it off for certain shots. Just go into the phone's Settings and, under Camera, turn off HDR. I know this sounds a little counterintuitive, but when you go back into the Camera app itself, you get a switch here at the bottom that lets you decide whether to turn it off for a given shot. And if you saw my last video, you know there are certain situations where you definitely want to turn off the HDR on the iPhone.

So you can capture a photo a certain way. If you saw my last video, you also know that the iPhone can struggle in burst mode to grab consistently sharp images. But there's a setting that lets you choose quality over quantity in burst mode, and that will help you get sharper images: just go into Settings, back into Camera, find Prioritize Faster Shooting, and turn it off. This makes the quality of shots the priority over the quantity, and you'll get better images. Okay, now that we're done with the settings, let's go look at some photos, starting with this sunrise photo.

You'll notice a few things right off the bat when you load an Apple ProRAW image into Lightroom. Number one: the image looks a little flat and does not look like it does on your phone. It doesn't have that same vibrant color, and it doesn't have the same dynamic-range, HDR look that you get when you see the image on your phone; that phone preview is more like a JPEG.

Secondly, you'll notice that Apple ProRAW files come in with preset levels for certain adjustments, like whites, blacks, and clarity. I don't know why Apple does this; I haven't seen it with files from other cameras or phones, but it's consistent with every single Apple ProRAW file I've loaded into Lightroom. As for this image, it looks like I underexposed it a little and could have touched a different area of the screen on the phone to get a better exposure. If I bump this up a little, you'll see I probably could have gotten almost a full extra stop of light out of this. In any case, let's see what happens when we move these adjustments and find out just how much dynamic range is embedded in this RAW file. First, we'll level these back down to zero, and then we'll see what kind of information is embedded in this file when we move the shadows and highlights; that'll give us a pretty good idea of how much dynamic range this file has captured for us. So if we move these shadows up all the way to 100, you can see there's a ton of information here in these shadows that you couldn't see before. And if we move these highlights down, that'll take care of some of the noise that's in these shadows and clean things up a bit, and it also brings a little contrast into the image when you move those sliders like that.

Just moving these blacks down and maybe bumping up the exposure a little, you can see there's a ton of information in this file. Even though it was underexposed, the iPhone did a pretty remarkable job of capturing all of those details in the dark areas. One of the things I wanted to see was just how much flexibility you get from the Apple ProRAW file as opposed to the JPEG-type file that comes straight out of the phone. So, taking this image of the tree: this room was pretty dark, it was morning, and of course the iPhone did its thing with its computational photography.

But again, you see that the whites are bumped up, the blacks are bumped down, and the clarity is bumped up. If we level those out and adjust this without the pre-adjustments, you can see just how much information is in the shadows, which is pretty incredible considering this is just a phone and that this is all done with computational photography. I'm guessing a number of images were taken in an instant and all stacked on top of each other to get this file, kind of like an instant bracketed photo of three, five, or seven different exposures. Now let's compare it to the JPEG I took at the same time, which is actually a HEIC file. You'll notice that on the iPhone, you don't get JPEGs; you get HEIC files, a newer, more robust format than JPEG, which is why Apple uses it. But if we look at this file and its details, take the settings from the RAW file that we adjusted, copy them over, and paste them onto the HEIC file, you'll see that the quality and the colors and the details all fall off slightly from the RAW file. The RAW file just has a better level of detail and color than the HEIC file.
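A quick aside on HEIC: plenty of tools still won't open it. If you ever need plain JPEGs out of your iPhone shots, here's a minimal conversion sketch in Python using Pillow with the pillow-heif plugin; the filename is just a placeholder.

```python
# Minimal HEIC-to-JPEG conversion sketch using Pillow with the
# pillow-heif plugin (pip install pillow pillow-heif).
from PIL import Image
from pillow_heif import register_heif_opener

register_heif_opener()  # teaches Pillow to open .heic files

img = Image.open("IMG_0001.HEIC")     # hypothetical filename
img.save("IMG_0001.jpg", quality=95)  # re-encode as JPEG
```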

The HEIC file kind of falls apart here a little when pushed so hard, with those sliders moved up so much, and if we move them up even more, you can see it begins to break down further. Then, flipping the lights on: here's the RAW file, with the automatic adjustments in there. So if we zero those out, see what happened in the shadows, bump down these highlights to retain those highlight values, and bring down these whites just a little, you can see that there's a ton of detail in here, even though there's a pretty high amount of contrast, with these lights shining right against super dark areas: very, very bright small lights in dark, dark areas on the tree. The RAW file has a ton of information in here, as you can see. And remember, this is where it came from.

If we copy those same adjustments over to the HEIC image, that's what you get. It looks nothing like the RAW file; there's just not as much information here. As you can see, we would have to bump up the exposure, and you can tell it just doesn't look the same. If we compare them side by side, you can see how the HEIC just cannot keep up with this RAW file. The RAW file is just so much cleaner.
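If you want to experiment with these ProRAW DNGs outside of Lightroom, here's a rough sketch of pushing exposure while protecting highlights using the rawpy library (a LibRaw wrapper) in Python. It's only a crude stand-in for Lightroom's shadow and highlight sliders, not an equivalent, and the filename is a placeholder.

```python
# Rough sketch of opening a ProRAW DNG and pushing exposure with
# rawpy (pip install rawpy imageio). A crude stand-in for
# Lightroom's sliders, not an equivalent.
import rawpy
import imageio

with rawpy.imread("IMG_0042.DNG") as raw:  # hypothetical filename
    rgb = raw.postprocess(
        use_camera_wb=True,           # keep the as-shot white balance
        exp_shift=2.0,                # linear factor of 2 = push ~1 stop
        exp_preserve_highlights=0.9,  # protect highlights while pushing
        no_auto_bright=True,          # don't let LibRaw auto-expose
    )
imageio.imwrite("IMG_0042_pushed.jpg", rgb)
```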

Okay, so let's see how the iPhone stacks up against Sony's full-frame a7R III camera. Taking a walk the other morning, I took this image literally looking directly into the sun. You can see that telltale green flare you get with the iPhone when you're shooting right into the sun, which is a problem I should point out, but I've only seen it happen when I'm taking a photo literally directly into the sun. So again, if we bump these levels back to zero and check what's going on in the shadows and how much detail we're able to bring out, you can see that we're able to take that sun, which was completely blown out, and recover that detail by pulling down the highlights. It's amazing that you can do that with this RAW file.

The details look sharp, and there are plenty of them. You can see all the way back in here, through this plant and into those dark areas where there was no detail before, and now you've got all the detail. In comparison, with the Sony a7R III, if we pull down the highlights and push up the shadows, the detail is definitely better. You have more detail up here in this area, the Sony didn't get quite as blown out through here, and you can really see the detail on these flowers. Even if I bump up the exposure a little, the full-frame sensor was able to capture that, absolutely, a little better than the iPhone. Now, looking at them side by side:

Here's the iPhone, and here's the Sony. The details are good up here; there's nothing wrong with them. And honestly, with the sun right here, it may have been slightly harder for the iPhone to pick up these details, since the flare was cutting right across the plant, whereas with the Sony the sun is up above, so it wasn't flaring through here. But still, the iPhone did an awesome job considering just how high-contrast this scene was. This is a great job by this phone. Okay, another super hard test here: the sun was back behind this tunnel. The tunnel was pretty dark, this was completely blown out, and this was very dark, but the iPhone was able to capture all of this detail.

So if we zero these out, bring back the highlights so they're not blown out, and push up those shadows, we get a ton of detail here in the tunnel. You can see the leaves on the ground, you can see this little wrapper here, and all the lights up above. That's pretty awesome; I mean, it was so dark in there.

That's an amazing amount of detail coming out of this phone. Now the Sony, doing the same thing, exposing for those highlights (and again, these were exposed in camera so as not to be blown out): bringing back these highlights so they're not overexposed and bumping up these shadows to see what's in that tunnel, it actually looks a little darker than the iPhone does. So, with the iPhone here on the left, there's just a ton of detail, while even after bumping up the exposure on the Sony, there's more contrast but not quite as much detail in these dark areas. You can also see this kind of purple or reddish fringing in the shadow areas as you try to bump up the exposure, which is not present in the Apple photo, which is absolutely incredible. The reason is that Apple has taken a number of photos at different exposures to capture this level of detail, while the Sony has taken one exposure with a much larger sensor to get to this photo. Then, going inside the tunnel with the iPhone, we zero this out again, bump up the shadows, bring down the highlights, and push up the exposure.

Look at all of the detail it captured; that's absolutely unbelievable. Look at this! It's got all of this pollen in here, you can see the cracks in the walls, and you can see the details of the grate. There's no weird coloring going on, maybe just a little bit of a reddish tint here. But watch what happens when we do the same thing with the Sony full-frame camera. If we pull down the highlights, bump up the shadows, and then push up the exposure, look at the banding of colors that's created when you try to get to the same exposure we got with the iPhone. Side by side, you can see the iPhone has none of that issue when you bump up its exposure, and again, it's able to achieve that with that tiny sensor because it's taking a number of photos at different exposures and stacking them together, like an instant bracketed photo. The Sony, as great as the a7R III is, just could not handle this darkness and couldn't bring up that level of detail.

In another high-contrast shot with the iPhone, we zero these out, pull down these highlights to keep this from being blown out, and push up these shadows. Look at the detail in these trees. Looking at the Sony a7R III, pulling down the highlights, pushing up the shadows, and pushing up the exposure, I don't see much of a difference versus the iPhone except for the colors. The detail is rich in both of these photos. And in a similar photo, we have the iPhone: we zero these out, pull down the highlights a little, push up these shadows, and see what we can get.

In fact, we don't even need to pull down the highlights much. We can just push up the exposure, and look at that detail. There's a little noise in here, as this was very, very dark, directly behind the sun, and there was no light hitting this branch, but the iPhone was able to capture all those details in this ProRAW file. Looking at the Sony file, pulling up the shadows and pushing up the exposure, in comparison I'd have to say that's pretty remarkable coming out of a phone, that it was able to capture this much detail in those dark areas, and a pretty darn good job compared to the Sony full-frame camera. Here's another super-high-contrast photo; I mean, I was pushing this phone as far as I could in these situations.

If we zero this out, bump up those shadows, and pull up the exposure, look at all of that detail in here, even if we pull down the highlights a little to keep them from being blown out. Look at this detail, that's incredible! It started here, and this little phone was able to capture all of that. Compared to the Sony full frame, if we push up the shadows and push up the exposure, there is better detail with the full frame, no doubt; it's definitely cleaner and sharper down here, but the Apple ProRAW image is not too bad at all. And then finally, just a little color test here, to see how much we can push and pull the colors in these files.

We've got this plant of red berries, and if we wanted to, we could change the hue of these pretty easily without anything really breaking. Any color here can be changed pretty readily without any strange banding or breaking of the colors in the image, which is actually pretty amazing; it's so clean, and there's just a ton of information in here to play with. I have to say that I am super, super impressed with the ProRAW files and what you can do with them. It's absolutely incredible how much information Apple is able to embed in these files for you to adjust and play with.

Okay, so where does that leave us? Is the iPhone going to overtake the large cameras anytime soon? As fast as companies like Apple are making advances in computational photography, I have to think the large camera makers have been put on notice. That said, right now these phones just cannot compete with the pure physics of a larger camera: the larger sensors, the larger lenses, the zoom lenses, the glass, the elements. There's just too much physics to overcome yet with computational photography.
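One aside on that berry color test before we move on: a global hue shift like that is easy to prototype outside Lightroom, too. Here's a minimal sketch in Python with OpenCV, illustrative only; it's not how Lightroom's HSL panel works, and the filenames are placeholders.

```python
# Minimal global hue-shift sketch with OpenCV (illustrative only;
# Lightroom's color controls are masked and far more sophisticated).
import cv2
import numpy as np

img = cv2.imread("berries.jpg")  # hypothetical filename
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

h, s, v = cv2.split(hsv)
# OpenCV stores 8-bit hue as 0-179, so wrap the shift around 180.
h = ((h.astype(np.int16) + 30) % 180).astype(np.uint8)

shifted = cv2.cvtColor(cv2.merge([h, s, v]), cv2.COLOR_HSV2BGR)
cv2.imwrite("berries_shifted.jpg", shifted)
```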

But back to the bigger picture. That said, in my opinion, if the camera makers don't start making advances on the software side of photography, they are going to lose market share to these phones even faster than they are now. I mean, there's no reason why large cameras can't take bracketed photos and put them together right in camera for you, like the iPhone does, or at least give you the option to do that. It would save you so much time processing those photos manually in Lightroom, for instance. And the HDR and Deep Fusion technology that Apple is using gives phenomenal low-light performance in these little phones, these little cameras. I mean, who else out there has had this situation: you take a great photo with your phone, you turn around and take the same photo with your large camera setup, and it's not even as good as what you just snapped so quickly with the phone. Plus, with LiDAR and time-of-flight sensors, these phones get depth information, so they can map out what should be in focus and what should be out of focus, and we're seeing just how good they're getting at determining what should be in focus for portraits, for instance, even in super-low-light settings, something that, with large camera setups, often requires you to really push up your ISO to amplify the sensor and get a properly exposed image. These phones are also now using HDR video technology, which will make it so much easier to process video without having to shoot a log profile, a really flat profile that takes a lot of time to process and color correct yourself on the computer.

Now, there's probably going to be some heated conversation, maybe some colorful comments below, but before you have a knee-jerk reaction, let me tell you a little story first. My first job out of college was on the floor of the New York Stock Exchange, and this was in the early-to-mid '90s, when computers were just starting to take off and not everybody was on the internet yet. I had a conversation with an older trader upstairs about how the New York Stock Exchange might change, and he insisted that it would never change significantly, because you just needed to have traders physically on the floor, yelling back and forth to each other, in order to get the proper trading price.

Now, I insisted that eventually the floors and the traders upstairs would all be connected, and we wouldn't even need the stock exchange, and he vehemently argued that it would never happen, that you would always need the floor of the New York Stock Exchange. It was tradition, and you physically had to be in front of one another to make a trade happen; you couldn't do it over the phone, and you certainly couldn't do it virtually on something called an internet. Now flash forward a couple of decades, and it turns out we were both kind of right. The New York Stock Exchange is still there, but if you go to the floor now, there are very few people on it. It's almost all done virtually. You have a few specialists standing around the posts, waiting for trades to come in and matching them up with other trades that come in from somewhere else. Instead of a large crowd of people, you just have a bunch of internet connections, and that's how trades get done on the New York Stock Exchange.

Now, unfortunately, a lot of my friends ended up unable to work down there anymore, because the jobs left. The seats weren't worth anything anymore; it was no longer worth having a badge to trade on the floor of the stock exchange. They were evolved out of that position and replaced with wires running back and forth from desk to desk across the world. I see this as not too different. Yes, we have a physical problem to overcome with the lenses and the glass and the elements and the size of the sensor, but these are all things that I think can eventually be overcome with computational photography, with the artificial intelligence of these phones and their ability to recreate exactly what the larger cameras can.

I mean, the chips are getting smaller, they're getting faster, they're getting more powerful, and everything's getting cheaper. So even if you don't have perfect glass, or lenses that can match the size of a big lens, you can still have enough computational photography to overcome that someday. The question is: will it be in five years, ten years, twenty years? There's still a long way to go, but think of all the progress these phone makers have made in just the last five to seven years. It is absolutely incredible what they can do now, and if I were a large camera maker, I would absolutely be doing everything I could, spending money on research and development, to figure out how to keep up with this type of technology, because it is powerful and it is only getting more powerful. I can only say that I am simply amazed at what these phones can do now. But you tell me what you think: do you think they'll never replace the big cameras? Do you think they'll merge? Do you think the big camera makers will eventually use computational photography? Are they going to add some sort of software overlay or option for that kind of technology in your own camera? What do you think? Leave a comment below; I'd like to have a healthy debate about this and see what you guys think. Okay, I hope you found this video helpful, or at least a little interesting, and if you did, please hit that like button, as that definitely helps.

And if you haven't already, don't forget to hit the subscribe button and that little notification bell to stay in the loop around here. I'll see you in the next video.

