Articles

One Way to Get Spot-On Exposure for Your Shots

The shot taken with spotmeter exposure and +3 EV in-camera correction, opened and adjusted in FastRawViewer

Every day one can see threads on photographic forums where members discuss the various modes of automatic exposure, trying to find the perfect one. As a rule, these discussions come down to the same question: what compensation should one apply to the automatic metering to get consistently good exposure? It turns out that no autoexposure mode universally guarantees good out-of-the-box results.

We plan to demonstrate that one way of getting good exposure is to meter with the in-camera spotmeter on the lightest part of the scene that needs to maintain full detail (white clouds, snow, etc.), and then to apply the appropriate compensation to the exposure the spotmeter recommends.
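
As a back-of-the-envelope illustration of that arithmetic, here is a minimal Python sketch. The raw headroom and safety margin below are assumed placeholder numbers that vary from camera to camera; they are not measurements, and the result only happens to land near the +3 EV used for the shot above.

```python
# Assumed, camera-specific numbers -- determine them for your own body,
# e.g. by examining raw histograms; they are placeholders, not measurements.
RAW_HEADROOM_EV = 3.3   # how far raw clipping sits above the spotmeter's "middle gray" target
SAFETY_EV = 0.3         # margin kept below clipping for the brightest textured area

# The spotmeter places whatever it reads at middle gray, so metering the
# lightest detailed area (white clouds, snow) and shooting as metered leaves
# RAW_HEADROOM_EV of unused headroom. The compensation that moves that area
# up to just below raw clipping is simply:
compensation = RAW_HEADROOM_EV - SAFETY_EV
print(f"dial in about +{compensation:.1f} EV")   # ~ +3.0 EV with these assumptions
```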

Can You Evaluate Exposure Using the In-camera Histogram?

Shot 2649 with and without WB

They say that "a histogram is a graphical representation of the pixels exposed in your image" or "when judging exposure, the primary areas of the histogram to be concerned with are the right and left edges".

We are going to demonstrate the following:

  • In-camera histograms don't really allow one to analyze the shadow and highlight zones of an image.
  • An in-camera histogram changes significantly with changes in the camera settings such as contrast, picture style, brightness, etc.

So, no. By no means can the in-camera histogram be used by a RAW shooter to evaluate exposure.
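
To illustrate why, here is a minimal NumPy sketch with made-up numbers (not measurements from any camera): the same linear "raw-like" values are pushed through a crude JPEG-style pipeline with a gamma curve and a contrast boost, and the JPEG-based histogram reports clipping even though nothing is clipped in the raw data.

```python
import numpy as np

# Made-up linear raw values; clipping is at 1.0 and nothing reaches it.
rng = np.random.default_rng(0)
raw_green = rng.uniform(0.0, 0.9, 100_000)

def jpeg_pipeline(x, contrast=1.2):
    """A crude stand-in for the in-camera JPEG rendering (gamma + contrast)."""
    x = np.clip(x, 0, 1)
    x = x ** (1 / 2.2)                                  # gamma encoding
    return np.clip((x - 0.5) * contrast + 0.5, 0, 1)    # simple contrast curve

jpg = jpeg_pipeline(raw_green)
print("fraction of raw values at clipping: ", np.mean(raw_green >= 1.0))  # 0.0
print("fraction of JPEG values at maximum: ", np.mean(jpg >= 1.0))        # > 0: "blown" highlights that aren't blown in raw
```

Change the contrast parameter (or, on a real camera, the white balance, brightness, or picture style) and the right edge of that JPEG-based histogram moves, while the raw data stays exactly the same.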

Color is a Slippery Trickster

Original ARW from Sony a6500: embedded JPEG vs. render using correct camera profile

“How do you know, when you think blue — when you say blue — that you are talking about the same blue as anyone else?”

Christopher Moore, Sacre Bleu: A Comedy d'Art

The goals of this article are twofold: the first is to demonstrate that out-of-camera JPEGs, including in-camera previews, can’t be used blindly, without checking, to evaluate color (and, as we already know, the in-camera histogram is misleading, too). The second is to show that even the converter recommended by the camera manufacturer is not necessarily tuned to match the out-of-camera JPEG.

The Importance of Establishing a Correct Reference Point

ACR. Original RAW (ORF) File

Some Internet discussions claim that it is easier to push shadows up on one camera model than on another. It turns out that such an impression may result from a certain trick. First we will see the trick in action, and then we will expose it.

If we compare two cameras (or different settings for the same camera; or even the same shot from the same camera but processed in different ways), and we’re doing this by looking at two files in a RAW converter, we need to:

  • either be sure that those two files were processed identically (and no, that doesn’t mean pushing the same buttons or moving the same sliders in a converter / converters);
  • or, if they WERE processed in different ways, understand exactly what the difference is and how to get to the lowest common denominator, if that’s the goal (a minimal normalization sketch follows below).
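
For instance, the most basic common denominator for the raw data itself is to subtract each camera's black level and normalize by the range up to its clipping point before comparing anything. The sketch below uses assumed, hypothetical levels and is not the article's procedure:

```python
import numpy as np

def normalize_raw(raw_values, black_level, white_level):
    """Map raw digital numbers to a 0..1 scale where 0 = black level and 1 = clipping."""
    data = raw_values.astype(np.float64)
    return np.clip((data - black_level) / (white_level - black_level), 0.0, 1.0)

# Hypothetical levels for two different cameras; real values come from the
# files' metadata or from a raw-analysis tool.
cam_a = normalize_raw(np.array([2048, 4096, 15000]), black_level=2048, white_level=15779)
cam_b = normalize_raw(np.array([ 512, 2560, 13000]), black_level=512,  white_level=16383)
print(cam_a)
print(cam_b)   # only now are the shadow levels of the two files directly comparable
```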

Color Differences Between Cameras

RawDigger. Placing a Grid

When a new camera reaches the market, one of the most discussed subjects is whether the color it records is the same as, better than, or worse than that of a previous model. It often happens that the color is evaluated based on the rendering provided by some RAW converter. That is, an unknown parameter comes into play: the color profiles or transforms that the RAW converter uses for these models. Another problem with such comparisons is that they are often based on shots taken with different lenses, under different light, and with effectively different exposures in RAW (even when the exposure settings themselves are the same).

Let's have a look at how cameras compare in RAW when the set-up is kept very nearly the same and the exposure in RAW is equalized.
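
As one hedged example of what "equalized exposure in RAW" can mean in practice (the patch readings and black levels below are hypothetical), the raw level of the same gray patch can be compared between the two shots after black-level subtraction; the log2 ratio of those levels is the EV offset that still has to be dialed out:

```python
import numpy as np

def patch_ev_offset(patch_a, patch_b, black_a, black_b):
    """EV difference between the raw exposures of the same gray patch in two shots."""
    mean_a = np.mean(patch_a.astype(np.float64) - black_a)
    mean_b = np.mean(patch_b.astype(np.float64) - black_b)
    return np.log2(mean_a / mean_b)

# Hypothetical patch readings; in practice the per-patch averages can be read
# from a raw-analysis tool (e.g. RawDigger's selection or grid statistics).
patch_cam_a = np.array([[3010, 2985], [3022, 2990]])
patch_cam_b = np.array([[1480, 1495], [1510, 1502]])
print(f"camera A is exposed {patch_ev_offset(patch_cam_a, patch_cam_b, 512, 512):+.2f} EV higher in RAW")
```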

Canon 5D Mark IV Dual Pixel Mode. Oh Yes, Highlights Are There!

RawDigger. Canon 5D Mark IV Dual Pixel mode. Main Frame

We've already received a lot of feedback in which the preservation of highlights in the auxiliary subframe is attributed to parallax and to the razor-thin shape of the highlights in the still-life shot, rather than to what it really is: an approximately 1-stop difference in clipping between the main and auxiliary subframes.

Given the mechanism behind the formation and recording of dual-pixel raw data, the effect has no relation to the size or shape of the highlight area.

To give an example, please consider this photo by Calle Rosenqvist / Kamera & Bild, a dual-pixel raw taken at ISO 400 (you can download it from page 3 of the article; it is the street-scene shot _91A0045.CR2). This is definitely not a case of razor-thin highlights; it is a rather extensive blown-out area that, as we will see, can be recovered using the data from the auxiliary subframe.

Canon Dual Pixel Technology: Gaining Additional Stop in Highlights

RawDigger. Canon 5D Mark IV Shot. Main Frame

Let's take a close look at a dual-pixel raw file from a Canon 5D Mark IV using RawDigger 1.2.13.

The dual-pixel raw contains two raw data sets; we will be calling them the main subframe and the auxiliary subframe.

We'll show that the difference between the main and auxiliary subframes is nearly 2x, or 1 stop; that the auxiliary subframe can be used for highlight recovery (an additional 1 stop of highlights is preserved in the auxiliary subframe while the main subframe is clipped), effectively providing one more stop of headroom in the highlights; and that the dual-pixel raw file for this camera contains 15 bits of raw data if you consider the main and auxiliary subframes together.
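
To make the highlight-recovery idea concrete, here is a minimal sketch that assumes, per the observation above, that the auxiliary subframe sits roughly one stop (about 2x) below the main one. The variable names, sample values, and the fixed 2.0 scale factor are illustrative; a real implementation would also account for black levels and per-channel behavior.

```python
import numpy as np

def recover_highlights(main, aux, white_level, scale=2.0):
    """Where the main subframe is clipped, substitute the scaled auxiliary value."""
    main = main.astype(np.float64)
    aux = aux.astype(np.float64)
    recovered = main.copy()
    clipped = main >= white_level
    recovered[clipped] = aux[clipped] * scale   # the auxiliary subframe clips ~1 stop later
    return recovered

# Hypothetical samples: the last two are clipped in the main subframe,
# while the auxiliary subframe still holds unclipped data for them.
main_frame = np.array([ 8000, 12000, 16383, 16383])
aux_frame  = np.array([ 4010,  6020,  8150, 12300])
print(recover_highlights(main_frame, aux_frame, white_level=16383))
# [ 8000. 12000. 16300. 24600.] -- about one extra stop of highlight information
```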

The Three Most Obvious Reasons to Look at RAW and Not Cull Based On Previews

"...Really, why do you even want to look at RAW files? The whole point of RAW is to be processed according to your taste into a JPEG. I never look at RAW files; I never need to. They are loaded into LR, processed, and I look at the processed images.”

FastRawViewer. Pink azaleas. JPEG preview vs. RAW

So here's a question: is it really necessary to look at RAW when you're selecting RAW files, whether to convert or to present? Isn't a preview enough? You might not know exactly what settings were applied to it, but so what? What's so untrustworthy about embedded and rendered JPEGs and previews? And what's wrong with the preview and histogram on the back of the camera?

All these questions and more will be answered, and on top of that we intend to show how large a gulf there is between real RAW data and previews of it.

Very often, images that are technically fine are tossed out, while technically inferior ones are kept. Why? Because people aren't shown the truth about RAW. Here, we intend to show why people need to see and analyze actual RAW data before choosing which images to discard and which to keep and edit.

Dealing with Damaged RAW Files

FastRawViewer. Damaged shot. RAW. Zoom to 24%

"Last weekend I took a 360 deg panorama and on processing the files discovered two frames had the partial magenta coloring.

There does seem to be a problem with the green channel but with the tools I have I can't get my head around it. Looks like I will have to contact Canon."

This article may seem as if it's about a curious incident, a mere musing. However, it has very practical ramifications and uses for a photographer. Raw data damage can be symptomatic of an underlying problem, and a glance at the raw data can give you the facts you need to inform the manufacturer that your camera body has a problem that needs fixing.

Dynamic Range: Your Fair Share of Flare and Glare

Lens Flare

Fairly often, the dynamic range of a camera is calculated in a perfunctory manner, based only on measurements derived from the sensor and electronics, ignoring the limits that glare and flare impose on dynamic range.

Glare and flare, two effects very well known in photography, occur due to the reflections and scattering of the light in the optical system comprised of the lens, any filters or adapters on the lens, the camera chamber (which, incidentally, includes the autofocus/autoexposure module located at the bottom of the camera chamber in many dSLRs), and the sensor sandwich itself.

It is very important to realize that glare effects, being so fluid, are hard to recognize and compensate for automatically in RAW conversion and post-processing. Let's make a rough estimate of the effect of flare and glare using a simple method.
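
One plausible way to make such a rough estimate, sketched below with hypothetical numbers (this is an illustration of the general idea, not necessarily the article's exact procedure), is to compare the raw level of a deep black patch in the scene against a dark-frame reading taken with the lens capped: whatever the black patch gained over the dark frame came from flare and glare, and that floor caps the usable dynamic range.

```python
import numpy as np

# Hypothetical mean raw levels (digital numbers), not measurements:
white_patch = 12000.0   # a patch just below clipping in the scene
black_patch = 80.0      # a deep black trap in the same scene
dark_frame  = 20.0      # lens-cap reading: the sensor/electronics black offset

flare_floor = black_patch - dark_frame               # signal contributed by flare/glare
usable_dr_ev = np.log2((white_patch - dark_frame) / flare_floor)
print(f"flare-limited dynamic range: ~{usable_dr_ev:.1f} EV")   # ~7.6 EV with these numbers
```

With numbers like these, it is the flare floor, not the sensor's read noise, that sets the practical limit.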
