Please excuse the newbie question.
I always thought that the digital definition of "white" in an RGB system was R = G = B.
In the process of creating ICC profiles for my Nikon D810, I use the ColorChecker chart.
Last week (January 18), I took a shot of it in direct, early-afternoon sunlight.
When I opened the NEF in RawDigger and looked at the histogram, I was "shocked" to find that the four channel histograms were not equal.
Here is a link to a screen capture:
I assumed that the whitest object in the image, the Munsell N9.5 white patch, would show up near the extreme right of the R, G, B, and G2 histograms, at pretty much the same position in each.
But that's not the case. I had previously measured this patch as CIELab = 94.47 -0.93 2.35, slightly yellowish (my target is about 20 years old). In fact, I get these RAW values for the white patch:
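As a sanity check on my expectation, that measured Lab value does come out nearly neutral when converted into a color-managed RGB space. Here is a small sketch in plain Python (assuming a D65 white point and the standard sRGB matrices for simplicity, not my target's actual measurement conditions):

```python
import math

def lab_to_xyz(L, a, b, white=(0.95047, 1.0, 1.08883)):
    """CIE L*a*b* -> XYZ; the white point defaults to D65."""
    fy = (L + 16) / 116
    fx = fy + a / 500
    fz = fy - b / 200
    def f_inv(t):
        return t**3 if t > 6/29 else 3 * (6/29)**2 * (t - 4/29)
    return tuple(w * f_inv(f) for w, f in zip(white, (fx, fy, fz)))

def xyz_to_srgb(X, Y, Z):
    """XYZ (D65) -> nonlinear 8-bit sRGB."""
    linear = (
         3.2406 * X - 1.5372 * Y - 0.4986 * Z,
        -0.9689 * X + 1.8758 * Y + 0.0415 * Z,
         0.0557 * X - 0.2040 * Y + 1.0570 * Z,
    )
    def encode(c):
        c = min(max(c, 0.0), 1.0)
        return 12.92 * c if c <= 0.0031308 else 1.055 * c**(1/2.4) - 0.055
    return tuple(round(255 * encode(c)) for c in linear)

# The N9.5 patch as I measured it: L* = 94.47, a* = -0.93, b* = 2.35
print(xyz_to_srgb(*lab_to_xyz(94.47, -0.93, 2.35)))
# approximately (239, 239, 235): R = G, with B just slightly lower,
# i.e. the faint yellow cast I measured
```

So in sRGB the patch is almost exactly R = G = B, which is why I expected the channel histograms to line up.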
I'm surprised these don't correspond to the *Max* values of each channel, as reported on the Histogram as:
I am curious where these Max values come from.
Here is a screen capture of the RawDigger main interface, showing my ColorChecker:
As you can see, there does not seem to be any area with a higher reflectance than the N9.5 patch.
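My (possibly wrong) understanding is that a raw file holds the sensor's response before any white balance is applied, so a neutral patch would only give equal channel values after per-channel multipliers are applied. A toy sketch of that idea, with made-up numbers (these are NOT my actual D810 readings):

```python
# Hypothetical raw counts for a neutral patch straight off the Bayer
# sensor, on a 14-bit scale; values invented purely for illustration.
raw = {"R": 9200, "G": 15800, "B": 11300, "G2": 15750}

# "As-shot" white balance: scale each channel so the neutral patch
# comes out equal, using the mean of the two green channels as reference.
green = (raw["G"] + raw["G2"]) / 2
multipliers = {ch: green / val for ch, val in raw.items()}

balanced = {ch: raw[ch] * multipliers[ch] for ch in raw}
print(multipliers)  # here the R multiplier is ~1.71 and B is ~1.40
print(balanced)     # every channel now equals the green mean, 15775.0
```

If that picture is right, unequal raw histograms for a white patch would be normal, but it still leaves me wondering about those per-channel Max values.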