When I look at histograms for different cameras/sensors in RawDigger, the white clipping level seems to vary considerably and is less than 16383 for a 14-bit encoded signal or 8191 for a 12-bit encoded signal.
For example, the clipping level for the 5D Mk II seems to be about 14736, while it is about 11535 on my 600D and about 3968 on the G12.
This suggests to me that the values are not normalised to the maximum possible value for a 12- or 14-bit encoding (whereas an 8-bit JPEG always has the same maximum value of 255). If that is the case, how does the RAW processing software know what value to use for the white clipping point? Surely a value for this must be included in the raw file?
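For what it's worth, even without a metadata value, the clipping level can be estimated empirically from the histogram itself: saturated pixels all pile up at exactly the sensor's saturation value, producing a spike at the top end. A minimal sketch of that idea, using synthetic data (the 14736 figure is the 5D Mk II value quoted above; the simulated scene and all other numbers are made up for illustration):

```python
import numpy as np

# Hypothetical illustration: a 14-bit sensor whose analog chain
# saturates at 14736, well below the 2**14 - 1 = 16383 ceiling
# of the 14-bit encoding.
SATURATION = 14736

rng = np.random.default_rng(0)

# Simulate a scene with many overexposed pixels: the signal is
# clipped at the sensor's saturation point before digitisation.
signal = rng.uniform(0, 20000, size=100_000)
raw = np.clip(signal, 0, SATURATION).astype(np.uint16)

# All clipped pixels share one exact value, so the clipping level
# shows up as the dominant spike in the raw histogram.
counts = np.bincount(raw)
estimated_white = int(np.argmax(counts))

print(estimated_white)  # -> 14736
```

In practice converters don't have to guess like this: the white level is carried as metadata (for example, the DNG specification defines a `WhiteLevel` tag, and proprietary formats store an equivalent in their maker notes), and libraries such as LibRaw expose it per camera. The histogram spike is just a useful cross-check, and is essentially what RawDigger lets you see.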
I would much appreciate some advice on this. Thank you.