As usual, you guys are fantastic with all your experience. Thanks for sharing.
I'm shooting with a Sony a7III in what I think is the most basic way: manual mode with auto white balance, probably auto ISO, and no Creative Styles applied as far as I know. I just realized that the ISO gain is applied between the sensor and the digitizer, so the JPEG has that gain buried within it, and that can invalidate the histogram. As I recall, though, the picture that got me in trouble looked OK when I opened it in Capture One with the normal transfer function. It was only when I applied a linear curve in place of the default "film standard" curve (i.e., an S-curve) that I saw the huge gap at the right of the histogram.
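If it helps to see the effect in numbers, here's a toy sketch (Python, with a made-up S-curve, nothing like the camera's or Capture One's actual processing): raw data that leaves a full stop of headroom can look "correctly exposed" once a film-style curve stretches the highlights up toward white, while the linear view shows the gap.

```python
import numpy as np

# Toy illustration only (hypothetical curve, made-up numbers):
# simulate raw highlights peaking a full stop below clipping (0.5 of full scale).
raw = np.random.rand(1_000_000) * 0.5

linear = raw                                   # linear curve: data shown as-is
s_curve = np.clip(1.6 * raw ** 0.6, 0, 1)      # crude stand-in for a "film standard" S-curve

print("linear  max:", linear.max())   # ~0.5 -> big empty gap at the right of the histogram
print("s-curve max:", s_curve.max())  # ~1.0 -> histogram looks full, exposure "looks OK"
```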
I'm confused. I understand the goal of ETTR perfectly but am totally unsure how to actually implement it. I used to use blinkies/zebras to accomplish this. But on a recent picture where the over-exposure warnings were clearly visible, opening the RAW file with a linear transform showed I was actually a full stop underexposed.
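For what it's worth, the headroom can be estimated directly from the raw values; here's a rough check with assumed numbers (the actual white point varies by camera and mode, so treat these as placeholders):

```python
import numpy as np

# Hypothetical values: brightest useful raw count in the shot vs. the raw clipping level.
white_point = 15800.0   # assumed 14-bit clipping level; camera/mode dependent
brightest   = 7900.0    # brightest raw value actually recorded in the frame

# Unused headroom in stops = log2(white_point / brightest)
print(np.log2(white_point / brightest))   # ~1.0 stop of headroom left
```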
So now, my question: is the in-camera histogram and/or exposure warning derived from the in-camera JPEG, from which (I believe) the live display is also made? If so, it suffers from the same problem I observed, and I'd like to know what simulated film transfer function is used to produce the in-camera display.
Love the shot. Is that Powell's point in the background?