Ugly Hedgehog - Photography Forum
Main Photography Discussion
Sensors and dynamic range
Jan 16, 2021 14:57:39   #
miteehigh Loc: Arizona
 
Several camera manufacturers are now stating the dynamic range in their specs. The Canon R5 and the Leica Q2 Monochrom are a couple of cameras with increased, extensive dynamic range. I don't own either of those cameras, and I am beginning to explore HDR bracketing and tone mapping as one way to extend dynamic range, then converting those images to black and white (my output preference) in post.
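Since the post mentions bracketing plus tone mapping with a black-and-white output, here is a minimal sketch of that merge step, assuming a simple mid-tone weighting (the function names and weights are illustrative, not any particular program's method):

```python
# Minimal sketch (Python/NumPy): average bracketed exposures in linear light,
# trusting each frame most where it is neither clipped nor buried in shadow,
# then mix down to monochrome. Real HDR tools also align frames and use the
# camera's measured response curve.
import numpy as np

def merge_brackets(frames, exposure_times):
    """frames: list of 8-bit RGB arrays; exposure_times: seconds per frame."""
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for img, t in zip(frames, exposure_times):
        linear = (img.astype(np.float64) / 255.0) ** 2.2   # rough inverse gamma
        weight = 1.0 - np.abs(linear - 0.5) * 2.0           # favor mid-tones
        acc += weight * linear / t                          # scale by exposure time
        weight_sum += weight
    return acc / np.maximum(weight_sum, 1e-6)               # scene-referred estimate

def to_monochrome(linear_rgb):
    """Simple luminance mix for a black-and-white rendering."""
    return linear_rgb @ np.array([0.2126, 0.7152, 0.0722])
```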

Reply
Jan 16, 2021 17:10:14   #
Hanson
 
Strodav wrote:
You got some good answers on what it is, but what it means is more important. Here's a graph of the dynamic range of a Nikon D850 from DxOMark.com. The human eye can capture a difference between light and dark of about 20 stops. That's 2^20, meaning we can perceive something about 1,000,000 times darker than the brightest object in a scene. Higher-end cameras can capture around 14 to 15 stops at base ISO, less than the human visual system, but the graph shows that ability drops very quickly a stop or two above base. At the same time, the noise (green-to-red bar on the right) starts to go up. It's a good idea to understand this curve before bumping up ISO and then wondering why the image looks flat and washed out, or the highlights are blown out, or the shadows are buried. If you are getting blinkies, one option might be to lower the ISO at the expense of shutter speed and/or aperture. At the same time, the number of bits of color a sensor can capture also goes down with increasing ISO.

Good topic OP
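To put numbers on the stop counts in that quote, here is a small Python sketch (illustrative only; the 20-stop and 14-stop figures are the ones quoted above):

```python
# Each stop is a doubling of light, so n stops of dynamic range span a 2**n
# contrast ratio between the brightest and darkest distinguishable levels.
def contrast_ratio(stops: float) -> float:
    return 2.0 ** stops

print(f"{contrast_ratio(20):,.0f}")   # 1,048,576 -> the ~20 stops cited for the eye
print(f"{contrast_ratio(14):,.0f}")   # 16,384    -> ~14 stops at base ISO
print(f"{contrast_ratio(20) / contrast_ratio(14):,.0f}")  # eye spans ~64x more
```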



Reply
Jan 16, 2021 17:14:45   #
bclaff Loc: Sherborn, MA (18mi SW of Boston)
 
miteehigh wrote:
Several camera manufacturers are now stating the dynamic range in their specs. The Canon R5 and the Leica Q2 Monochrom are a couple of cameras with increased, extensive dynamic range. I don't own either of those cameras, and I am beginning to explore HDR bracketing and tone mapping as one way to extend dynamic range, then converting those images to black and white (my output preference) in post.


I've never seen a clear statement of what type of dynamic range they are using.
So these values are of limited usefulness, since you can't know whether you're comparing "apples to apples."

Reply
 
 
Jan 16, 2021 17:17:43   #
Hanson
 
burkphoto wrote:
There is little point to considering "sensor dynamic range" in isolation. We record images with whole cameras. A camera is a system, and every system I've ever worked with is only as good as its weakest point. In photography, with recent cameras, that weak point is seldom the sensor. It's usually the image processing.

Since part of the photography system can be outside of the camera (computer, monitor, calibrator, software, printers, profiles, etc.), it makes sense to focus on that, too, when you're striving to maximize *apparent* dynamic range.

Silver halide photographic paper reflects about 90% of the light falling on it, under the best of circumstances. Out of an 8-bit-per-channel image, a range of values somewhere between 12-242 and 18-236 is all we see reflected from most papers. That's around 5 f/stops. But with 12-15 stops of range possibly recorded in raw data, what do we do with the other 7 to 10 stops?

The answer is found in the various sliders in post-processing software. Detail that is "burned out" or "plugged up" in an out-of-camera JPEG may be there in a raw file of the same image. When that is the case, much of it can be "recovered" (tonally compressed to the point we can see it within the range of brightness that the paper or screen can reflect or transmit). Of course, it is possible to make an awful mess of an otherwise good image, by over-applying adjustments. And unless the monitor is properly capable, calibrated, and profiled, adjusting images at all may well do more harm than good.

So take dynamic range analysis of sensors with a grain of salt. In the final analysis, most PEOPLE don't give a rat's patoot what camera a photographer used, or whether it was film or digital, or whether it was full frame or smaller. They care about the art, or the communications value, or the emotional impact of the image. If it speaks to them in a way that has the photographer's intended effect, who cares?

Most of the spec wars don't amount to anything tangible unless we're making HUGE prints on 8-14 color inkjet printers, on really exotic, archival, museum grade papers. That can be important in the world of landscape photography, or point-of-purchase advertising, or copying artist's renderings from paper or canvas. But for the rest of us making 16x20 and smaller prints, or just viewing our images on screens, it's a fairly moot point.
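As a rough illustration of that "recovery" idea (not any particular raw converter's algorithm), here is a minimal Python sketch that squeezes a 14-stop scene range into roughly the 5 stops a print can show:

```python
import numpy as np

def compress_to_display(linear, scene_stops=14.0, display_stops=5.0):
    """Map a linear scene value (0..1, white = 1) into a narrower display range."""
    floor = 2.0 ** -scene_stops                             # darkest level we keep
    log_scene = np.log2(np.clip(linear, floor, 1.0))        # 0 at white, -14 in deep shadow
    compressed = log_scene * (display_stops / scene_stops)  # squeeze 14 stops into ~5
    return 2.0 ** compressed                                # back to a 0..1 display value

deep_shadow = 2.0 ** -10                  # detail sitting 10 stops below white
print(compress_to_display(deep_shadow))   # ~0.084: lifted into the visible range
```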


I applaud the long but thorough answer, EVEN THOUGH the initial question asked for a short answer about the single most important factor with respect to DR. So, in general, I would say that a large pixel size favors a lower noise level and higher DR (a simple answer for most laymen?).

Reply
Jan 16, 2021 17:22:44   #
bleirer
 
Hanson wrote:
I applaud the long but thorough answer, EVEN THOUGH the initial question asked for a short answer about the single most important factor with respect to DR. So, in general, I would say that a large pixel size favors a lower noise level and higher DR (a simple answer for most laymen?).


I'm not sure. The question was 'what determines dynamic range.' That is a pretty deep question.

Reply
Jan 16, 2021 17:23:27   #
bclaff Loc: Sherborn, MA (18mi SW of Boston)
 
Hanson wrote:
I applaud the long but thorough answer, EVEN THOUGH the initial question asked for a short answer about the single most important factor with respect to DR. So, in general, I would say that a large pixel size favors a lower noise level and higher DR (a simple answer for most laymen?).


However, that is not photographically true.

Reply
Jan 16, 2021 18:08:14   #
hjkarten Loc: San Diego, California
 
To quote a Google search result (Sep 22, 2015): "The dynamic range of the subject is a measure of the range of light intensities from the shadows to the highlights. ... In low light conditions the dynamic range (that is, the difference between the darkest and the lightest part of the subject) is quite small."

The following is a lengthy explanation of DYNAMIC RANGE. But read it slowly and at your leisure. It will help you understand more about your modern camera, how Ansel Adams used dynamic range, how your color camera stores information, and the most important reason to save files in RAW format.

Dynamic range is usually listed in the tech specs of most cameras. Dynamic Range is expressed as "bit-depth".

Start by thinking of it in terms of a B&W monochrome image. How many gray levels can you capture? If it is a silhouette, it is simple black and white, with no intermediate gray levels: each pixel is either black or white. If you now shift to a camera with ever-increasing gray levels, the "bit depth" increases.
In order to understand what the camera companies mean when they report that a camera has 8, 10, 12, 14 or 16 bits of "bit depth", you have to learn how to count in binary. (Yecch!) It sounds worse than it actually is.

I'll try to make it less intimidating (it's actually pretty simple) and explain how bits are counted. They are counted in BINARY, or base 2, that is, 2^n. (I suggest you check Wikipedia about counting in binary.) Let's start with 2^1 = 2 levels (black or white). If you have a camera that can detect either solid black or solid white, you have only 2 values, i.e., ON or OFF. Now 2^2 = 4 gray levels. At 2^8, the sensor can detect 2x2x2x2x2x2x2x2 = 256 levels, from 0 to 255. (How does that add up to 256? Remember that ZERO is an important value!)
For those of you raised in the age of Ansel Adams' "Zone System": rather than spreading the gray levels over 256 values, he lumped them into ten zones. But the fundamental idea was the same.
Early sensors could only detect 3 or 4 bits. As for how sensors became able to detect ever more subtle changes, stay tuned!

Modern cameras gradually improved their sensors and were able to detect 10, then 12, and currently 14 bits. What does that mean in practical terms? If 8 bits allows 256 steps of gray, then 9 bits = 512 gray levels, 10 bits = 1,024, 11 bits = 2,048, 12 bits = 4,096, 13 bits = 8,192, and 14 bits = 16,384 gray levels. The next generation of sensors will surely move to 16 bits. To simplify counting and communication with others, a sensor with, e.g., 8,192 levels is said to have 8K levels. The range of gray levels is essentially describing the DYNAMIC RANGE of the sensor. The modern sensors we routinely use in biology, physics, astronomy, etc., range from 16 bits to 32, 64, and beyond. This is vital if you want to be able to capture the dim star and distinguish it from the VERY dim star with your personal Hubble telescope or lab microscope.
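The 2^n counting in the paragraph above can be tabulated in a couple of lines of Python (illustrative only):

```python
for bits in (1, 8, 10, 12, 14, 16):
    levels = 2 ** bits                 # number of distinct tonal levels
    print(f"{bits:2d} bits -> {levels:,} levels (0 to {levels - 1:,})")
# 14 bits -> 16,384 levels (0 to 16,383), matching the figures above
```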
But this is all about gray levels. How do I get color?
Each color is given a separate channel (I am simplifying this a bit). So a modern camera with high DYNAMIC RANGE has 14 bits of red, 14 bits of green, and 14 bits of blue. This is summarized as "14 bits of dynamic range per CHANNEL".
Now comes the neat stuff: the internal processor can convert 14 bits to 8 bits in the flash of an electron (speaking metaphorically). But you control whether the output is stored as the original 14 bits on your high-end camera, as an 8-bit image, or in both modes. You make that choice when you save your image as RAW vs. JPG.
HUH?
That's right! You make the choice. It also explains why your RAW image allows you to correct a wider range of exposure and color temperatures. When you convert your RAW image to JPG for printing or sharing on the internet, you convert the 14-bit RAW image to 8 bits. Once you do that, there is no way to go back to a higher bit level. (This is called "quantizing" the image; more about that on another occasion.)
JPG throws away all that vital information and limits your ability to correct exposure and color temp/tint.
This also explains why RAW is a preferable way to store your images. More data means more ability to selectively modify your picture.
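A minimal sketch of that 14-bit to 8-bit quantization (illustrative only; real converters also apply a tone curve before rounding):

```python
raw_value = 9731                 # one 14-bit sample, somewhere in the range 0..16383
jpeg_value = raw_value >> 6      # keep only the top 8 bits -> range 0..255
print(jpeg_value)                # 152
print(jpeg_value << 6)           # 9728: the discarded low bits never come back
```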

So, to again quote a Google search result (Sep 22, 2015): "The dynamic range of the subject is a measure of the range of light intensities from the shadows to the highlights. ... In low light conditions the dynamic range (that is, the difference between the darkest and the lightest part of the subject) is quite small."
Dynamic Range Photography Explained | Expert photography

Thanks for your patience.
Harvey

Reply
 
 
Jan 16, 2021 18:08:22   #
burkphoto Loc: High Point, NC
 
Hanson wrote:
I applaud the long but thorough answer, EVEN THOUGH the initial question asked for a short answer about the single most important factor with respect to DR. So, in general, I would say that a large pixel size favors a lower noise level and higher DR (a simple answer for most laymen?).


Somewhat true, but often irrelevant; see my earlier replies.

Reply
Jan 16, 2021 19:04:59   #
bclaff Loc: Sherborn, MA (18mi SW of Boston)
 
Not worth the read considering that some of it is flatly wrong.

Reply
Jan 16, 2021 19:55:08   #
wdross Loc: Castle Rock, Colorado
 
gvarner wrote:
What determines the dynamic range in a sensor? Is it just the number of pixels and the sensor dimensions? I don't recall seeing dynamic range info featured in camera specs, e.g., +/- 5 stops. Thank you for your thoughts.


As Bill at Burkphoto pointed out, dynamic range does not play as important a role these days, since most cameras have more than enough dynamic range for printing. Of course, if one only shoots JPEGs and never RAW, then one would want the camera with the highest dynamic range to achieve the best sellable and printable JPEG. But most cameras these days allow one to shoot a RAW plus a JPEG at the same time and store both. If the JPEG image is lacking, the RAW image can be pulled out to get the image needed or wanted. Dynamic range still needs to be reported by manufacturers and test-lab sites, but it is not nearly as important as it was in the film days.

Reply
Jan 16, 2021 20:26:45   #
hjkarten Loc: San Diego, California
 
bclaff wrote:
Not worth the read considering that some of it is flatly wrong.


Which posting are you referring to? Please clarify, and which statements are "flatly wrong"?
regards,
Harvey

Reply
 
 
Jan 16, 2021 20:34:52   #
hjkarten Loc: San Diego, California
 
A dramatic demonstration of the importance of dynamic range is evident anytime you take a picture of a florid sunset and want to capture the range of pinks and reds. Check your histogram as well as your image.
The statement that "dynamic range is not as important as in the days of film" is in conflict with the efforts made by sensor manufacturers to continually expand the dynamic range of the chips.
Harvey

Reply
Jan 16, 2021 20:41:51   #
bclaff Loc: Sherborn, MA (18mi SW of Boston)
 
hjkarten wrote:
Which posting are you referring to? Please clarify, and which statements are "flatly wrong"?
regards,
Harvey


It was a long post where you quoted material at length.
Since you didn't author it, I see no point in debating it with you.
I simply caution you and others to pay no attention to it.

Reply
Jan 16, 2021 21:04:17   #
TreborLow
 
One should also remember that the human eye has two systems built in: one for bright light and one for dim light. They are not working at the same time! Think of the dark-adaptation time you need when you enter a dark theater, and the reverse when you come out into bright daylight.

Reply
Jan 16, 2021 21:05:17   #
TriX Loc: Raleigh, NC
 
bclaff wrote:
Not worth the read considering that some of it is flatly wrong.


Agreed.

Reply