Ugly Hedgehog - Photography Forum
Main Photography Discussion
Five-bit Color & Human Perception
Aug 4, 2022 10:10:31   #
profbowman Loc: Harrisonburg, VA, USA
 
R.G. wrote:
Unnecessary where viewing is concerned, but any further editing will benefit from the extra headroom of a greater bit depth, which is why something better than JPEG is recommended if further editing may be involved. And am I right in thinking that some colour spaces need more than 8-bit data?


A few of you have mentioned that having more color depth than 8 bits gives us more "headroom." I have not heard this word used for light (color) perception before and thus do not have a definition for it. But let me explain what happens when we brighten a dark area of a photograph or darken a bright area.

Our perception of color is not linear. While we might be able to distinguish between two similar areas at middle hues, we may not be able to make the same distinctions at the dark or light ends of the color gamut. Thus, we are not changing color depth when we lighten a dark region or darken a light region. We are simply moving to a region of color where our eyes can perceive differences in color better.

The attached image shows the same color-depth step between the members of each pair. The difference is easier to see at the middle hue than at the dark or light ends.
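The pairs in that image can be reproduced numerically. This is a minimal sketch under my own assumptions (green-channel swatches, and the helper name `pair` is mine): one 5-bit step is 256/32 = 8 levels in 8-bit terms, and each pair below differs by exactly that step at the dark end, the middle, and the bright end.

```python
# One 5-bit color step expressed in 8-bit levels: 256 // 32 = 8.
STEP = 256 // 32

def pair(g):
    """Two green hex colors exactly one 5-bit step apart."""
    return (f"#00{g:02x}00", f"#00{g + STEP:02x}00")

# Dark, middle, and bright pairs; per the post, the middle pair
# should be the easiest to tell apart by eye.
for label, g in [("dark", 8), ("middle", 120), ("bright", 240)]:
    print(label, *pair(g))
```

Viewed side by side, the middle pair (#007800 vs #008000) is the one the post claims is easiest to distinguish, while the dark and bright pairs nearly merge.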

If, after making these changes, things do not look "neat," that is because we do not have good data in those bright or dark regions, or because our editing software is not working well. --Richard



Aug 4, 2022 10:41:35   #
Fotoartist Loc: Detroit, Michigan
 
profbowman wrote:
Fotoartist, you asked or made three good points. Here are my responses.

As physicists, we often try to look at limiting cases. So, I initially had borders on the cells of a webpage table in order to show that under those conditions it is tough to distinguish between colors that are one step apart in 5-bit color. But it also shows us that 5-bit color, while almost doing the trick, is not quite what we want. And it also shows us that 8-bit color is definitely more than we need.

As to doing my experiment with blue background colors in a table instead of green, here is the result. Note that banding does not occur. It is always better to do the experiment than just state an opinion. :)

We do not need 16-bit color, as I have shown by my many experiments here. If a human can see 10 million colors only under the most controlled and ideal conditions, i.e., roughly what 8-bit color in three RGB subchannels provides, then moving the minimum color step size from 1/256 to 1/65,536 will not help us see more colors. All that moving from 8-bit to 16-bit gives us, then, is more colors that look the same. --Richard


Thanks for the reply. Are you telling us not to believe our lying eyes when we see banding in a large smooth sky area in our photos? I am not a proponent of 16-bit color and almost never use it, but to me its main purpose in photography is to prevent banding and posterization in large smooth areas.

Your color strip is too short and too pure in hue, and it has not been processed with the Levels or Curves adjustments that photographers love to use. Try that and voilà! The banding will be evident.

Aug 4, 2022 11:25:22   #
CHG_CANON Loc: the Windy City
 
profbowman wrote:
Fotoartist, you asked or made three good points. Here are my responses.

As physicists, we often try to look at limiting cases. So, I initially had borders on the cells of a webpage table in order to show that under those conditions it is tough to distinguish between colors that are one step apart in 5-bit color. But it also shows us that 5-bit color, while almost doing the trick, is not quite what we want. And it also shows us that 8-bit color is definitely more than we need.

As to doing my experiment with blue background colors in a table instead of green, here is the result. Note that banding does not occur. It is always better to do the experiment than just state an opinion. :)

We do not need 16-bit color, as I have shown by my many experiments here. If a human can see 10 million colors only under the most controlled and ideal conditions, i.e., roughly what 8-bit color in three RGB subchannels provides, then moving the minimum color step size from 1/256 to 1/65,536 will not help us see more colors. All that moving from 8-bit to 16-bit gives us, then, is more colors that look the same. --Richard


If you posted an image with a blue sky showing these obvious bands of color, we'd laugh you off the site. Well, of course, some would give gushing praise, probably multiple thumbs-up, along with the slings and arrows. So you might be confused by the feedback, thinking a reduced bit depth is both useful and acceptable.

Aug 4, 2022 11:36:02   #
profbowman Loc: Harrisonburg, VA, USA
 
Fotoartist wrote:
Thanks for the reply. Are you telling us not to believe our lying eyes when we see banding in a large smooth sky area in our photos? I am not a proponent of 16-bit color and almost never use it, but to me its main purpose in photography is to prevent banding and posterization in large smooth areas.

Your color strip is too short and too pure in hue, and it has not been processed with the Levels or Curves adjustments that photographers love to use. Try that and voilà! The banding will be evident.

If 16-bit color is necessary for the editing someone prefers, then they should use it. All my demonstrations show is that any banding that comes from using editing software is due not to 8-bit color but to how the software's adjustments are parameterized. I do not use Photoshop Curves or Levels and won't use them, because of the errors they introduce.

If my original photo does not have banding in the sky or other places, then why would I use software that adds banding? --Richard

Aug 4, 2022 11:41:15   #
CHG_CANON Loc: the Windy City
 
profbowman wrote:
If 16-bit color is necessary for the editing someone prefers, then they should use it. All my demonstrations show is that any banding that comes from using editing software is due not to 8-bit color but to how the software's adjustments are parameterized. I do not use Photoshop Curves or Levels and won't use them, because of the errors they introduce.

If my original photo does not have banding in the sky or other places, then why would I use software that adds banding? --Richard


Did the software add the banding?

Or did the human make a poor decision in their choice of image format, and hence bit depth, not recognizing the impact of that decision when editing the image in their preferred software?

Seems like just another example of the age-old UHH truism: Success is the photographer. Failure is the equipment. (or software)

Aug 4, 2022 12:16:42   #
Fotoartist Loc: Detroit, Michigan
 
profbowman wrote:
If 16-bit color is necessary for the editing someone prefers, then they should use it. All my demonstrations show is that any banding that comes from using editing software is due not to 8-bit color but to how the software's adjustments are parameterized. I do not use Photoshop Curves or Levels and won't use them, because of the errors they introduce.

If my original photo does not have banding in the sky or other places, then why would I use software that adds banding? --Richard


OK, I will give you that if you are content to use shots SOOC (straight out of camera). But all photographers love to use Levels or Curves adjustments on their photographs, and that is when most banding occurs in files of less than 16 bits. That is the real world of digital photography.

Aug 4, 2022 12:20:57   #
DirtFarmer Loc: Escaped from the NYC area, back to MA
 
I propose an experiment.

Take a photo that includes a clear sky. Save it as a TIFF or BMP to avoid any compression effects. Then convert the TIFF into an array of 16-bit integers.

Now shift the bits down by 1, 2, 3, ..., 11 bits. That gives you 11 examples ranging from 15-bit down to 5-bit integers. Multiply them all back up to 16 bits. That preserves the original brightness of the highlights but with only 5- to 15-bit resolution. Convert the numeric array back to a TIFF or BMP and see what they look like.

Alternatively, just set the lowest 1, ..., 11 bits to zero.

I will do this experiment, BUT as I am flat out trying to clean out the house for sale, I will not have time for at least a week or two. If someone beats me to it, so be it. My thought would be to do it with Python, as I have done numeric analysis of several photos in that language and it seems to work well. I expect there are other ways to do this that may suit someone else better.
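The core of the proposed experiment can be sketched in Python without waiting for the real sky photo. This is a minimal sketch under my own assumptions: a synthetic 16-bit ramp stands in for the smooth sky, and the function name `quantize` is mine, not from any library. Reading and writing an actual TIFF would need an imaging library on top of this.

```python
# Sketch of the bit-shift experiment on a synthetic 16-bit "sky" ramp.
# Shifting right by n bits and then left again keeps the highlight
# brightness but leaves only (16 - n) bits of tonal resolution.

def quantize(values, keep_bits):
    """Reduce 16-bit samples to keep_bits of resolution, rescaled back up."""
    shift = 16 - keep_bits
    return [(v >> shift) << shift for v in values]

# A smooth 16-bit gradient, standing in for a clear-sky region.
ramp = [20000 + (30000 * i) // 999 for i in range(1000)]

for bits in (5, 8, 12):
    distinct = len(set(quantize(ramp, bits)))
    print(f"{bits}-bit resolution leaves {distinct} distinct levels")
```

At 5 bits the 1000-pixel gradient collapses to a handful of distinct levels, which is what shows up as visible bands; the finer the retained bit depth, the more levels survive across the same ramp.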

Aug 4, 2022 12:23:13   #
CHG_CANON Loc: the Windy City
 
DirtFarmer wrote:
I propose an experiment.

Take a photo that includes a clear sky. Save it as a TIFF or BMP to avoid any compression effects. Then convert the TIFF into an array of 16-bit integers.

Now shift the bits down by 1, 2, 3, ..., 11 bits. That gives you 11 examples ranging from 15-bit down to 5-bit integers. Multiply them all back up to 16 bits. That preserves the original brightness of the highlights but with only 5- to 15-bit resolution. Convert the numeric array back to a TIFF or BMP and see what they look like.

Alternatively, just set the lowest 1, ..., 11 bits to zero.

I will do this experiment, BUT as I am flat out trying to clean out the house for sale, I will not have time for at least a week or two. If someone beats me to it, so be it. My thought would be to do it with Python, as I have done numeric analysis of several photos in that language and it seems to work well. I expect there are other ways to do this that may suit someone else better.


Your experiment is already off the rails by not specifying the image format of the original image ...

I don't see the logic of a bit-level editor if that isn't how you normally edit your digital images ...

Aug 4, 2022 18:12:07   #
DirtFarmer Loc: Escaped from the NYC area, back to MA
 
DirtFarmer wrote:
...Alternatively, just take the lowest 1...11 bits and make them zero...


After a moment's thought that will not work. It will just raise the black level.

Aug 5, 2022 01:27:08   #
profbowman Loc: Harrisonburg, VA, USA
 
I do not mind that we might shift to another topic, but just to keep us all aware of where this discussion began: my original thesis was that 8-bit color on the three RGB subchannels is more than enough to retain all the information human eyesight needs to "see" color.

If editing introduces artifacts, to me that means the editing software is at fault, not the color depth. Also, I am not discussing artifacts from photo compression. That is a valid topic, but one for a different thread, methinks.

As to more experiments about editors and the introduction of artifacts, I, too, would have a few suggestions of things to clean up in the proposed bit-shifting project. However, I do not have the energy or time to try it tonight. Since I do not use Photoshop extensively, I'd appreciate seeing the result if someone took my column of 13 six-bit blue colors and ran it through Levels and Curves. If banding shows up, to me that means it is a software artifact. --Richard
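Short of an actual Photoshop run, a Levels-style adjustment can be simulated numerically. This is a minimal sketch under my own assumptions: a simple gamma curve stands in for Levels/Curves, the gamma value 2.2 is arbitrary, and the function name `levels_adjust` is mine. The point of contention is visible in the rounding step: mapping 8-bit values through a curve and back to 8 bits merges some adjacent levels and skips others, and the skipped levels are what appear as bands in a smooth gradient.

```python
# Apply a Levels-style gamma curve to every 8-bit value and round
# back to 8 bits, then count how many output levels survive and how
# many gaps (skipped levels) the rounding creates.

def levels_adjust(v, gamma=2.2):
    """8-bit in, 8-bit out brightening curve, a stand-in for Levels."""
    return round(255 * (v / 255) ** (1 / gamma))

out = sorted(set(levels_adjust(v) for v in range(256)))
gaps = sum(1 for a, b in zip(out, out[1:]) if b - a > 1)
print(f"{len(out)} of 256 output levels used; {gaps} gaps")
```

Done instead on 16-bit data and rounded to 8 bits only at the end, the same curve leaves the output levels densely covered, which is the usual argument for editing at higher bit depth.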

UglyHedgehog.com - Forum
Copyright 2011-2024 Ugly Hedgehog, Inc.