Bill_de wrote:
We'll never know for sure since Larry is the only person who speaks for the vast majority of photographers. I only know those I interact with.
---
Actually, I do not presume to speak for anyone other than myself. But I know a good number of photographers, including many who are quite serious and several who are seriously accomplished artists in photography as well as in other media. Outside of my local photography club, maybe three of those besides myself use raw on any sort of routine basis. Inside the club, I'd estimate that 25-30% of the "faithful members" primarily use raw files. I have to estimate, because it is not often a serious topic of discussion. Maybe that works out to more than 1% of the people I know who take photographs, but I doubt it.
BebuLamar wrote:
If you change the light level up or down you lose bits. For every stop you lose a bit. With 14-bit raw as I have now, I can afford to lose 6 bits and still have a good 8-bit JPEG.
This sounds wrong. Am I not understanding what you're saying, or am I missing something?
Change light levels how? I can change light levels in a lot of different ways and have never noticed losing bits. Then again, I've never counted bits and don't even know how I'd go about it. Many years ago, when I programmed in C, I had to know and count bits and bytes, but never when taking a picture or editing one. I only know my monitor, printer, and JPEGs are 8-bit because I read it somewhere; I'd have to look in the camera docs to see how many bits my raw files are.
I don't think I can change bit depth simply by changing light levels, can I?
Could you explain further or cite a source?
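For what it's worth, here's how I read BebuLamar's claim, sketched in Python. Raw sensor data is linear, so darkening by one stop halves every value, and halving an integer drops its lowest bit. The variable names and the loop are just my illustration of that arithmetic, not anything from an actual raw converter:

```python
# Sketch of the "one stop = one bit" idea in linear raw data.
# Assumption: raw values are linear, so one stop down = half the value,
# which in integer terms discards the least significant bit.

raw_max = 2**14 - 1          # full-scale value of a 14-bit raw file (16383)

value = raw_max              # a pixel exposed right at full scale
for stop in range(1, 7):     # darken by six stops, one stop at a time
    value //= 2              # halve the linear value
    print(f"-{stop} stop(s): max value {value} ({value.bit_length()} bits)")

# After six stops the brightest possible value is 255, i.e. 8 bits --
# the headroom BebuLamar describes before an 8-bit JPEG runs out of levels.
```

So if I've understood the claim, "losing bits" refers to pulling exposure down (or pushing shadows up) in the linear data, not to anything you'd notice while shooting.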