Gene51 wrote:
You will see that this topic will generate lots of off-topic tangential results.
You have made an incorrect assumption: camera-generated jpegs use the captured raw data plus YOUR choice among the processing settings the camera engineers have provided. These may not be the "best" parameters, but they are the ones you choose from the available options.
In its simplest terms, a jpeg is a small subset of the values captured by the camera. When the camera processes the image, anything not represented in the jpeg is discarded.
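To put a number on "small subset": a minimal sketch, assuming a 14-bit sensor and ignoring gamma, demosaicing, and compression (the function name and bit depths are illustrative, not any camera's actual pipeline). A 14-bit raw channel holds 16,384 possible levels; an 8-bit jpeg channel holds 256, so many distinct raw levels collapse into each jpeg level.

```python
# Illustrative only: a 14-bit raw sample has 16384 levels per channel,
# an 8-bit jpeg has 256. Quantizing is many-to-one - that collapse is
# part of the "stuff" that gets discarded.

def raw_to_jpeg_level(raw_value, raw_bits=14, jpeg_bits=8):
    """Map a linear raw level to an 8-bit level (gamma omitted for simplicity)."""
    return raw_value >> (raw_bits - jpeg_bits)

# all 64 of these distinct raw levels land on the same jpeg value
collapsed = {raw_to_jpeg_level(v) for v in range(64)}
print(len(collapsed))  # 1 - sixty-four raw levels became one jpeg level
```

The real in-camera pipeline is far more involved (tone curves, white balance, chroma subsampling), but the many-to-one mapping is the essential point.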
Now the question: raw is neither better nor worse. It is captured information. And you shoot raw all the time, every time you click the shutter. Whether you choose to let the camera crudely process the image and discard all the other "stuff," or choose to process the image yourself, is entirely up to you. Post-processing a raw image is generally done in two basic stages: capture conversion, using the parametric (mostly global) adjustments found in your typical raw converter, then the finishing work done in a pixel-level editor. The first stage is done in broad strokes; the second is done with the precision of a robotically guided scalpel (in the right hands).
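The two stages can be sketched in a few lines. This is a toy model, not any editor's internals: the function names, the exposure factor, and the 2x2 "image" are all invented for illustration. Stage one applies one parametric rule to every pixel; stage two touches only the pixels you select.

```python
# Hedged sketch of the two-stage workflow described above.
# Stage one: parametric/global - one rule applied to every pixel.
# Stage two: pixel-level - edit only the chosen coordinates.

def stage_one_global(image, exposure=1.5):
    """Global adjustment: the same exposure rule for every pixel."""
    return [[min(int(p * exposure), 255) for p in row] for row in image]

def stage_two_local(image, region, lift=30):
    """Local adjustment: brighten only the selected pixels."""
    out = [row[:] for row in image]
    for r, c in region:
        out[r][c] = min(out[r][c] + lift, 255)
    return out

img = [[100, 100], [100, 100]]
img = stage_one_global(img)            # every pixel becomes 150
img = stage_two_local(img, [(0, 0)])   # only one pixel becomes 180
print(img)  # [[180, 150], [150, 150]]
```

The broad-strokes/scalpel distinction falls out directly: the global stage cannot treat one pixel differently from its neighbours, the local stage can.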
You cannot "see" a raw file; you can only create rules for conversion to a bitmap (tiff, psd, jpeg, png) file, guided by the embedded jpeg preview found in every raw file. For stage one, all you need is the raw file and its preview.
Stage two requires a bitmapped file. An 8-bit jpeg is the smallest and least desirable for high-quality editing; a 16-bit uncompressed tiff or psd is better because there is no loss on save, you can create and save layers, and the 16-bit file provides greater accuracy when editing (minimizing the chance of posterization and clipping).
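Why 16 bits posterize less can be shown with a toy round trip, assuming a deliberately simple integer pipeline (not how any real editor is implemented): darken a full 8-bit gradient by 4x, then brighten it by 4x. The edit should be lossless, and at 16-bit precision it effectively is; at 8 bits the rounding is baked in and three quarters of the tonal levels never come back - the comb-shaped histogram you see as posterization.

```python
# Toy demonstration of 8-bit vs 16-bit editing headroom.
# Edit: darken by 4x, then brighten by 4x (a nominally lossless round trip).

def round_trip_8bit(level):
    """Both edits performed on the 8-bit value: rounding error is permanent."""
    return (level // 4) * 4

def round_trip_16bit(level):
    """Same edits on a 16-bit copy, demoted back to 8-bit for display."""
    v16 = level * 257            # promote 0..255 to 0..65535
    v16 = (v16 // 4) * 4         # same round trip, in much finer steps
    return (v16 + 128) // 257    # demote back to 8-bit

gradient = range(256)
print(len({round_trip_8bit(v) for v in gradient}))   # 64  - levels lost
print(len({round_trip_16bit(v) for v in gradient}))  # 256 - all survive
```

Real edits (curves, white balance, local contrast) chain many such operations, so the intermediate rounding in an 8-bit workflow compounds far faster than this single round trip suggests.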
Selecting the option not to save the raw file (what you refer to as shooting in jpeg) does eliminate options. If you shoot images with a relatively narrow dynamic range and unsaturated colors, you will likely never see a difference between a jpeg generated from a raw file processed in software on a computer and the jpeg processed by the camera. For images that push the envelope on saturation, gamut, tonal range, and fine detail, the jpeg processed by the camera can't hold a candle to one processed in software on a computer, provided the skill of the person processing the image is adequate.
If someone states they only shoot jpeg, it means they don't or can't see the difference, are unfamiliar with the raw workflow, or generally have a lower bar for image quality and are satisfied with what the camera produces. Or they don't have a choice because their work is governed by the client's rules and regulations, as at Reuters, which accepts nothing but camera-generated jpegs, and in other similar situations.
So it sounds like shooting in jpeg is equivalent to sending out a roll of film to be processed rather than developing it yourself, where the processing center would use standard times and techniques to develop the negatives. Shooting in raw would equate to the photographer working in the darkroom, with greater control over the outcome of the processed negative by increasing or decreasing processing time.
I'm beginning to understand.
Thanks to everyone for their contributions to the conversation.