JohnFrim wrote:
To begin let's clarify that RAW and JPEG (along with PNG, TIFF, etc) are storage file standards or specifications that essentially define/describe how the binary pixel brightness data from the sensor are organized in the file, ...
Pretty close!
There are actually two parts: one is a "data format" standard, the other is a "file format" standard. Often they are so tightly linked that they are hard to tell apart. The TIFF standard is a good template for learning the concept because it clearly separates the two. For example, almost all RAW file formats actually use the TIFF file (container) standard, but have nothing to do with TIFF's data formatting standards.
Incidentally, raw sensor data is not image data, but RAW files (thanks to the TIFF container) are able to store the raw sensor data alongside one or more JPEG-formatted image data sets that are used for previewing.
It's kind of like saying that a bucket is a device for storing/carrying things. We can fill it with milk, oil, or water. Or we can compartmentalize the bucket by carrying bottles. The bottles might have water or milk in them, and both can fit in the bucket at the same time.
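To make the container idea concrete, here is a minimal sketch (not any real raw decoder, and the function name is mine) of how a program might recognize the TIFF container header that most RAW formats reuse. The 8-byte header is the part defined by the TIFF file standard; what the payload inside means is a separate, data-format question.

```python
import struct

def looks_like_tiff(header: bytes) -> bool:
    """Check the 8-byte TIFF container header: byte-order mark and magic 42.

    Many RAW formats (DNG, NEF, CR2, ...) begin with this same container
    header even though their payload is raw sensor data, not TIFF image data.
    """
    if len(header) < 8:
        return False
    if header[:2] == b"II":          # little-endian ("Intel") byte order
        endian = "<"
    elif header[:2] == b"MM":        # big-endian ("Motorola") byte order
        endian = ">"
    else:
        return False
    (magic,) = struct.unpack(endian + "H", header[2:4])
    return magic == 42               # TIFF's fixed magic number

# A hand-built little-endian TIFF header: "II", magic 42, first IFD at offset 8.
sample = b"II" + struct.pack("<H", 42) + struct.pack("<I", 8)
print(looks_like_tiff(sample))                 # True
print(looks_like_tiff(b"\x89PNG\r\n\x1a\n"))   # False: that's a PNG signature
```

So "is this bucket a TIFF-style bucket?" and "is there milk or water inside?" really are two separate checks.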
JohnFrim wrote:
... technically the "thing" that one sees on the computer or on paper is neither RAW nor JPEG, nor PNG, nor TIFF, etc.
Absolutely correct. What the computer does is extract a data set from the file. It has to know about the file format, and it also has to know about the specific data format. So it reads data from the file, and that data is then converted to a data format the computer can use directly.
Note that with a RAW file, that is not image data! The sensor data, once stored in computer memory, has to be interpolated to get one specific image. But any one set of raw sensor data can correctly be interpolated into a nearly infinite number of distinct images. Image data specifies exactly one image; raw sensor data specifies a set of sensor readings, not an image.
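A toy example of that interpolation step, assuming the simplest possible scheme (collapsing each 2x2 RGGB block into one pixel, which no real converter would settle for):

```python
def demosaic_rggb(raw, wb=(1.0, 1.0, 1.0)):
    """Crude demosaic of an RGGB Bayer mosaic by 2x2 binning.

    `raw` is a list of rows of sensor values; each 2x2 block
    [[R, G], [G, B]] collapses to one (R, G, B) pixel.  `wb` holds
    per-channel white-balance gains: change them and the *same* raw
    data yields a different image, which is exactly the point.
    """
    out = []
    for y in range(0, len(raw) - 1, 2):
        row = []
        for x in range(0, len(raw[y]) - 1, 2):
            r = raw[y][x] * wb[0]
            g = (raw[y][x + 1] + raw[y + 1][x]) / 2 * wb[1]
            b = raw[y + 1][x + 1] * wb[2]
            row.append((r, g, b))
        out.append(row)
    return out

mosaic = [[100, 80],
          [ 90, 60]]
print(demosaic_rggb(mosaic))                      # [[(100.0, 85.0, 60.0)]]
print(demosaic_rggb(mosaic, wb=(2.0, 1.0, 1.5)))  # [[(200.0, 85.0, 90.0)]]
```

Same sensor readings in, two different images out, and that is with only one knob (white balance) being turned. Real converters also choose interpolation method, tone curve, sharpening, and more.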
With a JPEG, TIFF, GIF or PNG data set, the image data may be converted to an intermediate format for a given program; each editor, for example, has its own internal format. But just for display it is converted to what might best be called "RGB image data", because virtually all printers and monitors use RGB-formatted data. That is not universally true, though: there are printers that use CMYK data, vector data, or something else entirely.
But "image data" would be a (very loose) term that covers them all, and "RGB data" fits what is usually meant.
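To illustrate how the same "image data" gets re-expressed for a CMYK device, here is the textbook RGB-to-CMYK formula only. A real printer pipeline goes through ICC color profiles and is far more involved; this is just a sketch.

```python
def rgb_to_cmyk(r, g, b):
    """Naive conversion from RGB (each 0-1) to CMYK (each 0-1).

    This is the simple textbook formula, not a color-managed conversion;
    it only shows that image data can be re-expressed per device.
    """
    k = 1 - max(r, g, b)
    if k == 1:                       # pure black: avoid division by zero
        return (0.0, 0.0, 0.0, 1.0)
    c = (1 - r - k) / (1 - k)
    m = (1 - g - k) / (1 - k)
    y = (1 - b - k) / (1 - k)
    return (c, m, y, k)

print(rgb_to_cmyk(1.0, 0.0, 0.0))   # (0.0, 1.0, 1.0, 0.0)  pure red
print(rgb_to_cmyk(0.0, 0.0, 0.0))   # (0.0, 0.0, 0.0, 1.0)  pure black
```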
And you are absolutely right that it is not TIFF, RAW, JPEG or whatever. Ideally, the "RGB data" viewed on a monitor would be exactly the same regardless of which file format the image had been saved in, but that is only true with losslessly compressed files and merely close to true when lossy compression is used.
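The lossless/lossy distinction is easy to demonstrate. Lossless means the round trip is byte-for-byte exact; the sketch below uses Python's zlib (the same DEFLATE family that PNG uses) on some stand-in pixel bytes. JPEG's lossy stage makes no such guarantee.

```python
import zlib

# Stand-in for some repetitive RGB image data.
pixels = bytes(range(256)) * 100

packed = zlib.compress(pixels, level=9)   # lossless DEFLATE compression
restored = zlib.decompress(packed)

print(restored == pixels)        # True: the round trip is byte-for-byte exact
print(len(packed) < len(pixels)) # True for data this repetitive
```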
JohnFrim wrote:
While the original RAW file was about 17 MB and the corresponding SOOC JPEG was just over 5 MB, the TIFF is over 128 MB, so clearly the richness of the data varies between file formats. But regardless of what format I save in, I am still seeing the same "image" on my screen.
This brings up another can of worms! Encoding efficiency is different for different data formats.
The RAW data, which once again is not image data, encodes color using the Bayer Color Filter Array method. No exact color is recorded at any photosite; instead, the color for any specific location is distributed over data from multiple neighboring locations, and the "key" to that arrangement (the CFA pattern) is not stored per sample but is defined externally to the data. That is why a RAW sensor-data file is so much smaller than a TIFF image-data file: the TIFF file stores an exact color for each location, with nothing kept external to the file. But the TIFF won't compress as well as a JPEG does, either.
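A back-of-envelope sketch of that size difference. The 24 MP sensor and the bit depths below are illustrative assumptions, not the poster's actual camera; the point is that raw stores one sample per photosite while an RGB TIFF stores three full channels per pixel.

```python
def megabytes(bits):
    """Convert a bit count to decimal megabytes."""
    return bits / 8 / 1_000_000

photosites = 24_000_000        # assumed 24 MP sensor, purely illustrative
raw_bits_per_site = 14         # one CFA sample per photosite (assumed depth)
tiff_bits_per_pixel = 3 * 16   # R, G and B stored explicitly, 16 bits each

raw_mb = megabytes(photosites * raw_bits_per_site)
tiff_mb = megabytes(photosites * tiff_bits_per_pixel)

print(f"raw  ~{raw_mb:.0f} MB")          # ~42 MB before any compression
print(f"tiff ~{tiff_mb:.0f} MB")         # ~144 MB
print(f"ratio {tiff_mb / raw_mb:.1f}x")  # ~3.4x from sample count and depth alone
```

Add lossless compression on the raw side (most cameras do) and the ratio grows further, which is consistent with a 17 MB raw file ballooning into a 128 MB TIFF.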
Computer data encoding, like encryption and compression, is not something an average photographer needs to understand. It's a distraction that prevents learning photography!