ISO is the international standard for film sensitivity. It merged the older ASA (US) and DIN (European) standards into a single scale.
A film's ISO is a fixed number indicating its sensitivity; the result can only be modified afterwards, during development (push or pull processing).
Manufacturers still use the same nomenclature today, but it does not mean the same thing on a digital camera.
A camera sensor array (SA), regardless of how recent it is, has a single, fixed sensitivity. What changes with the ISO setting is the amplification (gain) applied to the analog signal (light converted to charge) before it is digitized. Raising the gain lets a dim scene register at usable levels, but no additional light is captured, and the cost is amplified signal noise.
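If it helps, here is a minimal Python sketch of that idea (the photon count, read noise figure, and gain steps are made-up illustration values, not real camera data): raising the gain scales the signal and the noise together, so the picture gets brighter but the signal-to-noise ratio does not improve.

```python
import numpy as np

rng = np.random.default_rng(0)

def capture(photons, gain):
    """One pixel: light signal plus read noise, then analog gain (the 'ISO')."""
    signal = rng.poisson(photons)          # photon arrival is itself random (shot noise)
    read_noise = rng.normal(0, 3)          # readout/amplifier noise in electrons (assumed)
    return gain * (signal + read_noise)

dim_scene = 50                             # few photons reach the pixel (assumed value)
for gain in (1, 4, 16):                    # think "ISO 100, 400, 1600"
    samples = np.array([capture(dim_scene, gain) for _ in range(10_000)])
    print(f"gain x{gain:>2}: mean={samples.mean():7.1f}  SNR={samples.mean() / samples.std():4.1f}")
# The mean output rises with gain, but the signal-to-noise ratio barely moves:
# the amplification brightens the image without capturing any more light.
```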
The real measure of an SA's capability is the range of luminosity it can capture, referred to as dynamic range (DR). Early on, DR was so limited that it slowed acceptance of digital cameras; that is no longer the case, as modern sensors can capture a wider range than film ever could. There is a drawback…
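A quick way to put a number on DR is in stops (powers of two) between the brightest signal a pixel can hold (its full well) and its noise floor. The figures below are purely illustrative, not measurements of any particular sensor:

```python
import math

def dynamic_range_stops(full_well_electrons, noise_floor_electrons):
    """DR in stops = log2(brightest recordable signal / darkest usable signal)."""
    return math.log2(full_well_electrons / noise_floor_electrons)

# Illustrative values only, not measurements of any specific camera.
print(f"early sensor  (shallow well, noisy): {dynamic_range_stops(20_000, 40):.1f} stops")
print(f"modern sensor (deep well, quiet)   : {dynamic_range_stops(80_000, 2):.1f} stops")
```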
Grain… Grain is the result of random clustering of chemicals on film, an effect that was sometimes welcomed for artistic purposes. A digital camera cannot create grain; instead it produces noise* with a predictable statistical character that depends, among other things, on the amplification applied to the SA's signal.
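To show the difference in character, here is a toy comparison (not a physical simulation, and all the numbers are arbitrary): sensor noise is independent at every pixel, while grain-like clumps make neighbouring pixels move together.

```python
import numpy as np

rng = np.random.default_rng(1)
size = 64

# "Sensor noise": an independent random value at every pixel.
sensor_noise = rng.normal(0, 5, (size, size))

# "Film grain": random clumps, each affecting a small patch of neighbouring pixels.
grain = np.zeros((size, size))
for _ in range(80):
    y, x = rng.integers(0, size - 4, 2)
    grain[y:y + 4, x:x + 4] += rng.uniform(2, 10)

def neighbour_corr(img):
    """Correlation between each pixel and its right-hand neighbour."""
    return np.corrcoef(img[:, :-1].ravel(), img[:, 1:].ravel())[0, 1]

print(f"sensor noise, neighbour correlation: {neighbour_corr(sensor_noise):+.2f}")  # ~0: no structure
print(f"grain clumps, neighbour correlation: {neighbour_corr(grain):+.2f}")         # clearly positive
```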
So why is the term ISO still used? Simply because photographers were already used to it, and since the effect of changing film speed can be simulated with gain, it survives as a setting scale.
Choosing JPG output instead of a raw file changes the DR of the data that gets recorded**. This is why there is a real need for cameras to present JPG/raw as an exposure choice, not merely a file format choice.
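A rough way to see where the DR goes: compare the number of brightness levels in a typical 14-bit raw file with the 256 levels of an 8-bit JPEG (typical depths, assumed here for illustration):

```python
RAW_BITS, JPEG_BITS = 14, 8          # common raw and JPEG sample depths (assumed)

raw_levels = 2 ** RAW_BITS           # 16384 linear brightness steps off the sensor
jpeg_levels = 2 ** JPEG_BITS         # 256 steps after the in-camera tone curve

print(f"raw : {raw_levels} brightness levels")
print(f"jpeg: {jpeg_levels} brightness levels "
      f"(~{raw_levels // jpeg_levels} raw levels folded into each JPEG level on average)")
# And that is before the lossy JPEG compression itself throws away more detail.
```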
Please comment before I post this in the main channel as well as the FAQ.
-----------
* This noise is more accurately described as signal/static noise; it is often loosely called 'digital noise'.
** A camera JPG is created from the sensor's initial digital capture (the raw data); the conversion reduces the DR and applies lossy compression.
Credits:
RGG
wham121736