alx wrote:
Yes. Precise temperature control was essential; tolerances were best held to within 0.25 degrees.
Some of the major differences between wet-chemical darkroom processes and digital, non-chemical processes come down to what we no longer have to contend with:
> Temperature fluctuations
> Agitation variations due to personal technique (small tanks), uneven nitrogen-burst bubble distribution (sink line / dip-and-dunk), or roller-transport gear wear and voltage fluctuations/brownouts (long-roll ciné processors)
> Chemical fluctuations (pH, specific gravity, exhaustion requiring replenishment...)
> Film and paper emulsion batch sensitivity variations (these got wilder than many folks realize!)
> Film and paper shipping and storage variations (heat, radiation, humidity...)
> Tar formation in processor tanks (lab scale high volume processing)
> Printer line voltage variations that affect tungsten-halogen lamp color temperature and brightness
> Printer calibration
> "Dichroic filter gear slop" (turn a CMY knob to .15 from zero and it might really be .17, but turn it from .30 to .15, and it might be .14)
We never saw color variations on our wide format Epson inkjet printers. NEVER. We could print the same file 18 months later, using Epson inks and papers, and the color would be spot-on. That was highly improbable with digital mini-labs using Kodak professional silver halide paper, lasers, and RA-4 chemistry. It was quite impossible with optical printers and long roll paper processors.
The color gamut, process stability, print longevity, and repeatability of digital production with inkjet output are far superior to those of silver halide. However, for volume printing of standard sizes up to 12" x 18", silver halide is dirt cheap, and much faster than inkjet!