Bridges wrote:
I shoot RAW almost 100% but I know a lot of people still shoot JPEG. Just a quick poll to see how many are in each camp, and to ask those who shoot both what determining factor makes you shoot one way or the other?
In situations where there is an opportunity to control lighting precisely (color temperature, intensity, directions, ratios, specularity, etc.), it can make sense to use JPEG when there is an immediate need for the image, OR when there is NO budget for post-processing before printing or posting to the Web.
It is important to understand why the JPEG standard exists at all! JPEG compression was developed out of a need to reduce file size, so that less storage space and less network bandwidth would be needed to handle billions of images. It was conceived as a storage and distribution format, not as a capture format. In the early days of the Internet, before the World Wide Web took off (1993 and earlier), people would scan to TIFF, process in Photoshop, and convert to various sizes of JPEG. Images to be sent over the Internet were both low resolution and highly compressed. A 50KB file was big!
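That size-versus-quality trade-off is easy to see for yourself. Here's a minimal sketch using the Pillow library (my choice of tool for illustration, not something discussed above): it builds a synthetic gradient image as a stand-in for a photo, saves it at three JPEG quality settings, and compares the resulting byte counts.

```python
from io import BytesIO
from PIL import Image

# Build a simple gradient test image (a synthetic stand-in for a photo).
img = Image.new("RGB", (640, 480))
px = img.load()
for y in range(480):
    for x in range(640):
        px[x, y] = (x % 256, y % 256, (x + y) % 256)

# Encode the same pixels at three quality settings and record file sizes.
sizes = {}
for q in (95, 75, 40):
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=q)
    sizes[q] = len(buf.getvalue())

for q, n in sizes.items():
    print(f"quality={q}: {n} bytes")
```

Lower quality settings discard more high-frequency detail during quantization, so the file shrinks; exact numbers depend on image content, which is precisely why the format works so well for distribution and so poorly as a repeatedly re-saved working format.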
Another use for JPEGs was in "Desktop Publishing". That was the early use of Macs and PageMaker for pre-press preparations. JPEGs could be used in files sent to early laser imagesetters, so that halftones and text could be combined all at once, digitally, to create page negatives. Previously, text negatives and halftone negatives were made separately, and "stripped" together on a light table, a very labor-intensive process.
When Kodak developed their digital cameras in the early 1990s, they were extremely low resolution devices (around a megapixel or less). They captured raw data, which had to be post processed. But Kodak realized very quickly that users would need immediate feedback about exposure, and would (more importantly) have a reason to buy incredibly expensive pioneering devices! So they put JPEG processors in them. Journalists were some of the first users of these cameras, along with military and industrial users. They had the need for speed to press!
Thus began the practice of putting ever faster and more sophisticated JPEG engines into dSLRs, point-and-shoots, and eventually, smartphones and MILCs.
The problem is that John Q. and Mary P. Public weren't ever informed of the limitations, disadvantages, and caveats surrounding the use of JPEGs. Professionals have ALWAYS known those boundaries, and when necessary, stayed within them with EXCELLENT results. As professional photo educator and master portrait photographer Will Crockett is fond of saying, "Raw is for Rookies, JPEGs are for professionals." What is meant by that gentle poke is that it is much easier to record and manipulate a raw file to get professional results than it is to *control environments* sufficiently to achieve the same look in difficult situations.
I worked for decades in the school portrait industry, running parts of our optical lab, then our digital lab, and spending the last seven years as a training content developer and trainer. We had a 100% all-JPEG workflow. We controlled EVERYTHING, from the contract, to the camera, to the shipping dock. We knew what we were doing and made it look great (well, at least until we were bought by a bigger company). So that is why I use JPEGs the way I do: in a controlled, careful, thoughtful manner, for subject matter that makes sense.
OTOH, I record (I don't SHOOT anything!) raw files most of the time now, simply because I'm semi-retired and have time to post-process. I'm also well-equipped and know what I want. It's much easier to get the look I want when I'm able to use the camera's full dynamic range to create the end product.
Working with JPEGs can be likened to working with slide/transparency films. Back when I was an AV producer, I had to control everything AT THE CAMERA, just as I did later with JPEG capture. That meant carrying a bag of filters for color-correcting the light. It meant using incident meters, incident flash meters, color temperature meters, and reference targets (gray cards and Kodak Q-13 color patches).
When I used color negative films, I could relax a bit. Color correction was done in "post" (after processing the negative, it went to a Kodak Professional Video Analyzing Computer for visual analysis and color adjustment). As long as the exposure was within a stop or so of "normal," I could get what I wanted. If I was being anal about it, I'd still use my color correction filters in weird lighting situations, but it was less necessary.
Now, with raw capture, I use an exposure and white balance target for both JPEG and raw capture, but in most situations, I don't need it for raw capture. I work on a hardware-calibrated and software-profiled monitor, so what I see is nearly identical to what I print.
My main advice is to use the right tool for the job. When you understand that JPEGs were never meant to be post-processed, you can decide whether you are going to learn to expose and white balance correctly at the camera, or simply accept bad exposure and poor white balance when you don't. You can decide whether you want to learn a post-processing tool or two, and whether accurate color means anything to you.
Since you are reading this on a photography forum, my hope is that you will choose tools consciously, and appropriately for the results you want to achieve.
I know from watching millions of images go through a lab that decent knowledge and a "pretty good" camera can make much better images than no knowledge and a high-end camera, and that anyone can use JPEG OR raw capture correctly in the right situations. Newbies shouldn't expect magic, no matter how much money they spend. Read the manuals, watch tutorials, go to seminars, and pay attention to what good photographers do and say. It's a life-long process.