emmons267 wrote:
I have a fairly good understanding of shooting in RAW and the difference and benefits over shooting in JPEG mode. There are a few areas that I'm still not clear on:
1. When you shoot RAW, should you always ETTR?
2. If you shoot RAW and don't ETTR, will the images be any better than JPEGs? I know there's more data, but will the exposure be any different than a JPEG's?
3. Since the histogram on the camera (Canon T6i with 18-135mm lens) is from a JPEG, how do you determine how far to expose to the right?
4. If you are exposing to the right, I'm assuming you should not shoot RAW+JPEG, since the JPEG would be overexposed beyond repair. Is this correct?
In the sequence you asked....
1. No... You do not need to use ETTR when shooting RAW. They are two different things. A RAW file simply gives you more data to work with later. ETTR is a technique that may or may not be necessary, depending upon you and your camera (see below).
2. No, RAW images may or may not be "better" than JPEGs (see below).
3. If you find yourself often or always having to increase exposure in post-processing, you should consider dialing in some ETTR using Exposure Compensation in the AE modes (P, Av and Tv on your camera). This is just a "tweak" that works with the AE modes. Don't overdo it... +1/3 or +1/2 stop typically. Maybe +2/3 stop at most. All you are trying to do is bias the exposures slightly brighter. The reason is that under-exposure amplifies noise in the shadow areas of images. It's better to be slightly over-exposed than under-exposed.
Note: If you use "Auto" or any of the preset "scene" modes, ETTR isn't possible because Exposure Compensation isn't available in those modes. If you shoot fully manual... M without Auto ISO... Exposure Compensation isn't available either, but you can do similar ETTR tweaking by slightly biasing your exposures manually. (M with Auto ISO is no longer manual... it's another auto exposure mode like P, Av or Tv. If Exposure Compensation is possible on your camera when using Auto ISO, ETTR would be possible. If E.C. isn't possible with it, I would probably not use Auto ISO. And I'd never use Auto ISO in conjunction with the other AE modes.)
4. You can shoot RAW + JPEG any time you need both types of files, regardless of whether you're using ETTR. A modest ETTR bias of +1/3 to +2/3 stop won't ruin the JPEGs... it just makes them slightly brighter.
Some further explanations.... First, please understand that every image taken with any digital camera starts out as a RAW file. When you have the camera set to save RAW, the full data is saved, but the image is essentially "unfinished". You'll need to finish it later on your computer using RAW conversion and image editing software. Many aspects of the RAW file can be easily adjusted. Exposure and color are "as shot", but there's more data to work with if they need to be changed. For example, an 8-bit JPEG has a color palette of approx. 16.7 million colors. That sounds like a lot until you consider that the 16-bit files possible from RAW have a palette of roughly 281 trillion colors! That's upwards of 16 million times more colors! (Note: Your camera's RAW files are shot 14-bit, but post-processing software interpolates them to 16-bit.)
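If you want to sanity-check those palette sizes yourself, the arithmetic is simple: an RGB image with N bits per channel can encode 2^(3×N) distinct colors. A quick sketch in plain Python (nothing camera-specific):

```python
def palette_size(bits_per_channel: int) -> int:
    """Distinct RGB colors = (2 ** bits) per channel, cubed."""
    return 2 ** (3 * bits_per_channel)

jpeg_8bit = palette_size(8)     # 16,777,216  (~16.7 million)
raw_14bit = palette_size(14)    # ~4.4 trillion
edit_16bit = palette_size(16)   # ~281 trillion

print(f"{jpeg_8bit:,}")                 # 16,777,216
print(f"{raw_14bit:,}")                 # 4,398,046,511,104
print(f"{edit_16bit:,}")                # 281,474,976,710,656
print(f"{edit_16bit // jpeg_8bit:,}x")  # 16,777,216x more colors
```

Each extra bit per channel multiplies the palette by 8, which is why the jump from 8-bit to 16-bit is so enormous.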
Other parameters of a RAW file, such as contrast, saturation, noise reduction and sharpness, are not "set". If you use the manufacturer's own RAW conversion software (Canon DPP, for example), it can recognize "tags" for these settings that are stored in the RAW and apply them "as shot" if you choose to do so. But those tags are not recognized by most third-party conversion software (such as the Adobe Camera RAW that's built into Lightroom, Photoshop and Elements). You can freely adjust and change all these settings in a RAW file.
OTOH, when you set your camera to save JPEG, the RAW file is instantly converted or "processed" in camera, using the various settings you've chosen. Once this has been done, once the file has been converted from RAW to JPEG, all sorts of data the camera deems "extraneous" is discarded. As a result, JPEG files are much smaller than RAW files.
In fact, at first glance a JPEG will usually look "better" than the corresponding RAW file (CR2 from your camera). That's because the JPEG is "finished" while the RAW file is not. BUT, if anything needs adjusting later in post-processing, with the JPEG there is far less of the original data to work with, so there's a lot less flexibility to make those adjustments. As noted above, it's usually better to make most types of image adjustments to a 16-bit file... which means starting out with a RAW file.
Most of the time... for most purposes... you should save your finished images as 8-bit JPEG. That's the most universally usable type of image file. All computers and devices have some means of viewing a JPEG built in. All printers can work with 8-bit JPEGs... not all support other types of files or files with greater bit depth. And in most cases there's no improvement in quality by making a print from a file type with more bit depth. In fact, it might just slow down printing and waste more ink.
So... if you need to use the images immediately, set your camera to save JPEGs. But then you have to be more careful to get all the camera's settings correct, because there's less room to fix things later.
If you want to work with your images and try to produce something better... or may want to make some changes to them later... and have the time, software and computer to do so, shoot RAW.
It sometimes can help to shoot both... RAW + JPEG... while learning post-processing. That way you can compare the two and see if you can work from the RAW to improve upon what the camera produced with the JPEG. The problem with RAW + JPEG is that it fills up memory cards and hard drives rapidly... so eventually you will probably want to discontinue shooting that way and simply choose one or the other type of file to save, depending upon the situation.
If you want to work with RAW files and strive to get more accurate exposures using ETTR, it would be a good idea to calibrate your computer monitor. An uncalibrated monitor will probably cause you to incorrectly adjust your images in either case. Most monitors are WAAAAYYY too bright out of the box, causing you to adjust your images too dark. They also rarely render accurate color, which can cause you to mis-adjust those, too. It's also best to use a wide gamut, "graphics quality" monitor that's better able to display the full range of colors, detail and sharpness in the image. (But no monitor I've ever seen has been able to show as much as a fine print on smooth matte paper can, when made with a high quality photo printer.)
If you do much printing, a calibration device will essentially pay for itself over time, in savings of wasted paper and ink or the cost of re-printing if you outsource your prints. Calibration needs to be re-done periodically (I do it every month or two), because as they age computer monitors gradually change brightness and shift color rendition.
Personally I shoot RAW most of the time and post-process my images. If and when for any reason I need immediate access to finished images, I shoot RAW + JPEG.
I used ETTR quite a bit more with my older Canon cameras. Newer models handle high ISO better... less noise. And newer cameras have greater dynamic range. ETTR was a lot more important with my old Canon 30D than it is with the 7D Mark IIs I use now, for example. It also varies a bit by model... the original 7D that I used for some years tended to need a bit more ETTR than the Mark IIs do. But I still use it a little whenever I'm shooting with an auto exposure mode... I usually leave +1/3 Exposure Compensation dialed in, and might increase that to +2/3 in certain circumstances (ultra high ISOs, for example). I can't speak for other brands, but in general I've found the Canon models I've used have more "headroom" in the highlights than people think... it's actually harder to "blow out" highlights than it might seem. Computer monitor dynamic range limitations often "clip" both highlights and shadows... in other words, when I make a print I usually find there's better detail in both ends of the range than appears on the screen... even though I'm using a calibrated, wide gamut, graphics quality monitor.
The more you work with your camera, the images it produces, and your particular computer and software, the better feel you'll have for how much... if any... ETTR you might need.
NOTE: Any Exposure Compensation that needs to be done for unusually bright or dark scenes and subjects is in addition to any ETTR you might find yourself regularly using. For example, let's say you normally use +1/3 ETTR and have that dialed in... but now you're shooting a snow scene that's about 1 stop brighter than average and will make the camera want to under-expose. You'd need to dial in an additional +1 stop... a total of +1 and 1/3 stops of E.C. In a darker than average situation, the opposite would be true.
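Since compensation values simply add in stops, and each stop doubles the light, the snow-scene example above can be worked out numerically. A toy sketch (plain Python, illustrative values only):

```python
from fractions import Fraction

def total_ec(ettr_bias, scene_comp):
    """Exposure Compensation values add in stops."""
    return ettr_bias + scene_comp

def light_factor(stops):
    """Each stop doubles the light, so the multiplier is 2**stops."""
    return 2 ** float(stops)

# Standing +1/3 ETTR bias plus +1 stop for a bright snow scene:
ec = total_ec(Fraction(1, 3), Fraction(1, 1))
print(ec)                          # 4/3 stops total
print(round(light_factor(ec), 2)) # ~2.52x as much light as metered
```

The same addition works with negative values for darker-than-average scenes, where you would subtract compensation instead.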
The metering method you use can also make a difference. I use Canon's Evaluative metering most of the time and, because that's a broad type of metering, find that a little Exposure Compensation is usually all that's needed. But sometimes I use Spot Metering, and that's a lot more sensitive to subject tonality that's brighter or darker than average, because it's not averaging a large area of mixed tonalities. As a result, I find I have to be more careful with Exposure Compensation when I'm using Spot Metering.