amfoto1 wrote:
As far as image quality loss goes, it really doesn't matter. Assuming you can perform exactly the same crop either way, the resulting image size, resolution, etc. will be identical.
However, it's far easier to perform an accurate crop on your computer, which no doubt has a much larger screen than your camera. Also, on your computer you can work on a copy of the image and save the original as-is, in case you screw up the changes you're making. You probably can't do that in-camera... any changes you make are to the original, and there's no "undo".
Also, if you shoot RAW files (Nikon NEF, I think), those are better to work with than JPEGs... RAW files are edited in 16-bit mode, while JPEGs are 8-bit. In other words, RAW files have a lot more latitude for changes made during post-processing. If the in-camera crop can only be done to a JPEG file, it would usually be better to wait and do the crop during the RAW conversion process.
However, when you press the shutter release, any digital camera initially makes a RAW file... and when you set the camera to "RAW", it saves that entire file. But when you set the camera to "JPEG", it basically does a very fast post-process of the RAW file in-camera... according to the camera's settings for contrast, saturation, sharpening, noise reduction, etc.... and then "throws away" whatever data the camera deems unnecessary, reducing the image from 16-bit to 8-bit, among other things. When you shoot RAW + JPEG, you'll see the difference in size between the two files of the same image. The data that was "thrown away" when making the JPEG might be important when you are making other changes.

Final cropping and down-sizing of an image is usually one of the last steps in post-processing, so that other changes and edits are made with as much of the original data as possible available to work with. When you do an in-camera crop, you are essentially reversing that process and making at least some of the down-sizing one of your first steps... and ending up with an 8-bit image, which leaves less latitude for additional adjustments and edits later.
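To illustrate just the "throwing away" part of that, here's a toy Python sketch. It's not a real camera pipeline (a camera also demosaics, applies tone curves, compresses, etc.)... it only shows what cutting a 16-bit tonal value down to 8 bits does, and why the lost precision can't be recovered afterward:

```python
# Toy illustration only, NOT a real camera pipeline: a camera also
# demosaics, applies tone curves, compresses, etc. This just shows the
# bit-depth reduction step and why it is one-way.

def to_8bit(value_16bit):
    """Keep only the top 8 bits of a 16-bit tonal value (0..65535)."""
    return value_16bit >> 8

def back_to_16bit(value_8bit):
    """Scale an 8-bit value (0..255) back up; the dropped low bits are gone."""
    return value_8bit << 8

original = 51234                       # some 16-bit tonal value
jpeg_like = to_8bit(original)          # 200 -- 256 distinct values collapse to one
restored = back_to_16bit(jpeg_like)    # 51200 -- close, but not 51234
print(original, jpeg_like, restored)
```

Every group of 256 neighboring 16-bit values collapses to the same 8-bit value, which is exactly why heavy edits (big exposure or curve moves) on an 8-bit JPEG show banding sooner than the same edits on the RAW data.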
8-bit images have roughly 17 million colors... which sounds like a lot until you consider that 16-bit images have around 281 trillion to work with. Oh, and your camera actually captures 14-bit... but software interpolates that to 16-bit. (Actually, some Nikons capture 12-bit, or have the option to capture 12-bit, to be able to shoot faster and save space on the memory card... but 12-bit is also interpolated to 16-bit during RAW post-processing.)
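Just to put exact numbers on those figures, a quick back-of-the-envelope calculation (three RGB channels, 2^bits levels per channel):

```python
# Possible colors for a 3-channel (RGB) image at various per-channel bit depths.
for bits in (8, 12, 14, 16):
    colors = (2 ** bits) ** 3
    print(f"{bits}-bit: {colors:,} colors")

# 8-bit:  16,777,216           (~16.8 million)
# 12-bit: 68,719,476,736       (~69 billion)
# 14-bit: 4,398,046,511,104    (~4.4 trillion)
# 16-bit: 281,474,976,710,656  (~281 trillion)
```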
Ultimately, for most purposes (printing, online display, etc.), in the end you'll want an 8-bit image in a file type that virtually anyone can view on a computer without additional software... a JPEG. It's more than enough for most purposes. But in the process of making that JPEG, a lot of the adjustments and work are better done in 16-bit mode, before the image is reduced to the final JPEG. There are some exceptions where the image is kept 16-bit... such as commercial usage where a client specifies a 16-bit TIFF or PSD file because they plan to do additional work on it later. But for the vast majority of uses, an 8-bit JPEG is more than enough.
Finally, think of a RAW file as a "digital negative" and the corresponding JPEG as the final print made from it. I carefully archive my RAW files just as I did negatives back in the days of film. I can always make another finished JPEG/print from them, but the original RAW/negative itself is irreplaceable. If a camera converts the image to JPEG in the process of cropping... if it doesn't save a RAW original... for that reason alone I wouldn't use in-camera cropping.
Thank you so much.