gvarner wrote:
If you can view the EXIF data on your photo, divide each pixel dimension by the resolution to get the physical size of the photo in inches. E.g. 3600x6000 divided by 300ppi equates to 12"x20". You could theoretically print up to that size without distortion. A 50% crop would bring you down to a 6x10. At least this is how I look at it but I've been known to be wrong. Just ask my wife. 😜😜
There is a myth known as the "300 dpi requirement." The practical minimum resolution for photo prints is usually defined as 240 PPI on an 8x10 inch print: 240 ORIGINAL, UN-SCALED PIXELS FROM THE CAMERA delivered to the print driver or raster image processor for each linear inch of paper. Pixels beyond that are redundant for the print itself, though the surplus is often kept deliberately as editorial headroom. Editors have traditionally rounded up to 300 (PPI, really!) to leave themselves room to enlarge or crop slightly.
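The arithmetic above is easy to sketch. Here's a minimal back-of-the-envelope version in Python (the function names `max_print_size` and `pixels_needed` are just illustrative, not anyone's official API):

```python
def max_print_size(px_w, px_h, ppi=240):
    """Largest photo-quality print, in inches, at the given pixels-per-inch."""
    return px_w / ppi, px_h / ppi

def pixels_needed(inches_w, inches_h, ppi=240):
    """Original, un-scaled pixels required for a given print size."""
    return round(inches_w * ppi), round(inches_h * ppi)

# A 3600x6000 pixel file at the 240 PPI minimum:
print(max_print_size(3600, 6000))   # (15.0, 25.0) -> a 15x25 inch print
# Pixels needed for a photo-quality 8x10 at 240 PPI:
print(pixels_needed(8, 10))         # (1920, 2400)
```

Raising the target to 300 PPI shrinks the maximum size (3600/300 = 12 inches instead of 15), which is exactly the cropping/enlarging headroom editors keep.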
The "300 dpi requirement" came from the graphic arts community, way back in the days when scanners were used to create digital files for printing. Scanners measure resolution in "dots" (samples) per inch. PRINTER resolution is measured in "dots" per inch as well. But FILE resolution is a completely different thing: it is measured in pixels alone, and a pixel is just a number with no physical size. When you express resolution as PPI, you are saying, "cram 240 pixels into each inch of output, and represent each one with however many dots the printer uses to reproduce it." Your 240 PPI image may be rendered with 600 laser spots per inch, or 2880x1440 ink dots per inch, or some other dot count, depending on the printer in use. The printer driver software or raster image processor makes that conversion.
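To make the pixels-versus-dots distinction concrete, here is a tiny sketch using the example numbers from the paragraph above (the printer figures are illustrative, not a spec):

```python
def dots_per_pixel(printer_dpi_x, printer_dpi_y, image_ppi):
    """How many printer dot positions are available per image pixel."""
    return printer_dpi_x / image_ppi, printer_dpi_y / image_ppi

# A 240 PPI image sent to a 2880x1440 dpi inkjet:
print(dots_per_pixel(2880, 1440, 240))  # (12.0, 6.0)
```

Each image pixel gets a 12x6 grid of ink dot positions, which is how the printer dithers continuous tones; this is why printer "dpi" numbers are so much larger than file PPI and why the two must never be confused.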
Image FILES carry an EXIF metadata table. It contains a "resolution header" that is measured in dpi. That convention comes from scanning, where one dot sampled by the scanner yields one pixel in the output file. A header value of "300dpi" refers to the *scanner* resolution, not the *file* resolution! The file's pixel dimensions are the scan resolution times the physical size: if the scan was a 10x8 inch print at 300 dpi, the file will contain 3000x2400 pixels. THAT pixel count is always what you reference when attempting to size an image. It represents a range of potential.
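The scan-to-pixels relationship, and the fact that the header is only a label while the pixels are the real asset, can be sketched like this (hypothetical helper name, plain arithmetic):

```python
def scan_pixel_dimensions(width_in, height_in, scanner_dpi):
    """Pixels produced by a scan: one dot sampled per inch -> one pixel."""
    return round(width_in * scanner_dpi), round(height_in * scanner_dpi)

# A 10x8 inch print scanned at 300 dpi:
print(scan_pixel_dimensions(10, 8, 300))  # (3000, 2400)

# Re-tagging the SAME 3000x2400 pixels with a different PPI value only
# changes the implied print size; the pixel data itself is untouched:
for ppi in (240, 300, 600):
    print(ppi, (3000 / ppi, 2400 / ppi))
# 240 -> 12.5x10.0 in,  300 -> 10.0x8.0 in,  600 -> 5.0x4.0 in
```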
One little-known fact is that smaller prints DO benefit from higher PPI values, while larger prints actually need fewer. A 4x5 print may benefit from a higher resolution than an 8x10 (though more like 360 PPI than 480 PPI). And enlarging beyond 8x10 requires fewer and fewer PPI as the print gets bigger and the natural viewing distance increases.
Think about it. A 1920x1080 pixel, 55" HDTV looks nice and sharp from six to nine feet away in your living room. But that same 1920x1080 image (one frame of HD video) makes only an 8" x 4.5" photo-quality print at 240 PPI! Yet if you resize that 1920x1080 image to fill a 55" HDTV screen, print it, and view it from the same 6' to 9' distance, it looks fine! Where this whole scheme falls apart is when some "pixel peeper" (usually a camera club judge) walks up to the huge print to inspect some tiny detail. Then he sees the pixels, just as he would when viewing an HDTV screen from one foot away!
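The viewing-distance argument can be quantified with the common 20/20 visual-acuity rule of thumb (an eye resolves roughly one arcminute of detail, giving ppi ≈ 3438 / viewing distance in inches). This rule is a standard approximation I'm adding here, not something from the post itself:

```python
import math

def adequate_ppi(viewing_distance_in):
    """PPI beyond which a ~1 arcminute (20/20) eye can no longer
    resolve individual pixels, per the rule of thumb ppi ~ 3438/distance."""
    return 1 / (viewing_distance_in * math.tan(math.radians(1 / 60)))

def screen_ppi(diagonal_in, px_w, px_h):
    """Pixel density of a display from its diagonal and pixel counts."""
    width_in = diagonal_in * px_w / math.hypot(px_w, px_h)
    return px_w / width_in

print(round(screen_ppi(55, 1920, 1080)))  # ~40 PPI for a 55" HDTV
print(round(adequate_ppi(72)))            # ~48 PPI suffices at 6 feet
print(round(adequate_ppi(12)))            # ~286 PPI needed at 1 foot
```

A 55" HDTV delivers about 40 PPI, right around the ~48 PPI a viewer at six feet can resolve, which is why it looks sharp from the couch; the pixel peeper at one foot needs roughly 286 PPI and sees the pixels instead.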
It's always nice to have more pixels, up to the point where the sensor's photosites become so small they can't capture enough photons to avoid noise. More pixels leave more potential to crop images, or to enlarge them, or both.