cyclespeed wrote:
Good day,
At our last club meeting we were discussing uploading photos to our website. When resizing, we must not exceed 800 pixels on the longest side of the image in order to have it displayed properly. However, when it came to the resolution setting, we received different advice from two members who have some IT experience.
One said 150 d.p.i. was good and allowed for a quality display, i.e. little if any pixelation on screens.
The other claimed 72 d.p.i. worked best, since screens' maximum display resolution is 72.
They both agreed that fine prints should be at 300 d.p.i.
Can you help clarify this issue for us, please? Thank you
The only thing that really EVER matters is the size of the image in pixels, especially when you are talking about posting images on web pages.
Web designers base their standards — and the specifications they ask you to follow — on the average monitor in use at the time they are designing the web page. It is very likely in this case that the designer wants you to limit the long side to 800 pixels so it will fit on a 1920x1080 HDTV monitor, with room to spare for other design elements. Depending upon the operating system and browser, the image will appear larger or smaller on higher resolution monitors. Some monitors SCALE smaller pages to make them larger on "retina" (4K, 5K, high resolution) displays.
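To make the "800 pixels on the longest side" rule concrete, here is a minimal sketch of the arithmetic a resize does. The function name `fit_longest_side` is just a hypothetical helper, not any particular tool's API:

```python
def fit_longest_side(width, height, max_side=800):
    """Compute new pixel dimensions so the longest side is at most
    max_side, preserving the aspect ratio. Never upscales."""
    longest = max(width, height)
    if longest <= max_side:
        return width, height  # already small enough; leave it alone
    scale = max_side / longest
    return round(width * scale), round(height * scale)

# A 6000x4000 camera file becomes 800x533 for the web page:
print(fit_longest_side(6000, 4000))  # (800, 533)
```

Note that no "dpi" value appears anywhere in that calculation; only the pixel counts matter.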
The whole PPI vs dpi thing is an historical nightmare, and a subject of confusion between two industry groups: The Graphic Arts community and the Photo Lab community. Since I worked in both for decades, I'll try to straddle the fence and explain.
Printers print DOTS. The term originally came from halftones, which were made with lined screens that broke an image up into fine dots of different sizes, depending on brightness and the number of lines per inch on the screen.
The term 'dots' had nothing to do with pixels. It referred to real, physical ink dots on paper, not digital representations of image brightness values. The term 'dots' persisted when scanners were developed. Scanner manufacturers STILL insist on saying their machines scan at "6400 dpi" or "600 dpi" when they are stuffing those samples of an image into FILES that contain DATA representing PIXELS. I guess you COULD say that if there were a grid of 600 square dots per inch on a page, the scanner would turn them into 600 pixels per scanned inch in a file.
The arbitrary "150 dpi" figure is an OLD inkjet printer rule of thumb. The relatively low-resolution office inkjet and laser printers your IT friend is thinking of need at least 150 PIXELS (not dots!) spread over each linear inch of output. The printer converts those pixels to a LOT more dots in its driver software. For example, my desktop Epson deposits up to 5760x2880 dots per inch on paper, but it only needs 360 pixels per linear inch of input to maximize the print resolution. And usually, 180 PPI images print fine at the 1440x720 driver setting.
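The pixels-versus-dots relationship above reduces to simple multiplication: pixels needed = print inches x input PPI (the driver handles the dots). A small sketch, using the 360 and 180 PPI figures from my Epson example; `pixels_needed` is just an illustrative name:

```python
def pixels_needed(width_in, height_in, ppi):
    """Pixel dimensions required to feed a printer at a given input PPI.
    The printer driver turns these pixels into far more ink dots."""
    return round(width_in * ppi), round(height_in * ppi)

# An 8x10 print at the driver's 360 PPI sweet spot:
print(pixels_needed(8, 10, 360))   # (2880, 3600)
# The same print at a perfectly acceptable 180 PPI:
print(pixels_needed(8, 10, 180))   # (1440, 1800)
```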
Any IT person who still thinks monitors are 72 dpi hasn't bought a monitor or a monitor graphics card in 15 years or so. Monitors come in many different shapes, sizes, and native resolutions. They can display anything from 72 dpi to over 300 dpi, depending on make, model, size, and the graphics card or chip in the computer, tablet, or phone. Most monitors have a native resolution, where one input pixel from the graphics card equals one screen pixel, but they can also emulate a range of other resolutions.
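You can see why "monitors are 72 dpi" is obsolete by computing a display's actual pixel density from its native resolution and diagonal size. A quick sketch (the two monitor sizes below are just common examples, not from the original post):

```python
import math

def monitor_ppi(h_pixels, v_pixels, diagonal_inches):
    """Physical pixels per inch of a display, from its native
    resolution and its diagonal measurement."""
    diagonal_pixels = math.hypot(h_pixels, v_pixels)
    return diagonal_pixels / diagonal_inches

# A 24-inch 1920x1080 desktop monitor:
print(round(monitor_ppi(1920, 1080, 24)))   # 92 PPI
# A 27-inch 5K display:
print(round(monitor_ppi(5120, 2880, 27)))   # 218 PPI
```

Neither number is anywhere near 72, and the two differ from each other by more than a factor of two.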
It is very important to understand that the human eye can resolve only a limited amount of information. Kodak figured out through extensive testing in the 1980s that the average person cannot see more detail than can be represented by around 240 pixels per linear inch in an 8x10 print viewed at 1x to 1.5x its diagonal dimension (about 12.8 to 19 inches). More input resolution than 240 PPI is wasted! As the normal viewing distance for MOST prints is 1x to 1.5x the diagonal dimension, SMALLER prints need MORE resolution (higher PPI), while LARGER prints need LESS resolution. View an 8x10 made from a 240 PPI image at 13 inches. Then view a 16x20 made from the same exact file at 26 inches. Can you see ANY more or less information? No! It is the same effective resolution. If you made the 16x20 with four times as many original, from-the-camera pixels and viewed it from the same 26 inches, your eye wouldn't see any more information. Try it.
Now, OF COURSE, you can view the individual elements MORE CLOSELY in the 16x20 print made at high resolution, so it's important to consider the subject of the photograph when choosing both input and output resolution... If I have a group of 400 people, I want all the pixels I can possibly gather to represent them.
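The "bigger print, viewed farther away, needs fewer PPI" relationship above can be sketched as a simple inverse proportion, anchored to Kodak's 8x10-at-240-PPI reference point. This is my own back-of-the-envelope restatement of that rule of thumb, not a Kodak formula:

```python
import math

def required_ppi(width_in, height_in, reference_ppi=240, reference_diag=12.8):
    """Rough PPI needed for a print viewed at ~1x its diagonal,
    scaled from the 8x10 / 240 PPI reference (8x10 diagonal is
    sqrt(8**2 + 10**2), roughly 12.8 inches)."""
    diag = math.hypot(width_in, height_in)
    return reference_ppi * reference_diag / diag

# The 8x10 reference itself:
print(round(required_ppi(8, 10)))    # 240
# A 16x20 viewed at its own (doubled) diagonal needs only half:
print(round(required_ppi(16, 20)))   # 120
```

That halving is exactly why the same file looks equally sharp as an 8x10 at 13 inches and as a 16x20 at 26 inches.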
The old "300 dpi" rule of thumb was a graphic arts community standard. It applied to SCANNING. In other words, if you were going to reproduce an image at 4"x5", you had to have 1200x1500 pixels in the scan, so you scanned it at 300 dpi. (Again, scanner drivers use dpi, not PPI, even though they save pixels!) The 300 dpi rule of thumb came from designers who knew that their printing processes ONLY NEEDED about 200 PPI for quality reproduction, because that's all they could resolve!
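The scanning arithmetic above generalizes: the scanner "dpi" setting you need is the target PPI times the enlargement factor (target size divided by original size). A minimal sketch; `scan_dpi` is an illustrative name, and the 35mm frame width is an assumption of mine, not from the post:

```python
def scan_dpi(original_in, target_in, target_ppi=300):
    """Scanner 'dpi' setting needed so an original of a given size
    yields enough pixels to reproduce at the target size and PPI."""
    return target_ppi * target_in / original_in

# Reproducing a 4x5 original at 4x5: scan at 300 dpi -> 1200x1500 px
print(scan_dpi(5, 5))            # 300.0
# Enlarging a 35mm frame (~1.4 in on the long side) to a 10-inch page
# width demands a much higher scan setting:
print(round(scan_dpi(1.4, 10)))  # 2143
```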
What they wanted was a sneaky way to get enough resolution in hand that they could use it if they changed the layout and had to enlarge a photo.
Much of this information comes from the books of Dr. Taz Talley, a speaker I met years ago at a GATF (Graphic Arts Technical Foundation) conference. More of it comes from my attendance at Kodak DP2 Print Production Software training sessions in Rochester in the early 2000s.
My biggest frustration when I see some of these posts is the tendency of myths to be perpetuated, distorted, and used to support other myths. There actually IS some truth about the resolution setting in the header of a file! While it has absolutely nothing to do with image reproduction size on the Internet, it does come into play with certain page layout software. SOME page layout software uses the resolution header value to size an image when it is imported and placed on a page. In other words, if the header says "300 dpi or PPI" and the image is 1200x1500 pixels in size, the layout software will use that resolution header value to scale the image to 4x5 inches when it makes it available to place on the page! Adobe PageMaker users were notorious for REQUIRING that the resolution header be set correctly, because they wanted images submitted for publication "at reproduction size and nominal reproduction resolution." Of course, the page designer knew she could ENLARGE the image by 50%, because her cheesy offset printer could only resolve 200 PPI; the 300 figure had a cushion built in.
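The header-driven placement behavior described above is just pixels divided by the header value. A sketch of the arithmetic such layout software performs on import (`placed_size` is my own illustrative name):

```python
def placed_size(px_w, px_h, header_ppi):
    """How layout software that honors the resolution header sizes an
    image on import: inches = pixels / header PPI. The pixel data is
    untouched; only the header changes the placed dimensions."""
    return px_w / header_ppi, px_h / header_ppi

# 1200x1500 pixels with a 300 'dpi' header places at 4x5 inches:
print(placed_size(1200, 1500, 300))   # (4.0, 5.0)
# The SAME pixels with the header rewritten to 150 place at 8x10:
print(placed_size(1200, 1500, 150))   # (8.0, 10.0)
```

This is why rewriting the header "resolution" of a web-bound image changes nothing on screen: browsers ignore it and use the pixel dimensions directly.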
I hope this helps.