CaptainC wrote:
So sorry, but I AM going to jump on the PPI/DPI issue. ANYTIME you are referring to a digital file, the ONLY correct term is PPI.
DPI is for how many DOTS per inch a printer lays down to print pixels. A file can be set at 180, 240, 300, or 360 PPI and a 1440 DPI printer will still lay down 1440 dots in every inch.
If you size an image for projection at 1024x768 (which is probably correct for a LOT of projectors) you could set the PPI at 78, 312, or 264 and it would look exactly the same on a screen. Because the projector only cares about the X by Y dimension. In fact, ALL digital devices look only at the x by y - projectors, laptops, iPads, etc.
Only for printing does the PPI setting affect image quality.
I JUST LOVE correcting the PPI/DPI thing every time I see it. That and the it-will-never-die 72PPI resolution myth.
CaptainC,
I am going to have to disagree with your statements on DPI vs PPI.
All imagery sent through a computer device and displayed on a monitor is "printed" to the screen, and it has been that way for nearly 75 years. Radar screens printed enemy aircraft and warships on glass cathode-ray tubes, and our antique Microsoft keyboards still have the "Print Screen" key.
PPI stands for pixels per inch; "pixel" itself comes from "picture element", and on a camera sensor each pixel is the photon-sensitive element that converts light into electrical energy.
DPI as you stated is the number of dots per inch a paper printer can imprint ink onto a page.
If you take a good standard photo image of, say, 4 x 4 inches, save it at zero (or near-zero) compression as a .jpg file at 300 DPI, and print it, you should get a reasonably good printed copy at 4 x 4 inches.
If you resample that same original image down to 72 DPI, save it as a .jpg, and print it at the same size, you will still see a nice 4 x 4 inch picture, but the print quality will be rather poor: you now have only 288 x 288 pixels to spread across those four inches instead of 1200 x 1200.
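The arithmetic behind that comparison can be sketched like this (a minimal illustration, assuming the print size is held fixed at 4 x 4 inches while only the pixel density changes; the helper name is mine, not from the post):

```python
def pixels_for_print(inches: float, ppi: int) -> int:
    """Pixels along one edge of a print of the given size and density."""
    return round(inches * ppi)

# Compare the pixel budget of a 4 x 4 inch print at two densities.
for ppi in (300, 72):
    edge = pixels_for_print(4, ppi)
    print(f"{ppi} ppi -> {edge} x {edge} = {edge * edge:,} pixels")
# 300 ppi -> 1200 x 1200 = 1,440,000 pixels
# 72 ppi  ->  288 x  288 =    82,944 pixels
```

Roughly seventeen times fewer pixels at 72 ppi, which is why the resampled print looks so much worse at the same physical size.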
When we decrease the file resolution we remove a lot of the detail contained within that file's digital data. In the early 1990s most of us were on dial-up internet connections running at around 24,000 Bd (roughly 24 kbit/s), so attaching an image to an e-mail or web page meant it could take several minutes to download and render on the computer screen.
Today many of us have fiber-optic internet connections and the same file downloads in the blink of an eye. However, with the rise of "cloud computing", much of that ultra-high bandwidth is being eaten up by the cloud. In that regard it is still important to keep internet files as small as possible, which means reducing image resolution to the minimum acceptable level for modern computer monitors.
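A quick back-of-the-envelope check of that dial-up pain (treating the quoted 24,000 Bd as roughly 24 kbit/s, ignoring protocol overhead, and using a 500 KB JPEG as an illustrative file size, not a figure from the post):

```python
def download_seconds(file_bytes: int, bits_per_second: float) -> float:
    """Naive transfer time: total bits divided by line rate."""
    return file_bytes * 8 / bits_per_second

photo = 500_000  # a 500 KB JPEG, chosen for illustration

print(f"dial-up (24 kbit/s):  {download_seconds(photo, 24_000):.0f} s")
print(f"fiber  (100 Mbit/s):  {download_seconds(photo, 100_000_000):.2f} s")
# dial-up (24 kbit/s):  167 s
# fiber  (100 Mbit/s):  0.04 s
```

Nearly three minutes for one modest photo on dial-up, versus an imperceptible fraction of a second on a modern connection.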
It does not matter that the monitors on our desktops can reproduce ultra-high-resolution photographic images. What matters is how we get those images to the monitor.
Have a very good week, :-)
Michael g