paulrph1 wrote:
I am asking for similarity of quality. Even though those films you mentioned are not exactly the same, they can be compared somewhere, somehow. Kodachrome 64 equals Ektachrome 64, at least in some regards, whereas Kodachrome 64 does not equal any 400 ISO film as far as grain goes. Maybe it is too complex of an issue, but I am not the one who has been bringing up the point that they are not the same in picture quality. All I know is that someone said that taking a picture with a cell phone will not be as good as taking a picture with a full-frame sensor. Maybe it is like taking a picture with a 110 or a Brownie as compared to a 35mm.
Beyond the fact that quality is a hugely subjective experience, the reality is that digital imaging systems are really rather complex - to some extent even more so than analog (film) - and so there are many variables to consider when making a comparison:
1) Sensor size: this is the easiest to compare - as with film, a larger format (i.e., sensor size) generally equates to higher quality, or at the very least a shallower DOF (depth of field), so that backgrounds can have pleasing "bokeh" to give the overall picture a potentially nicer look. This is the most likely source of your acquaintance's statement that a smartphone photo "will not be as good" as a full-frame (FX) sensor image. But of course, since all art is subjective and quality depends on what the final image is supposed to look like, even this isn't necessarily true. For birds-in-flight, it likely will be - but for street shooting? Your call.
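To put some rough numbers on the phone-vs-full-frame comparison, here's a quick sketch of the "crop factor" arithmetic. The 5.6x4.2 mm phone sensor and 4.2 mm f/1.8 lens below are assumed, typical figures, not the specs of any particular phone:

```python
import math

def crop_factor(width_mm, height_mm, ff=(36.0, 24.0)):
    """Ratio of the full-frame diagonal to this sensor's diagonal."""
    return math.hypot(*ff) / math.hypot(width_mm, height_mm)

# Full frame (36x24 mm) vs an assumed ~5.6x4.2 mm phone sensor:
cf = crop_factor(5.6, 4.2)
print(f"phone crop factor ~{cf:.1f}x")  # ~6.2x
# That phone's 4.2 mm f/1.8 lens frames the scene like a ~26 mm lens on
# full frame, but its depth of field behaves more like f/11 there --
# which is why phone backgrounds stay sharp instead of melting into bokeh.
print(f"equivalent focal length ~{4.2 * cf:.0f} mm, DOF like f/{1.8 * cf:.1f}")
```

Same framing, very different background blur - that's most of what "full frame looks better" usually means for portraits.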
2) Photo-site size - think of a digital imaging sensor chip as a matrix of tiny light sensors (which is essentially what it is), with thousands of columns and rows making for millions of individual so-called "photo-sites", each on the order of microns across. As a general rule, larger photo-sites equate to better low-light performance. This is because much of the "noise" isn't light from the scene at all - heat in the sensor's own electronics frees stray electrons that get counted right along with the real signal, since the only thing a photo-site does is count how many photons have fallen on it. That is, photo-sites are "color blind". However, while larger photo-sites are generally better, simple division doesn't settle it: a larger chip with the same number of sites as a smaller chip won't automatically have proportionally larger photo-sites, because there is necessary spacing between the sites themselves, and advanced fabrication techniques can shrink those gaps to allow for larger individual photo-sites. The point is, it ain't a simple thing to know.
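The "simple division" part is easy to see with back-of-the-envelope math. This sketch ignores the inter-site gaps and micro-lens details mentioned above, so treat the results as upper-bound approximations:

```python
import math

def pitch_um(sensor_w_mm, sensor_h_mm, megapixels):
    """Approximate photo-site pitch in microns, ignoring gaps between sites."""
    pixels = megapixels * 1e6
    area_um2 = (sensor_w_mm * 1000) * (sensor_h_mm * 1000)
    return math.sqrt(area_um2 / pixels)

# Same 24 MP count, two different chip sizes:
print(f"24 MP full frame (36x24 mm):  ~{pitch_um(36, 24, 24):.1f} um pitch")
print(f"24 MP APS-C (23.5x15.6 mm):   ~{pitch_um(23.5, 15.6, 24):.1f} um pitch")
```

Same megapixel count, but the full-frame sites work out to roughly half again the pitch - which is why the two cameras can behave quite differently at high ISO even though the spec sheets both say "24 MP".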
3) To point #2, the way most color digital camera sensors work is by overlaying filters of the three primary colors such that each photo-site is covered by either a Red, Green or Blue filter and so only reads that color. The vast majority of imaging chips are covered in a repeating 2x2 array where 1 R, 1 B and 2 G make up the quad of filters (the Bayer pattern). Fuji's X-Trans sensors do not use the Bayer pattern but a proprietary 6x6 pattern of filters (still green-dominant, like Bayer). The only other commercially available digital cameras that do not use the Bayer pattern of colored filters are the Leica M Monochrom, which has no color filters at all (since it is designed only for B&W shooting), and the Sigma DSLRs that use the Foveon chip, which is designed more like color film, with one color layer on top of another. One can debate forever whether this is a better approach - here's an interesting article on it:
http://www.imaging-resource.com/news/2014/04/08/sigma-qa-part-ii-does-foveons-quattro-sensor-really-outresolve-conventional
4) As @SharpShooter pointed out, though, the final print quality is where the rubber meets the road, and the next step in achieving a digital image is to interpret the data coming off the chip (remember, it's a gazillion points of data, each representing a reading of only R, G or B light at that point) - the so-called RAW file. Different software can interpret the same RAW file slightly differently; there are those who say the camera manufacturer's own software (generally included with the camera) does a better job than Adobe's or Apple's or other offerings. When you set the camera to produce JPEGs, all you are doing is telling the computer built into the camera to do that interpretation itself, as programmed by the manufacturer - but the principle is the same.
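To make points #3 and #4 concrete, here's a toy sketch of what a RAW converter starts from. Each photo-site holds a single number, and its color meaning comes purely from its position under the filter array. I'm assuming an RGGB layout (red in the top-left corner) just for illustration - real cameras vary, and real demosaicing is far more sophisticated than the crude averaging below:

```python
import numpy as np

# Stand-in for raw sensor readings: one number per photo-site.
raw = np.arange(16, dtype=float).reshape(4, 4)

# Under an assumed RGGB Bayer layout, color is determined by position:
r  = raw[0::2, 0::2]   # red-filtered sites
g1 = raw[0::2, 1::2]   # green sites on the red rows
g2 = raw[1::2, 0::2]   # green sites on the blue rows
b  = raw[1::2, 1::2]   # blue-filtered sites

# Crude "demosaic": average the two green planes and stack the channels.
# A real converter interpolates the two missing colors at EVERY site,
# which is exactly where different software produces different results.
g = (g1 + g2) / 2.0
rgb = np.stack([r, g, b], axis=-1)   # half-resolution color image
print(rgb.shape)  # (2, 2, 3)
```

The interesting takeaway: two-thirds of the color information in an ordinary Bayer image is interpolated, not measured - so the interpolation algorithm genuinely matters.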
5) And of course, once all THAT is dealt with, there's the question of post-processing and the actual device used to make the print. The same image printed as ink-jet on paper, dye-sub, canvas, glass, aluminum or whatever else might look much better or worse to your eye - again, depending on what the subject is.
So, in the analog world of film - where molecular grains of silver halide capture the latent image to be brought out in chemical baths - the "resolution" of the film was generally never discussed, only its "grain" characteristics: higher-speed film has more grain, analogous to the way a higher ISO setting on a digital camera can lead to more noise.
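That grain/noise analogy can be made a little more quantitative. One component of digital noise - photon "shot" noise - follows Poisson statistics, so a photo-site's signal-to-noise ratio scales with the square root of the photons it collects; raising the ISO means working from fewer captured photons. A minimal sketch (shot noise only; it deliberately ignores read noise and the thermal/dark-current noise mentioned in point #2):

```python
import math

# For Poisson photon arrivals, noise = sqrt(N), so SNR = N / sqrt(N) = sqrt(N).
# Fewer photons captured (i.e. higher ISO) -> lower SNR -> visibly noisier.
for photons in (10000, 1000, 100):
    print(f"{photons:>6} photons captured -> SNR ~ {math.sqrt(photons):.0f}:1")
```

A 100x drop in light only costs a 10x drop in SNR - but that 10x is exactly the difference between a clean shadow and a speckled one.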
So there - doesn't that seem easy?