Mongo wrote:
A low-resolution display will mask the performance of the optics. You still get the same image with the same optics.
I think I posted it in a different thread, but as an example, some camera designers will aggregate pixels to get higher sensitivity, and the aggregation is usually a 2x2 grouping of adjacent pixels.
Me? I want all the gamut I can get! Low noise, high dynamic range, and lots of pixels packed really really tight together. Anything else is a compromise.
Mongo … first off - thanks for transferring here, so we can discuss this on MY OWN Topic Post, rather than another's … which was being seen as a "hi-jacking" - whatever the hell THAT means … anyway, I was most interested in what you had to say, there - but to dwell on it all was just to belabor the "hi-jacking" ….
Second - here's what you wrote over there - so you can refer back to it, if you wish ….
"Chris, the ringing artifacts are effectively a point spread function, similar to an MTF. The level of ringing is a quantum effect and creates a "blur." The size of that blur is more recognizable in high density sensors (such as on some dSLRs).
The point spread function (PSF) of an optical system (normally characterized on axis) is used as a quality metric of that system, as it describes the input/output relationship of the optical system. Functionally, it is measured in the spatial domain; MTF charts and similar plots are used to characterize system performance. Changing the aperture of a camera lens changes the PSF, and it is normally at its best at the lens' "sweet spot."
If one has a low-resolution sensor, the optical system's impulse response, which can be described by the PSF, matters less, because the sensor has less spatial resolution. In some cameras, adjacent pixels are aggregated to create a larger effective pixel size. This tends to make the "blur" less apparent when looking at the aggregated pixels. However, it hasn't gone away; it is still there, the sensor simply will not let you resolve it.
Everything is interrelated. There is a trade-off between resolution and dynamic range. So a camera with lesser spatial resolution but higher dynamic range can be perceived to perform as well as a camera with higher spatial resolution and lesser dynamic range. (A certain camera manufacturer lost market share over that issue, because most people buying cameras looked at the spatial resolution, not the color gamut. A story for a different day.)
With a suitable density (spatial resolution) imaging system, the ringing from diffraction effects can be seen near high contrast edges. I used the twigs in the trees when I looked at the OP image.
So to go back to the OP's question, there may indeed be diffraction effects which are limiting the resolving capability of the camera, but the area inside the red circle is something else. To me it looks like some kind of flare. It could be caused by many different things, but I have seen flare like that from filters, such as a UV filter, which tend to have less optimized anti-reflective coatings than the lens elements. Perhaps a lens system designer could provide suggestions as to what source might be more probable."
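The pixel aggregation Mongo describes above can be sketched numerically. Here is a minimal Python sketch, assuming simple 2x2 averaging of a uniform gray patch; the signal level and noise figure are purely illustrative, not tied to any real camera:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a uniform gray patch with additive read noise on a dense sensor.
# Signal and noise values are illustrative only.
signal = 100.0
read_noise = 5.0
frame = signal + rng.normal(0.0, read_noise, size=(1024, 1024))

# Aggregate ("bin") each 2x2 neighborhood into one larger pixel by averaging.
# Spatial resolution halves in each direction; noise drops by ~sqrt(4) = 2x.
binned = frame.reshape(512, 2, 512, 2).mean(axis=(1, 3))

print(frame.std())   # ~5.0 per-pixel noise before aggregation
print(binned.std())  # ~2.5 after 2x2 aggregation
```

The point the sketch makes is the same one in the quote: aggregation buys sensitivity (lower noise), but the underlying optical blur is unchanged, you have simply lost the spatial resolution needed to see it.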
Now, then … first - which camera-maker lost market share over this issue?
Second - could you please elaborate on what you mean by a) spatial resolution and b) color gamut … and how that relates to the Topic here - which is the relationship between diffraction and high-MP, high-density DSLRs … and whether it happens sooner or later … meaning - is there greater leeway with LESSER RES cameras, as you have implied with your earlier comments? I understand you want as much RES as you can get … so, you would be an advocate of 46MP or 50MP designs - right? … But, is there some logic to the idea that a lesser-res camera like the 12MP D700 may actually be at an advantage in this regard?
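A back-of-envelope sketch can give a rough sense of the "sooner or later" question. The diffraction (Airy) disk diameter is about 2.44 x wavelength x f-number; comparing it with the pixel pitch shows when the blur spans more than one pixel. The pitches below (~8.4 um for a 12 MP full-frame sensor like the D700, ~4.3 um for a ~46 MP one) are approximations for illustration, not manufacturer specs:

```python
# Compare the Airy disk diameter against approximate full-frame pixel pitches.
# Pitches are rough illustrative values: ~8.4 um (12 MP), ~4.3 um (46 MP).
wavelength = 550e-9  # green light, meters

for f_number in (5.6, 8, 11, 16):
    airy_um = 2.44 * wavelength * f_number * 1e6  # Airy disk diameter, microns
    print(f"f/{f_number}: Airy disk ~{airy_um:.1f} um "
          f"(12 MP pitch ~8.4 um, 46 MP pitch ~4.3 um)")
```

On these assumed numbers, the Airy disk already exceeds the ~4.3 um pitch around f/5.6 (~7.5 um) but does not exceed the ~8.4 um pitch until past f/8 (~10.7 um), which is consistent with the idea that a denser sensor makes diffraction blur visible at wider apertures; the optical blur itself is the same either way.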