Apaflo wrote:
A camera's sampling, aliasing, and AA filter have nothing at all to do with the wavelengths of light. The above is a grossly incorrect understanding of what happens in a digital camera.
The sampling rate of significance is the pixel pitch of the sensor. A 24 MP APS-C sensor, such as the Nikon D7200's, has a higher spatial sampling rate, and makes the AA filter more critical, than the sensor of a full-frame camera such as the D800.
Here we go again. What part of LOW PASS FILTER or sampling RATE (neither of which has anything to do with pixel density) did you not understand? The pixel spacing of a Nikon D7200 is 3.89 µm (3890 nm), while the wavelength of violet light (the highest-frequency, shortest-wavelength light the eye can see) is ~380 nm. Now explain (if anyone is interested) how the pixel spacing can have anything to do with the spatial sampling of a wave whose length is roughly 1/10th the distance between pixels.
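For what it's worth, the figures quoted above check out; here is a quick sketch that reproduces the arithmetic. The Nyquist-period line at the end is my own addition (the standard two-samples-per-cycle rule), not something claimed in the post:

```python
# Numbers from the post: Nikon D7200 pixel pitch vs. violet-light wavelength.
pixel_pitch_nm = 3890        # 3.89 um pixel spacing (Nikon D7200)
violet_wavelength_nm = 380   # shortest wavelength the eye can see

ratio = pixel_pitch_nm / violet_wavelength_nm
print(f"pixel pitch / wavelength = {ratio:.2f}")   # roughly 10x, as stated

# My addition: the spatial Nyquist limit set by the pixel grid. One full
# cycle needs at least two samples, so the finest resolvable detail has a
# period of two pixel pitches.
nyquist_period_nm = 2 * pixel_pitch_nm
print(f"finest resolvable spatial period = {nyquist_period_nm} nm")
```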
After posting the below, I am unwatching, as you have again taken a simple question from the OP off onto yet another divergent argumentative path - carry on and argue with yourself...
From Wikipedia:
An anti-aliasing filter (AAF) is a filter used before a signal sampler to restrict the bandwidth of a signal to approximately or completely satisfy the sampling theorem over the band of interest. Since the theorem states that unambiguous reconstruction of the signal from its samples is possible when the power of frequencies above the Nyquist frequency is zero, a real anti-aliasing filter trades off between bandwidth and aliasing. A realizable anti-aliasing filter will typically either permit some aliasing to occur or else attenuate some in-band frequencies close to the Nyquist limit. For this reason, many practical systems sample higher than required to ensure that all frequencies of interest can be reconstructed, a practice called oversampling.
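The folding behavior the article describes is easy to demonstrate numerically. This is a minimal sketch of my own (not from the article): a sinusoid above the Nyquist frequency produces exactly the same samples as a lower-frequency alias, which is why the bandwidth must be restricted before sampling:

```python
import math

fs = 100.0    # sample rate (Hz); Nyquist frequency is fs/2 = 50 Hz
f_in = 70.0   # input frequency above Nyquist

# Sampling folds f_in down to |f_in - fs| = 30 Hz: the samples of a
# 70 Hz cosine taken at 100 Hz are indistinguishable from those of a
# 30 Hz cosine, so the original cannot be reconstructed.
alias = abs(f_in - fs)

for n in range(5):
    t = n / fs
    assert math.isclose(math.cos(2 * math.pi * f_in * t),
                        math.cos(2 * math.pi * alias * t), abs_tol=1e-9)
print(f"{f_in} Hz sampled at {fs} Hz aliases to {alias} Hz")
```

This is also why oversampling helps: raising fs pushes the Nyquist frequency up, so a realizable filter has more room to roll off before aliasing sets in.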
Optical Applications
In the case of optical image sampling, as by image sensors in digital cameras, the anti-aliasing filter is also known as an optical low-pass filter (OLPF), blur filter, or AA filter. The mathematics of sampling in two spatial dimensions is similar to the mathematics of time-domain sampling, but the filter implementation technologies are different. The typical implementation in digital cameras is two layers of birefringent material such as lithium niobate, which spreads each optical point into a cluster of four points.[1]
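As a toy model (my sketch, not from the article): spreading each optical point into a 2x2 cluster of points acts approximately like convolving the image with a 2x2 averaging kernel before sampling, which suppresses detail at the pixel-level Nyquist limit:

```python
# Toy OLPF model (an assumption for illustration): the two birefringent
# layers split each point into a 2x2 cluster, approximated here as a
# 2x2 box blur with equal 0.25 weights. Edges are handled by clamping
# (replicating) the border pixels.
def olpf_2x2(image):
    """Blur a 2D list-of-lists with a 2x2 averaging kernel."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = 0.0
            for dy in (0, 1):
                for dx in (0, 1):
                    s += image[min(y + dy, h - 1)][min(x + dx, w - 1)]
            out[y][x] = s / 4.0
    return out

# One-pixel-wide vertical stripes, i.e. detail exactly at the Nyquist
# limit, are flattened to uniform gray (away from the clamped edge):
stripes = [[float(x % 2) for x in range(4)] for _ in range(4)]
print(olpf_2x2(stripes))
```

A real OLPF's point-spread and the sensor's microlens fill factor differ from this idealized box kernel, but the qualitative effect, attenuating the highest spatial frequencies before the pixel grid samples them, is the same.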
The choice of spot separation for such a filter involves a tradeoff among sharpness, aliasing, and fill factor (the ratio of the active refracting area of a microlens array to the total contiguous area occupied by the array). In a monochrome or three-CCD or Foveon X3 camera, the microlens array alone, if near 100% effective, can provide a significant anti-aliasing effect,[2] while in color filter array (CFA, e.g. Bayer filter) cameras, an additional filter is generally needed to reduce aliasing to an acceptable level.[3][4][5]