I keep reading about low pass filters (anti-aliasing filters) and am curious about what they are and what they do. I did a search on UHH and found several somewhat helpful answers, as well as reviewing a link posted referencing a Wikipedia article. I am now more confused than ever. Can someone please explain what this is and what it is for, in simple terms? Thanks.
It's something that I've lived without for the last ~1 1/2 years with my EM1. I've never had an issue with it, er, without it.
MikeFromMT wrote:
I keep reading about low pass filters (anti-aliasing filters)... (show quote)
Mike, it's for when things pass really low, it filters them!! :lol:
SS
Something I prefer not to have in my cameras. I like a true image and have no concerns about moiré.
ozdude
Loc: Brisbane Australia
SharpShooter wrote:
Mike, it's for when things pass really low, it filters them!! :lol:
SS
:thumbup: :thumbup: :thumbup: :thumbup:
MikeFromMT wrote:
I keep reading about low pass filters (anti-aliasing filters)... (show quote)
Whenever an analog signal is digitized, some information is lost, because digitized data has limited precision and that precision is (at least conceptually) lower than in the analog signal being digitized. But there is a second kind of loss: digitized information necessarily comes as discrete samples. That sampling loss is present from the very start, because the sensor takes a discrete analog measurement at each pixel site. Film had the same kind of loss, owing to the discrete layout of silver halide grains, but those grains were packed so closely together that the effect was of no concern. At some time in the future, when we have digital cameras with terapixel sensors, we may once again stop having to worry about this issue.
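To make the precision part of that loss concrete, here's a toy sketch (the intensity values are invented for illustration, not from any real camera): two distinct analog values land on the same 8-bit level and become indistinguishable afterwards.

```python
# Toy illustration (invented values): 8-bit quantization maps a
# continuous intensity in [0, 1] to one of 256 discrete levels, so two
# distinct analog values can collapse onto the same digital code.
a, b = 0.50001, 0.50002      # two different "analog" intensities
qa = round(a * 255)          # quantize to an 8-bit level
qb = round(b * 255)
print(qa == qb)              # True: the difference is lost
```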
But today the pixel spacing is still significant, and it causes errors in extremely busy scenes where light intensity changes rapidly in one color or another. Areas where this happens are termed "high frequency": the scene simply changes faster than the sensor can sample, so the variation falls between the pixels. There is the possibility (and Murphy's law would say the certainty) that one block of pixels will happen to pick up only the bright peaks while another block picks up only the dark valleys of the true (analog) scene. The result is what are called moiré patterns.
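Here's a small numpy sketch of that effect (the frequencies are made up for illustration): a fine 9-cycle pattern sampled by only 8 "pixels" per unit produces exactly the same samples as a coarse 1-cycle pattern, so the detail between the pixels masquerades as a false coarse pattern — which is aliasing, the mechanism behind moiré.

```python
import numpy as np

# Illustrative values: a 9-cycle sine sampled at only 8 samples per
# unit is indistinguishable, at the sample points, from a 1-cycle sine.
fs = 8                              # samples per unit (our "pixel pitch")
t = np.arange(0, 1, 1 / fs)         # the sample positions
fast = np.sin(2 * np.pi * 9 * t)    # true high-frequency detail
alias = np.sin(2 * np.pi * 1 * t)   # what the samples actually look like
print(np.allclose(fast, alias))     # True: identical at every sample
```

The sensor sees only the sample values, so it has no way to tell which of the two scenes it was looking at.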
As noted, we could address this with more pixels (though another scene might have even higher-frequency detail), but that would mean buying a new camera once technology gets that far advanced. A more practical solution is to place a low-pass filter in front of the sensor to blur out the high-frequency changes in the scene before they reach the pixels. Human vision is not very good at detecting these high-frequency changes - no doubt because our eyes effectively have their own low-pass filtering to accommodate the spacing of the rods and cones.
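A rough numpy sketch of why blurring first helps (all numbers invented for illustration): averaging neighbouring samples is a crude low-pass filter, and applying it before throwing samples away suppresses the high-frequency component so it can no longer fold back as a strong false pattern.

```python
import numpy as np

# Illustrative values: detail near the sampling limit survives naive
# decimation as a strong alias, but is largely removed if we box-blur
# (a crude low-pass filter) before decimating.
n = 64
x = np.arange(n)
scene = np.sin(2 * np.pi * 30 * x / n)   # fine detail in the "scene"

naive = scene[::8]                       # keep every 8th sample, no filter
blurred = np.convolve(scene, np.ones(8) / 8, mode="same")  # box blur
filtered = blurred[::8]                  # blur first, then decimate

print(np.abs(naive).max() > 0.5)         # True: alias survives at full strength
print(np.abs(filtered).max() < 0.2)      # True: the blur suppressed it
```

A camera's anti-aliasing filter does this optically rather than numerically, but the principle is the same.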
By the way, this is a large part of how JPEG compression works: it filters out high-frequency changes. If you ask for more compression, the filtering is more severe.
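For the curious, a toy version of the JPEG idea (a hand-rolled 1-D DCT-II on invented data, not JPEG's actual 8x8 pipeline): transform a smooth signal to frequency space, zero out the high-frequency half of the coefficients, and the reconstruction barely changes — which is why discarding high frequencies compresses well.

```python
import numpy as np

# Toy sketch of JPEG's principle: a 1-D DCT-II built from its cosine
# basis; zeroing the high-frequency coefficients barely changes a
# smooth signal. (JPEG really uses 2-D DCTs on 8x8 blocks.)
N = 8
k = np.arange(N)
# basis[f, n] = cos(pi * (2n + 1) * f / (2N))
basis = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))

signal = np.linspace(0.0, 1.0, N)       # a smooth gradient
coeffs = basis @ signal                  # forward DCT (unnormalized)
coeffs[N // 2:] = 0                      # "compress": drop the high half

# inverse DCT via the orthogonality of the scaled basis
scale = np.full(N, 2.0 / N)
scale[0] = 1.0 / N
recon = basis.T @ (coeffs * scale)

print(np.max(np.abs(recon - signal)) < 0.05)  # True: nearly lossless here
```

On a busy, high-frequency signal the same truncation would do visible damage, which is exactly the artifacting you see in heavily compressed JPEGs.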
MikeFromMT wrote:
I keep reading about low pass filters (anti-aliasing filters)... (show quote)
It's probably something everyone can live without. That is, until some hack writes an article about it and posts that article on some blog.
--Bob
Bear123
Loc: Wild & Wonderful West Virginia
Moral of the story, either take the air out of your tires or take the long way around. :thumbup: :thumbup:
pecohen wrote:
Whenever analog data is digitized, there is a loss... (show quote)
Thanks for the comprehensive and easy-to-understand explanation...