  #6  
Old 25-09-2013, 11:36 AM
irwjager (Ivo)
Join Date: Apr 2010
Location: Melbourne
Posts: 532
The thing is, it's physically impossible to restore the narrow band of the spectrum that the filter removed from the R and G channels - it's irrevocably lost. You can't colour-balance your way out of it, because colour balancing means boosting or attenuating whole channels; you might restore the original level of the narrow band you lost, but you will have boosted the rest of the channel along with it. The best you can hope for is a compromise (i.e. a somewhat 'off' colour).
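To illustrate with a toy model (the numbers and the three sub-bands are made up for illustration, not real filter curves): once a sub-band is zeroed out, no single per-channel gain can restore every source, because different sources put different fractions of their flux into that band. A minimal Python sketch:

```python
# Toy model: a sensor channel integrates flux over several sub-bands.
# An LP filter blocks the middle sub-band; a single channel gain
# cannot then restore every source. Hypothetical numbers only.

continuum_star = [1.0, 1.0, 1.0]  # flat spectrum across the channel
emission_star  = [0.2, 2.0, 0.2]  # most flux in the (filtered) middle band

def channel(spectrum, lp_filter=False):
    """Integrated channel reading; the LP filter zeroes the middle sub-band."""
    weights = [1.0, 0.0, 1.0] if lp_filter else [1.0, 1.0, 1.0]
    return sum(f * w for f, w in zip(spectrum, weights))

r1, r2 = channel(continuum_star), channel(emission_star)              # unfiltered
f1, f2 = channel(continuum_star, True), channel(emission_star, True)  # filtered

gain = r1 / f1  # the one global gain that fixes the continuum star
print(gain * f1, "vs", r1)  # restored for the continuum star
print(gain * f2, "vs", r2)  # roughly 0.6 vs 2.4: lost flux can't be scaled back
```

The same gain that restores the flat-spectrum star leaves the emission-line star far too dim, which is exactly the 'off' colour compromise described above.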

The best course of action is to image R, G and B *without* filter and L *with* filter.
Mathematically speaking, light pollution removal is a trivial problem (as long as the data isn't clipped because of it): it's just a bias added to the signal, so restoring correct colour in R, G and B is not a problem - model the bias and subtract it.
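As a sketch of that 'model and subtract' step (the synthetic data and the simple plane model are assumptions for illustration; real tools mask bright objects first and may fit higher-order surfaces):

```python
import numpy as np

rng = np.random.default_rng(0)
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]

true_sky = rng.poisson(5, (h, w)).astype(float)  # faint "real" signal
lp = 20.0 + 0.1 * xx + 0.05 * yy                 # light pollution: smooth gradient
frame = true_sky + lp                            # what the camera records

# Model the LP as a plane a + b*x + c*y via least squares over the frame.
# (Real tools mask stars and nebulosity before fitting; omitted for brevity.)
A = np.column_stack([np.ones(h * w), xx.ravel(), yy.ravel()])
coef, *_ = np.linalg.lstsq(A, frame.ravel(), rcond=None)
lp_model = (A @ coef).reshape(h, w)

# Subtracting the model removes the bias (plus the mean sky level,
# which is a free pedestal you can add back as you like).
corrected = frame - lp_model
```

The fitted slopes recover the injected gradient, and the corrected frame is left with only the astronomical signal around a flat background.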
The sole reason to use an LP filter is to attain longer exposures without the LP swamping the sensor. Since luminance contributes only the brightness of your image, not its colour, you get the best of both worlds: the filter has no effect on colour, while it *does* affect how many photons you can gather for your brightness data.

The only side effect? Objects (e.g. stars) that emit light in the filtered part of the spectrum appear a tiny bit less bright - they still have the correct colour though, so who cares!