  #43  
01-08-2007, 01:01 AM
iceman (Mike)
Sir Post a Lot!

 
Join Date: Sep 2004
Location: Gosford, NSW, Australia
Posts: 36,761
Hi Steve

The effects you describe are certainly noticeable when doing high-resolution planetary imaging, such as with the Jupiter image in this thread.

Monochrome RGB imaging certainly reveals that the blue channel suffers the worst from atmospheric dispersion and turbulence ("seeing"). We also notice that the focal point can change slightly between the red, green and blue wavelengths - more so in bad seeing, and more so when the object is lower in the sky.

With RGB imaging, the settings can be changed during capture, so things like brightness, gain and exposure can be adjusted to suit the conditions, rather than having to worry about them too much in post-processing.

The post-processing routine usually involves aligning all the individual frames, ranking them from sharpest to blurriest (using edge-detection-type algorithms), and then "stacking" the sharpest frames to increase the signal-to-noise ratio.
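For the ranking and stacking step, something along these lines can be done with numpy and scipy. This is only a minimal sketch, assuming the frames are already loaded as a list of 2D greyscale arrays; the variance-of-the-Laplacian sharpness score, the phase-correlation alignment and the 10% keep fraction are illustrative stand-ins for whatever your stacking software actually uses:

```python
import numpy as np
from scipy import ndimage

def sharpness(frame):
    """Edge-based sharpness score: variance of the Laplacian."""
    return ndimage.laplace(frame.astype(np.float64)).var()

def align_to_reference(frame, reference):
    """Whole-pixel alignment via FFT phase correlation against a reference frame."""
    f = np.fft.fft2(frame)
    r = np.fft.fft2(reference)
    cross_power = r * f.conj()
    cross_power /= np.abs(cross_power) + 1e-12
    correlation = np.abs(np.fft.ifft2(cross_power))
    dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Unwrap shifts so e.g. a shift of -3 is not reported as (height - 3).
    if dy > frame.shape[0] // 2:
        dy -= frame.shape[0]
    if dx > frame.shape[1] // 2:
        dx -= frame.shape[1]
    return ndimage.shift(frame.astype(np.float64), (dy, dx))

def rank_and_stack(frames, keep_fraction=0.10):
    """Rank frames by sharpness, keep the best fraction, align and average them."""
    ranked = sorted(frames, key=sharpness, reverse=True)
    keep = ranked[:max(1, int(len(ranked) * keep_fraction))]
    reference = keep[0]  # sharpest frame becomes the alignment reference
    aligned = [align_to_reference(f, reference) for f in keep]
    return np.mean(aligned, axis=0)  # averaging raises the signal-to-noise ratio
```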

The better the seeing, the more accurate the alignment and ranking, and the more frames that can be stacked together to give you more signal, smoothing out the image. As a rough rule of thumb, random noise averages down with the square root of the number of frames stacked, so stacking 100 good frames can improve the signal-to-noise ratio by roughly a factor of ten.

More post-processing is then done using sharpening algorithms such as wavelets, as well as typical blur/sharpen filters in Photoshop.
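To give an idea of what wavelet-style sharpening does, here is a rough sketch assuming the stacked greyscale image is a float numpy array. Splitting the image into difference-of-Gaussian detail layers and boosting each layer is only an approximation of what the dedicated tools do, and the sigmas and gains are made-up values:

```python
import numpy as np
from scipy import ndimage

def wavelet_sharpen(image, sigmas=(1.0, 2.0, 4.0), gains=(1.5, 1.2, 1.0)):
    """Boost detail layers at increasing spatial scales, then add back the residual."""
    image = image.astype(np.float64)
    result = np.zeros_like(image)
    previous = image
    for sigma, gain in zip(sigmas, gains):
        blurred = ndimage.gaussian_filter(image, sigma)
        detail = previous - blurred   # detail present at this spatial scale
        result += gain * detail       # gains > 1 emphasise the finer scales
        previous = blurred
    return result + previous          # 'previous' is the coarse residual layer
```

With all gains set to 1.0 the layers recombine into the original image, which is a handy sanity check before turning any of them up.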

Of course, the individual colour channels can then still have adjustments applied, including levels, colour balance, etc.
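As a simple illustration of those per-channel adjustments, assuming the R, G and B stacks are separate float arrays scaled 0-1, a linear levels stretch and a colour-balance weighting might look like this (the black/white points and weights are placeholders for values you would normally set by eye):

```python
import numpy as np

def levels(channel, black_point, white_point):
    """Linear levels stretch: map [black_point, white_point] onto [0, 1]."""
    stretched = (channel - black_point) / (white_point - black_point)
    return np.clip(stretched, 0.0, 1.0)

def combine_rgb(r, g, b, weights=(1.0, 1.0, 1.1)):
    """Combine the three stacks into a colour image with a simple balance weight per channel."""
    balanced = [np.clip(c * w, 0.0, 1.0) for c, w in zip((r, g, b), weights)]
    return np.dstack(balanced)  # shape (height, width, 3)
```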