Quote:
Originally Posted by Troy
Hello Ray, very interesting.
I'm not totally convinced it's a sharpening artifact (I know you did a simulation, but hmm).
Have you tried an apodizing mask, to see if the white line disappears when the mask is used? This would help negate the effect of the Airy disc pattern. 8-)
|
Hi Troy. The secondary ring appears at about 0.5 arcsec from the main edge, so I think it is a good candidate for a sharpened diffraction ring, as per the simulation. An apodising mask is a great idea - I will try it at some future date.
Quote:
Originally Posted by cybereye
Ray,
Are your tests just on decon or have you tried wavelets as well?
Cheers,
Mario
|
Hi Mario, basically the same effect occurs with wavelets, deconvolution, unsharp masking....
Quote:
Originally Posted by riklaunim
That's a diffraction pattern that will show up at high resolution - the Airy disk. Oversharpening will make it clearly visible. The same thing is responsible for bad edges on Mars or Venus.
|
Hi Piotr, agreed - I have seen a few bad edges on Mars this season.
Quote:
Originally Posted by Paul Haese
This is a common problem in SCTs with less than perfect seeing. As the seeing degrades, the effect becomes more pronounced. Edge sharpening tends to enhance the problem, so careful sharpening via masks can be a better way to deal with it.
|
I guess SCTs have stronger ring structures, so these observations seem to reinforce the idea that this is a result of the Airy pattern.
The basic problem I have is that the image on the screen is a sampled representation of the original image, blurred by a combination of: atmospheric effects (a Gaussian blur is probably a fair representation of these), the optics PSF (definitely not a Gaussian), and the camera diffusion spread function (again, a Gaussian may be reasonable). The sharpening algorithms (as far as I know) assume a smooth blurring function and try to correct for it, amplifying artefacts from the optics PSF in the process. This just seems like a dumb way to do it, since it would surely make more sense to correct for the real blur function than for an unrealistic assumed one.
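To illustrate the point, here is a minimal 1-D sketch (NumPy only, all parameters made up for illustration). It blurs an ideal limb edge with a Gaussian "seeing" kernel convolved with a sinc² profile, which is just a 1-D stand-in for the 2-D Airy pattern, then applies an unsharp mask that assumes a purely Gaussian blur. The sharpened edge overshoots the true brightness, producing a spurious bright line just like the one under discussion:

```python
import numpy as np

def gaussian_kernel(sigma, n=101):
    # "seeing" / sensor diffusion stand-in: a normalised Gaussian
    x = np.arange(n) - n // 2
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def slit_kernel(scale, n=101):
    # 1-D stand-in for the optics PSF: sinc^2 (slit diffraction),
    # which has the secondary-ring structure a Gaussian lacks
    x = (np.arange(n) - n // 2) / scale
    k = np.sinc(x)**2
    return k / k.sum()

edge = np.zeros(512)
edge[256:] = 1.0                      # ideal limb: a step edge

# total blur = atmosphere convolved with optics (PSFs multiply in Fourier space)
psf = np.convolve(gaussian_kernel(3.0), slit_kernel(4.0), mode="full")
psf /= psf.sum()
blurred = np.convolve(edge, psf, mode="same")

# unsharp mask that (wrongly) assumes the blur is purely Gaussian
lowpass = np.convolve(blurred, gaussian_kernel(3.0), mode="same")
sharpened = blurred + 4.0 * (blurred - lowpass)

# the blurred edge never exceeds the true plateau, but the sharpened
# one overshoots it - the spurious bright line at the limb
print("blurred max:", blurred.max(), "sharpened max:", sharpened.max())
```

The overshoot comes from the sinc² side-lobes that the Gaussian model cannot account for; the mask amplifies them instead of removing them.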
What I think might be possible is to find the actual blur function by imaging a star as well as the target, and then using the star shape as the kernel for the sharpening algorithm (some of the available algorithms allow arbitrary kernels). I have used this with success on DSO images, but not yet on solar system stuff. Has anyone ever tried this?
regards Ray