JUPITER reprocessed by splitting R,G,B and processing separately - worth it?
Robert_T
27-02-2006, 10:26 PM
Hi All, I've noticed a few of you (Mike and DP) have been splitting AVIs into separate colour AVIs and processing each one separately in Registax before recombining for the final image. I've resisted this because I'm basically lazy, but decided (in the DP spirit of investigation) to give it a go on one of my recent Jupiter AVIs (1/10s at 5fps for 100 sec). The attachment shows the original, processed as a combined RGB AVI, at upper left, with the separately processed and recombined colour-channel image at lower and upper right - the latter resampled up to see how well it stood amplification. Between 350 and 400 frames were stacked for each colour channel.
I had a little trouble re-aligning the separate colour channels for the combined image, and I'm not fully happy with the colour, but on the whole it does seem to show finer detail than the original... hmm, maybe there's something in this?
cheers,
asimov
27-02-2006, 10:41 PM
Yes mate... there's certainly something in it! :confuse3:
Looks real nice, with (like you said) finer details springing forth. I've got enough on my plate without attempting this procedure ;) ...I'll leave it to the experts & continue to do it the 'old fashioned way'... :lol:
iceman
28-02-2006, 06:08 AM
Hey Rob, there's definitely finer detail in that new image, and it's standing up well to resampling.
Re-aligning the colour channels when recombining in AstraImage is probably the most difficult part, but generally the red channel needs to move -1 or -2 pixels and the blue channel +1 or +2. The poor AstraImage preview window doesn't help either. I usually zoom in 2x to try and get it right, with a bit of trial and error each way.
Your new image has quite a red bias - this can easily be overcome by adjusting the channel weights when recombining (try 1.1 on the red channel), or by using the colour balance sliders afterwards.
Separate R/G/B processing takes longer, but I'm convinced it produces a much better result... DP and I reckon about 5-10% better, sometimes more, depending on the image, the seeing, and the care taken in processing.
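As a rough illustration of the shift-and-weight recombine Mike describes above, here is a minimal Python sketch, assuming the three separately processed stacks have been saved as greyscale files. The filenames, offsets and weight are purely illustrative - this is not AstraImage's actual interface.

```python
# Hedged sketch of the recombine step: nudge the red and blue channels by a
# pixel or two relative to green and apply a small weight to one channel,
# as described above.  Filenames, offsets and weight are illustrative only.
import numpy as np
from PIL import Image

def load_channel(path, dx=0, dy=0, weight=1.0):
    """Load a greyscale channel, shift it by whole pixels and scale it."""
    arr = np.asarray(Image.open(path).convert("L"), dtype=np.float32)
    arr = np.roll(arr, shift=(dy, dx), axis=(0, 1))   # integer pixel shift
    return np.clip(arr * weight, 0, 255)

r = load_channel("jup_red.bmp",  dx=-1, dy=-1, weight=1.1)  # red often needs -1/-2; weight as suggested above
g = load_channel("jup_green.bmp")                           # green left as the reference
b = load_channel("jup_blue.bmp", dx=+1, dy=+1)              # blue often needs +1/+2

rgb = np.dstack([r, g, b]).astype(np.uint8)
Image.fromarray(rgb, mode="RGB").save("jup_rgb.png")
```

Trial and error on the offsets while zoomed in is still the quickest way to judge the registration by eye, just as with the AstraImage preview.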
Robert_T
28-02-2006, 11:49 AM
Thanks Guys,
Mike I'll try doing the colour align on magnified image, didn't think of that. Do you crop in registax before alignment. I didn't and found I had to move green almost 16 in one direction.
cheers,
Letting Registax only "see" one colour at a time should give you a much better alignment, especially if you're feeding it colour data that has a couple of pixels' worth of dispersion error between the channels...
There are more advantages too - you can judge the quality and sharpness of each colour channel separately and choose how to process each one, so that you end up with three channels of equal sharpness and detail.
Bird
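One way to put a number on Bird's point about judging each channel on its own merits is a simple focus metric. The sketch below uses a Laplacian-variance measure on the three stacked channel images - that metric is my own choice for illustration, not something the thread prescribes, and the filenames are assumed.

```python
# Rough sketch: score the sharpness of each separately processed channel so
# the three can be brought to a similar level before recombining.  The
# Laplacian-variance measure is just one common focus metric, not something
# from the thread itself.
import numpy as np
from PIL import Image

def laplacian_variance(path):
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float32)
    # 4-neighbour Laplacian built from shifted copies of the image
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
    return float(lap.var())   # higher variance = more fine detail

for name in ("jup_red.bmp", "jup_green.bmp", "jup_blue.bmp"):
    print(name, round(laplacian_variance(name), 1))
```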
davidpretorius
28-02-2006, 03:10 PM
Rob, ppmcentre works well here. If you convert all the AVIs to BMPs and then centre them, Registax can't move them.
If you split the movies and then process each colour separately, you'll more than likely end up with a different reference frame for each, which, because the tracking is not perfect, could be in a different place.
Hence in AstraImage you have to move it. I magnify at least 3 times and then align each colour.
It depends on how good it's looking to start with. If the data looks great, go the whole hog, i.e. split, ppmcentre, Registax, AstraImage etc.
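ppmcentre itself isn't publicly available yet (see Robert's question further down), but the centring idea goes roughly like the hypothetical Python sketch below: find the planet by its brightness centroid and crop the same-sized box around it in every frame, so Registax has nothing left to move. The threshold, box size and directory names are made up for illustration, and Bird's real tool may well work differently.

```python
# Hypothetical sketch of a centring/cropping pass in the spirit of ppmcentre:
# locate the planet by brightness centroid and cut an identical box around it
# in every frame.  Threshold, box size and paths are illustrative; Bird's
# actual tool may use a different method entirely.
import glob
import os
import numpy as np
from PIL import Image

BOX = 256         # output frame size in pixels
THRESHOLD = 40    # ignore sky background below this brightness

os.makedirs("centred", exist_ok=True)
for path in sorted(glob.glob(os.path.join("frames", "*.bmp"))):
    img = Image.open(path)
    grey = np.asarray(img.convert("L"), dtype=np.float32)
    grey[grey < THRESHOLD] = 0.0                  # suppress the background
    ys, xs = np.indices(grey.shape)
    total = grey.sum()
    cy = int(round((ys * grey).sum() / total))    # brightness centroid (row)
    cx = int(round((xs * grey).sum() / total))    # brightness centroid (column)
    box = (cx - BOX // 2, cy - BOX // 2, cx + BOX // 2, cy + BOX // 2)
    img.crop(box).save(os.path.join("centred", os.path.basename(path)))
```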
iceman
28-02-2006, 03:17 PM
As DP said, I use ppmcentre to centre, crop (and sort) before throwing it into Registax.
So my routine is:
- Capture avi
- Use VirtualDub to save as BMPs
- Use ppmcentre to centre, crop and sort
- Use the netpbm tools to split into R/G/B BMPs
- Registax each of the R/G/B sets
- etc
I have the cygwin unix tools on my Windows XP laptop, and a small shell script which runs the ppmcentre and netpbm commands. I just pass it the name of the directory which holds the BMP files as an argument, and it does the rest.
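I don't have Mike's cygwin script, but a rough Python stand-in for the split step might look like the sketch below - the directory layout and filenames are my own assumptions. Pass it the directory of centred BMPs and it writes red/, green/ and blue/ subdirectories of greyscale frames, ready to feed into Registax one colour at a time.

```python
# Rough stand-in for the netpbm "split into r/g/b bmp's" step.  Run as:
#   python split_rgb.py <directory of centred BMPs>
# Directory layout and filenames are assumptions, not Mike's actual script.
import glob
import os
import sys
from PIL import Image

src = sys.argv[1]
for sub in ("red", "green", "blue"):
    os.makedirs(os.path.join(src, sub), exist_ok=True)

for path in sorted(glob.glob(os.path.join(src, "*.bmp"))):
    r, g, b = Image.open(path).convert("RGB").split()
    name = os.path.basename(path)
    r.save(os.path.join(src, "red", name))
    g.save(os.path.join(src, "green", name))
    b.save(os.path.join(src, "blue", name))
```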
Robert_T
28-02-2006, 05:13 PM
AAAGGGHHHH... Too much to process... brain can't cope :P
Do you know if Bird has released his ppmcentre for MS command-line phobes yet?
Netpbm - haven't heard of that one, is it shareware? :shrug:
thanks guys,
Sorry Robert, I've been snowed under with work and imaging...maybe when this Jupiter season is past I'll have some time to finish it.
I tend to use the same settings all the time with ppmcentre, so it's not too much hassle...one of us can help you get the right settings and then you just re-use them.
Bird
Striker
28-02-2006, 06:25 PM
Great work Robert, I like both of them.
So much work processing planets these days....I thought we were bad.
avandonk
28-02-2006, 10:08 PM
I could be out of line here, but do you planetary guys take into account the size of the Airy disk? At f/32 it is 22 micron and at f/64 it's 43 micron for green light, even with perfect optics. It is even worse for red.
Just wondering.
Bert
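For anyone checking Bert's figures, they match the radius to the first minimum, r = 1.22 x wavelength x focal ratio, for green light. A quick sanity check in Python, assuming roughly 550 nm:

```python
# Sanity check on the Airy disk figures above, taking the radius to the
# first minimum as r = 1.22 * wavelength * focal_ratio and assuming green
# light at roughly 550 nm.
wavelength_um = 0.55                       # 550 nm expressed in microns
for f_ratio in (32, 64):
    r = 1.22 * wavelength_um * f_ratio
    print(f"f/{f_ratio}: Airy radius ~ {r:.1f} micron")
# Prints ~21.5 micron at f/32 and ~42.9 micron at f/64, in line with the
# 22 and 43 micron figures quoted above.
```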
davidpretorius
01-03-2006, 07:20 AM
Great question Bert!
I find blue to be the biggest pain in the ass for spread and focus, due to its shorter wavelength.
I will ask resident expert Bird this weekend!
Hi Bert, my feeling on this is that there's no problem with oversampling the image as long as there is sufficient light for the camera to keep the signal to noise at a reasonable level.
Shorter exposures, and larger sampling, make it easier to recover the final image in the presence of unpredictable turbulence and added noise from the camera etc.
Also, the airy disk for a star is maybe not quite the same thing as the combination of airy disks from a planet - in the latter case you have many overlapping airy disks, and so there will be transitions of colour and light/dark between these disks that are smaller than the disks themselves. Sampling at a larger size than the disk allows you to detect these edges more clearly.
If you think about what you see on a planetary image live in the eyepiece, you don't see a collection of circular airy disks - if you did then the planet might look like a collection of sharply defined circular spots. Instead you see the sum of all the disks from an infinite number of points across the image, and so there must be detectable features that are smaller than the individual disks. Not resolved features, but just detected features.
On Jupiter, say, this may allow you to detect the presence of a tiny dark spot corresponding to a storm in the polar region that would not otherwise be seen.
regards, Bird
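Bird's "detected but not resolved" argument can be illustrated with a toy model - entirely my own construction, not from the thread: blur a bright strip containing a dark notch much narrower than the Airy disk with a 1-D Airy profile, and a measurable brightness dip survives even though the notch is nowhere near resolved.

```python
# Toy 1-D illustration of "detected but not resolved": a dark notch only a
# fraction of the Airy radius wide still leaves a measurable dip once the
# scene is blurred by the Airy pattern.  Numbers are purely illustrative.
import numpy as np
from scipy.special import j1

x = np.linspace(-20, 20, 4001)                  # focal-plane coordinate; Airy radius = 1.22
v = np.where(x == 0, 1e-9, np.pi * x)           # avoid dividing by zero at the centre
airy = (2.0 * j1(v) / v) ** 2
airy /= airy.sum()                              # normalise so convolution preserves levels

scene = np.ones_like(x)                         # bright, featureless cloud deck...
scene[np.abs(x) < 0.15] = 0.0                   # ...with a notch ~1/4 of the Airy radius wide

blurred = np.convolve(scene, airy, mode="same")
print("depth of the blurred dip: %.0f%%" % (100.0 * (1.0 - blurred.min())))
```

The notch is far too small to be resolved, but the dip it leaves in the blurred profile is easily deep enough to survive stacking, which is essentially Bird's point about tiny dark spots on Jupiter.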
Also, remember that we're aligning and stacking somewhere between a few hundred and a few thousand frames, and that every frame is slightly different due to atmospheric effects and tracking - i.e. the image falls onto a slightly different place on the CCD grid in each frame, the actual resolved details change from one frame to the next, images are distorted or larger/smaller from one frame to the next, and this impacts on the theoretical size of the Airy disk in each frame.
This is advantageous for us, as it means that we can see more detail than otherwise would be possible if all the frames were identical. To take best advantage of this dynamic environment requires some amount of oversampling on the raw data and then careful selection and processing of some subset of the raw frames later on.
I've felt for a while that the current crop of software packages used for post-processing are not getting the most out of the raw data, as they all use very simple algorithms. My gut feeling is that there is probably another 50% or more of resolving power possible just from improvements in image processing.
regards, Bird
...and finally, this isn't all theoretical stuff. If you look at some of the best images going around, including this recent one by Chris Go:
http://www.cloudynights.com/ubbthreads/attachments/845703-jupiter02270619-37c.jpg
taken with a C11, it's quite evident that detail much smaller than the Airy disk size is visible. In particular, look at the white storm in the upper right of the image, to the right of the (now red) Oval BA: this white storm is approximately 0.3 arcseconds across (smaller than the Airy disk size for a C11), and yet it is clearly resolved, including a distinct dark edge. Looking around the image shows many more features, smaller still, that are also partially or fully resolved!
regards, Bird
avandonk
01-03-2006, 09:59 AM
It was obvious to me that the resolution you are all getting is better than the Rayleigh criterion for your optics, but as you explained, that is for two identical point sources. I think what is happening is that the Airy disk is actually the central peak of a Bessel function, and you guys are detecting the peaks, which are far smaller in size than the so-called Airy disk (diameter to the first minima). Hence the importance of the recording level - too high and resolution suffers. I was just wondering why the images you guys are getting were so good. I think I can see why now.
Thanks
Bert
This also highlights something else - we have to be even more critical in the collimation, cooling, etc. of the scopes, because there is the potential to see detail way smaller than the Airy disk, or Rayleigh criterion. It would be an error to use the Airy disk size as a criterion to judge whether you have the scope "close enough" in setup; it may be better to use 1/4 of this size or something, as I think that is closer to the true resolving power on extended objects like planets.
regards, Bird
Robert_T
01-03-2006, 11:46 AM
Bird/Bert, I won't pretend to understand all this... but I guess what you're saying/concluding is that planets are different to stars, and the resolution possible for planetary features may be different (better?) than that possible for the resolution (separation) of stellar point sources?
cheers,
I think the resolution is the same, but stars are featureless points of light, so in a sense it doesn't matter how large or small the Airy disk is for a star - there's nothing else to see.
A planet has lots and lots of detail, and so we can see the interaction between all these airy disks, giving detail down to scales much smaller than the disks themselves.
regards, Bird
davidpretorius
01-03-2006, 12:09 PM
Let's just nod our heads, Rob, and say "hmmmm... yes" a few times, whilst putting fist to chin. At least we can pretend to know what it is they're talking about!!!
Robert_T
01-03-2006, 12:28 PM
hmmmm... yes:confuse3:
avandonk
01-03-2006, 12:35 PM
I have only now got my head around this as I last studied optics in the early seventies. Below are four pictures.
The first is an image of a star and is what you see if your telescope is diffraction limited.
The second is a 3D representation of the Airy disk.
The third shows Airy disks resolved and the fourth shows Airy disks unresolved.
If you imagine the peaks a lot smaller, then even in the unresolved case they would be resolved. This is also the reason bright stars look bigger than dim ones: for really bright stars the dim rings (the Bessel function maxima) become quite evident and seem to merge.
Hope this makes it clearer.
I had forgotten I knew this. Everything is much clearer now.
Bert
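The attachments haven't survived in this archive, but profiles like the ones Bert describes are easy to regenerate. Here's a rough sketch, assuming scipy and matplotlib are available and with the separations chosen purely for illustration, of the Airy intensity pattern plus a resolved and an unresolved pair of point sources.

```python
# Rough sketch that regenerates profiles like the ones Bert describes: the
# Airy intensity pattern, and two point sources at a resolved and at an
# unresolved separation.  Separations are illustrative only.
import numpy as np
import matplotlib.pyplot as plt
from scipy.special import j1

def airy(x):
    """1-D Airy intensity profile with its first zero at x = 1.22."""
    v = np.where(x == 0, 1e-9, np.pi * x)
    return (2.0 * j1(v) / v) ** 2

x = np.linspace(-4, 4, 2000)
profiles = {
    "single Airy pattern": airy(x),
    "resolved pair (separation 2.0)": airy(x - 1.0) + airy(x + 1.0),
    "unresolved pair (separation 0.8)": airy(x - 0.4) + airy(x + 0.4),
}

fig, axes = plt.subplots(1, 3, figsize=(12, 3))
for ax, (title, prof) in zip(axes, profiles.items()):
    ax.plot(x, prof)
    ax.set_title(title)
plt.tight_layout()
plt.show()
```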
davidpretorius
01-03-2006, 02:03 PM
hmmmm... yes :confuse3:,
although the 3D graphs help my brain get around this. Well done Bert.
Robert_T
01-03-2006, 02:19 PM
I think I get it, Bert: less brightness at any point makes the peaks finer/narrower/smaller, so there will be less overlap between the peaks and hence better resolution between them...?
cheers,
avandonk
01-03-2006, 02:59 PM
Correct. I have only just realized this myself.
Bert
Bert, I'm not sure about this interpretation - it may be ok but I have not seen the airy disk presented this way (in 3D).
The equations I've seen for the size of the airy disk only talk about mirror diameter and the wavelength of light, not brightness. But your explanation implies that the airy disk size can be made smaller by decreasing the brightness.
e.g. in my copy of the Suiter book on star testing, he gives the formula for the size of the airy disk as:
r = 1.22 * L * f / D
where L = wavelength, f = focal length, and D = diameter of the mirror or objective.
Am I interpreting your comments correctly?
regards, Bird
Edit - I think I understand your comment now; are you saying that a dimmer object will have a smaller central bright spot and a larger dark gap, but the overall size of the Airy disk remains constant?
Applying this idea to extended objects like planets is a bit interesting - you might divide the object into an infinite number of very dim points, leading to some odd conclusions.
Just thinking aloud here...
avandonk
01-03-2006, 03:53 PM
That gives the resolution limit for two point sources (stars) of equal magnitude. The first minimum is the border of the Airy disc. With a dimmer object the minimum will be broader, and in a real case will drop below the noise.
With planetary imaging a smooth part is fine; it's at the border between light and dark features that you want resolution. And as you said, that is about a quarter of the diameter of the Airy disk, so the border is blurred by this amount.
What set me thinking was that the Airy disk at f/64 is 43 micron, which for a ToUcam is about 8 pixels. Yet the detail I have been seeing seems to be far better than that. This has been a worthwhile exercise in understanding what is happening.
So by your criterion the diffraction-limited resolution is really about 2 pixels at f/64 and 1 pixel at f/32. Is this what you actually see with very good seeing?
The Airy disk is not uniform; the intensity follows a Bessel function for a circular aperture. The 3D picture has intensity as the vertical variable.
Bert
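Converting those figures into ToUcam pixels bears out Bert's arithmetic. A quick sketch, assuming roughly 5.6 micron pixels for the ToUcam sensor (my figure, not one given in the thread):

```python
# Convert the Airy radius into ToUcam pixels and apply the rough "quarter of
# the Airy disk" working criterion discussed above.  The 5.6 micron pixel
# size is my assumption for the ToUcam, not a figure given in the thread.
wavelength_um = 0.55
pixel_um = 5.6
for f_ratio in (32, 64):
    r_um = 1.22 * wavelength_um * f_ratio            # Airy radius in microns
    r_px = r_um / pixel_um
    print(f"f/{f_ratio}: radius ~ {r_um:.0f} um ~ {r_px:.1f} px, quarter ~ {r_px / 4:.1f} px")
# f/64 comes out near 8 pixels (quarter ~ 2 px) and f/32 near 4 pixels
# (quarter ~ 1 px), matching the figures discussed above.
```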