#1  
Old 19-08-2008, 06:31 PM
Bassnut (Fred)
Narrowfield rules!
 
Join Date: Nov 2006
Location: Torquay
Posts: 5,064
Stacking

I'm in the middle of an insane uber-megadata capture mission: some 100 x 20 min subs. I'm just wondering what's the best way to tackle stacking. I'd normally just experiment, but it's too time consuming to be totally random. I'm thinking mean or median combine in groups of 10 subs with data rejection etc., then sum these hopefully clean group masters. I've seen others do the reverse, sum the subs and mean the groups, but the first way seems logical. The subs (3nm Ha) are only 1500-odd ADU, so summing at some point seems the way to preserve dynamic range.

I just can't stack all the subs at once; the disk caching is just unbearable.

Anyone had experience with this?
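Something like this is what I have in mind, as a rough numpy sketch (assumptions: subs already calibrated and registered; the random arrays just stand in for real FITS data, which would come from something like astropy.io.fits):

Code:
import numpy as np

# Stand-in for ~100 registered 20-min subs.
subs = [np.random.normal(1500.0, 40.0, (256, 256)) for _ in range(100)]

GROUP = 10

# Median-combine each group of 10 into a hopefully clean group master...
masters = [np.median(np.stack(subs[i:i + GROUP]), axis=0)
           for i in range(0, len(subs), GROUP)]

# ...then sum the masters, staying in float so the result is not
# clipped back to the 16-bit ADU ceiling.
final = np.sum(np.stack(masters), axis=0)

In a real run each group would be loaded from disk as needed, which also sidesteps the caching problem.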
  #2  
Old 19-08-2008, 06:50 PM
jase (Jason)
Registered User

Join Date: Sep 2006
Location: Melbourne, Victoria
Posts: 3,916
33 hours of data. I look forward to seeing the result. There are a few ways of tackling the situation, but before we delve too deep: run your subs through something like CCDInspector or another data interrogation tool... you're making an assumption that all 100 subs are perfect. It's less than ideal to combine premium data with mediocre. You could be left with 80 subs after the data reject. A good indicator is the FWHM of each sub.
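That culling step might look like this, assuming the FWHM of each sub has already been measured by a tool such as CCDInspector and exported alongside the filenames (the names, numbers and 20% cutoff here are made up for illustration):

Code:
# (filename, FWHM in arcsec) as reported by an external inspection tool
measured = [("ha_001.fit", 2.1), ("ha_002.fit", 3.6),
            ("ha_003.fit", 2.3), ("ha_004.fit", 2.2)]

best = min(f for _, f in measured)
cutoff = best * 1.2  # keep subs within 20% of the best seeing

keep = [name for name, f in measured if f <= cutoff]
print(f"keeping {len(keep)} of {len(measured)} subs:", keep)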

As you indicate, work in data sets. Twelve subs usually provide ample information for outlier pixel rejection algorithms. Once you've got your eight or so combined masters, combine them again. I've heard of imagers doing the initial combine with a lower sigma reject value to keep some outlier pixels in the data; these are then addressed on the second combine pass. What you want to avoid is an aggressive sigma reject on the first pass. Too aggressive, and the algorithm has the potential to mistake data for noise. Clearly not good.
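Roughly, in numpy terms (a sketch only, assuming registered subs stacked into 3-D arrays; the kappa values of 3 then 2 are illustrative, not anyone's quoted settings):

Code:
import numpy as np

def sigma_clip_mean(stack, kappa):
    """Per-pixel mean along axis 0, ignoring values more than
    kappa standard deviations from the per-pixel mean."""
    mean = stack.mean(axis=0)
    std = stack.std(axis=0)
    outliers = np.abs(stack - mean) > kappa * std
    # For kappa >= 1 at least one value per pixel always survives,
    # so nothing in the result stays masked.
    return np.ma.masked_array(stack, outliers).mean(axis=0).filled(0.0)

# Eight sets of 12 subs each (fake data standing in for the real thing)
sets = [np.random.normal(1500.0, 40.0, (12, 128, 128)) for _ in range(8)]

# First pass: a gentle threshold, so real data is not mistaken for noise.
masters = [sigma_clip_mean(s, kappa=3.0) for s in sets]

# Second pass: combine the masters with a tighter threshold to catch
# the outliers deliberately left in on the first pass.
final = sigma_clip_mean(np.stack(masters), kappa=2.0)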
  #3  
Old 19-08-2008, 07:03 PM
Bassnut (Fred)
Narrowfield rules!
 
Join Date: Nov 2006
Location: Torquay
Posts: 5,064
Thanks Jase, helpful hints on the data reject, but you missed which combine method you'd recommend at each stage. Does it matter, do you think?
  #4  
Old 19-08-2008, 07:41 PM
jase (Jason)
Registered User

Join Date: Sep 2006
Location: Melbourne, Victoria
Posts: 3,916
Not sure why you'd want to use sum. Average or median has got to be better. Sum simply increases the noise in proportion to signal. There is no rejection. Personally, I'd go with sigma reject, but at a low rejection rate. I've found that with six subs you want a rejection rate of around 10%; with a larger set (say 12 to 16 subs) the rejection rate is lower, 4-5%. If you really want sum to do the work, decrease the subs per set so that the second pass has a higher data rejection rate.
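As a back-of-envelope check on those percentages, assuming purely Gaussian noise, the fraction a kappa-sigma clip rejects follows directly from the normal distribution, so rejection rates map onto rough kappa values:

Code:
import math

# For Gaussian noise, P(|x - mean| > kappa*sigma) = erfc(kappa / sqrt(2))
for kappa in (1.5, 2.0, 2.5, 3.0):
    frac = math.erfc(kappa / math.sqrt(2.0))
    print(f"kappa {kappa}: rejects ~{frac * 100:.1f}% of pure noise")

# A ~10% rejection rate corresponds to kappa ~1.6; 4-5% to kappa ~2.0.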
  #5  
Old 19-08-2008, 08:47 PM
winensky
Registered User
 
Join Date: Aug 2007
Location: Ballarat Vic
Posts: 268
Thanks Jase. As a relative newcomer I have often wondered how vigorously to apply sigma rejection, and hadn't even thought of it as a function of sample size, but that makes perfectly good sense.
  #6  
Old 19-08-2008, 09:16 PM
rat156
Registered User
 
Join Date: Aug 2005
Location: Melbourne
Posts: 1,694
Hi Fred/Jase,

CCDStack does a sigma reject on your stack, then sums the pixels left over; it's fine to do a sum if you've already rejected the outliers. I have used all three combines (median, mean and sum) on the same data, and really you'd be hard pressed to see the difference. For large data sets that have had the outliers removed, mean and median are effectively the same, so use mean; it makes sense mathematically. Sum and mean give you different pixel values, but relatively they should be the same for a large data set; sum may allow for slightly different histogram manipulation.
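A quick numpy illustration of that point: for the same stack, sum and mean differ only by a constant factor, so once the outliers are gone the relative pixel values (and hence the stretch you can apply later) are the same:

Code:
import numpy as np

stack = np.random.normal(1500.0, 40.0, (16, 64, 64))  # 16 fake subs

mean_combine = stack.mean(axis=0)
sum_combine = stack.sum(axis=0)

# Sum is just N times the mean: same image, different absolute scale.
print(np.allclose(sum_combine, 16 * mean_combine))  # True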

Cheers
Stuart
  #7  
Old 19-08-2008, 09:46 PM
jase (Jason)
Registered User

Join Date: Sep 2006
Location: Melbourne, Victoria
Posts: 3,916
Indeed, you're correct Stuart. CCDStack is quite unique in that regard. I guess the point I was making is that you would rarely use sum on its own without some form of rejection... no point increasing noise in proportion to signal. I should get back into using CCDStack. It is a great program, but a little clunky at times; perhaps I'm just not used to it. MaximDL is my heavy artillery, followed by PS as the ground troops. I'm surprised you don't notice any difference between the algorithms. With mega data it would be hard, but when blinking combined images of just a few subs I've found differences, albeit subtle.
  #8  
Old 20-08-2008, 08:01 AM
Jeffkop (Jeff)
Star-Fishing
 
Join Date: Jan 2008
Location: Tuckurimba
Posts: 885
I've just read this thread... really interesting, guys. I've also pondered the differences between the combining algorithms, but just so I can get a better grasp on things: what are outlier pixels?
  #9  
Old 20-08-2008, 09:33 AM
jase (Jason)
Registered User

Join Date: Sep 2006
Location: Melbourne, Victoria
Posts: 3,916
How much time do you have, Jeff?

In summarised form (there is a lot of theory and math behind this):

The basic concept is to calculate the best "mean" from the pixel values of the collection of images you are combining. Algorithms such as sigma reject use a standard deviation or sigma value, which is based on variance; it describes the spread of values around the mean. To estimate the likely value of the mean in the real world (which introduces non-statistical errors), the user specifies a deviation from the mean they deem acceptable. Any values outside this range are ignored (rejected), and it is these pixels outside the range that are known as outlier pixels. Outlier pixels can have a profound effect on the calculation of the mean value. In short, choose a sigma or deviation value that rejects the fewest pixels but still removes satellite trails, hot pixels, cosmic ray hits, etc. It's best to choose a large sigma value; however, if it's too large it will not reject any values, and you'll simply be left with the mean of all pixel values. Too small a value will reject more; in other words, the threshold defined to classify outlier pixels extends into the real data, which is not good.
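That description maps almost line for line onto code. A toy single-pass example on one pixel position (illustrative only, not any particular program's implementation), where the 4000 simulates a cosmic ray hit on one frame:

Code:
import numpy as np

# Values of one pixel position across 12 registered subs
values = np.array([1510., 1495., 1502., 1488., 1507., 4000.,
                   1499., 1503., 1491., 1506., 1497., 1500.])

kappa = 2.0  # user-chosen acceptable deviation from the mean

mean = values.mean()  # naive mean, dragged upward by the hit
sigma = values.std()  # spread of values around that mean

keep = np.abs(values - mean) <= kappa * sigma  # outliers fall outside

print("rejected outliers:", values[~keep])         # [4000.]
print("naive mean: %.1f" % mean)                   # ~1708
print("clipped mean: %.1f" % values[keep].mean())  # ~1499.8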

Happy to provide further clarity as required. I should note that median works in a subtly different way... and for data rejection algorithms to work effectively, the data set you are combining needs to be normalised (i.e. scaled to similar values across the data set). You don't stretch it yourself; the algorithm does this for you. Actually, stretching is the wrong terminology; it uses pixel math to achieve the task. It would be difficult to calculate the mean value if this was not performed.
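Under one simple scheme, that normalisation is just multiplicative pixel math scaling each sub's background level to a reference (a sketch only; real programs may fit something more elaborate):

Code:
import numpy as np

# Three subs with drifting sky levels
subs = [np.random.normal(level, 40.0, (128, 128))
        for level in (1500.0, 1350.0, 1620.0)]

ref = np.median(subs[0])  # use the first sub as the reference level

# Scale each sub so its median matches the reference; only after this
# do per-pixel rejection thresholds mean the same thing in every frame.
normalised = [sub * (ref / np.median(sub)) for sub in subs]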
  #10  
Old 20-08-2008, 09:44 AM
Jeffkop (Jeff)
Star-Fishing
 
Join Date: Jan 2008
Location: Tuckurimba
Posts: 885
Outliers outlined outstandingly

Ahhh... now I've got it, Jase.

OK, so it's a term for pixels that are ultimately to be deleted... or rather, the outlier pixels are the desired exclusions from the combining process.

Thanks once again
  #11  
Old 20-08-2008, 09:54 AM
Terry B
Country living & viewing
 
Join Date: Mar 2006
Location: Armidale
Posts: 2,789
Quote:
Originally Posted by jase
Sum simply increases the noise in proportion to signal. There is no rejection.
Are you sure this is correct? According to Richard Berry in his book (Astronomical Image Processing), summing increases the S/N ratio by the square root of the number of images taken, i.e. 4 images improve the S/N by 2x and 9 images by 3x.

  #12  
Old 20-08-2008, 10:14 AM
jase (Jason)
Registered User

Join Date: Sep 2006
Location: Melbourne, Victoria
Posts: 3,916
Give it a try, Terry. Sum is simply the sum of the corresponding pixels in all images of the data set you are combining. Pixel values increase, so you are increasing the signal, but at the same time the noise also increases, as there is no data rejection being performed. Admittedly, the noise increases at a lower rate than the signal, so I'll acknowledge it may not be strictly "proportionate" as I originally indicated.
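A quick simulation reconciles the two posts, assuming independent random noise: the signal in a sum of N frames grows as N while the noise grows only as sqrt(N), so the S/N improves by sqrt(N), which is Berry's figure:

Code:
import numpy as np

rng = np.random.default_rng(0)
signal, sigma = 100.0, 20.0  # per-frame signal and noise (S/N = 5)

for n in (1, 4, 9):
    # Sum n frames, repeated over many trials to measure the scatter
    summed = rng.normal(signal, sigma, (100_000, n)).sum(axis=1)
    print(f"N={n}: S/N ~ {summed.mean() / summed.std():.1f}")

# Prints S/N of ~5, ~10, ~15: improvements of 1x, 2x, 3x.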
  #13  
Old 20-08-2008, 12:05 PM
Terry B
Country living & viewing
 
Join Date: Mar 2006
Location: Armidale
Posts: 2,789
Quote:
Originally Posted by jase
Give it a try Terry. The sum is simply the sum of corresponding pixel in all images (of the data set you are combining). Pixel values increase, so you are perhaps increasing the signal, but at the same time noise also increases as there is no data rejection being performed. Undoubtedly, noise increases at a lower rate than signal, so I will acknowledge that it may not necessarily be "proportionate" as I originally indicated.
I have done this many times, though for photometry rather than imaging. I sometimes take 3 x 5 min exposures of dim stars through a blue filter and measure the S/N of individual stars, comparing the single frames against the summed image. The S/N is much better on the summed image, allowing measurement of the magnitude of a much dimmer star. I could take a single 15 min exposure and hopefully get an even better S/N, but I don't, as sometimes the star is bright enough to measure on the individual frames and I can average the results to improve the accuracy.
  #14  
Old 20-08-2008, 05:39 PM
rat156
Registered User
 
Join Date: Aug 2005
Location: Melbourne
Posts: 1,694
Traditional S/N calculations fall down, however, when the main source of noise is not random but sky background. Unfortunately the "noise" in this case is always positive, so it gets summed as well. The only solution to this is dark skies, I'm afraid.

Narrowband imaging also benefits greatly from many sub exposures as there is little background.
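A toy model of the sky-background point, assuming a constant sky pedestal plus random scatter (all numbers made up): the pedestal adds up linearly with the number of subs while the random part grows only as sqrt(N), so the pedestal can only be subtracted, never averaged away, and its associated noise stays in the stack:

Code:
import numpy as np

rng = np.random.default_rng(1)
object_flux, sky, sigma, n = 50.0, 300.0, 15.0, 16

subs = [object_flux + sky + rng.normal(0.0, sigma, (64, 64))
        for _ in range(n)]
total = np.sum(subs, axis=0)

print("summed sky pedestal:", n * sky)        # grows as N (always positive)
print("random scatter: %.1f" % total.std())   # grows only as sqrt(N)*sigma ~ 60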

Cheers
Stuart