#9 - Shiraz (Ray), 02-09-2014, 02:40 PM

I don't think that argument applies here, Barry.

The basis of the idea seems to be that, if a pixel has zero target electrons in most frames, any frame in which that pixel records one (or more) target electrons will be rejected by the stacking software as an outlier - so you lose data.
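
For context, the rejection step in stacking typically works something like the sketch below - a minimal sigma-clip in Python (the pixel values, the kappa = 2 clip threshold and the per-pixel treatment are all illustrative assumptions, not any particular package's implementation):

[code]
import numpy as np

# One pixel's value across six frames, in electrons (illustrative numbers).
# The last frame happened to catch a few extra target electrons.
frames = np.array([10.2, 9.8, 10.1, 9.9, 10.0, 13.5])

mean, sigma = frames.mean(), frames.std()
keep = np.abs(frames - mean) < 2.0 * sigma  # kappa = 2 clip (assumed)
stacked = frames[keep].mean()               # the 13.5 e- frame is rejected here
[/code]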

However, with current sensors, no pixel will ever have zero or fixed signal - there will always be read noise, dark noise and sky noise electrons producing frame-to-frame variations in pixel signal (of maybe ~10 electrons on average) that completely mask variations of 1 or 2 electrons due to the occasional presence of a target photon. Thus, as I see it, you cannot lose signal, because there is no way to design software that can distinguish between frames with and without a couple of photons of target signal (how nice would that be!). When the data is stacked, the noise will be uncorrelated from frame to frame and will integrate out, but the signal will be correlated and will accumulate - even if there is none in some frames, there will be some in others.
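
A quick way to see this is to simulate it. In the sketch below (all numbers are assumptions for illustration: ~10 e- of combined read/dark/sky noise per frame, and a faint target delivering on average 0.3 electrons per frame), the frames with and without a target photon have essentially identical statistics, yet the stacked mean still converges on the target rate:

[code]
import numpy as np

rng = np.random.default_rng(0)
n_frames = 10000
noise_e = 10.0   # combined read + dark + sky noise per frame (assumed)
rate_e = 0.3     # mean target electrons per frame (assumed)

signal = rng.poisson(rate_e, n_frames)   # 0 in most frames, 1+ in a few
pixels = signal + rng.normal(0.0, noise_e, n_frames)

# No statistic separates photon frames from empty ones - both spreads are ~10 e-:
print(pixels[signal > 0].std(), pixels[signal == 0].std())

# But the stack recovers the signal: the uncorrelated noise averages toward zero
# (residual ~ 10/sqrt(10000) = 0.1 e-) while the correlated signal remains.
print(pixels.mean())   # ~0.3 e-
[/code]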

Should we ever get sensors with zero read and dark noise, and image through extremely narrow-band filters, the argument might apply - but not currently.

As you noted, Pete indicated that the discussion applies to shot-noise-limited conditions, so read noise is assumed to be an insignificant contribution to the total noise.
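
For a rough sense of what "shot noise limited" means in numbers (figures assumed for illustration): with a few hundred sky electrons per pixel per sub, a few electrons of read noise barely register once added in quadrature:

[code]
import math

sky_e = 400.0    # sky electrons per pixel per sub (assumed)
read_e = 3.0     # camera read noise in electrons (assumed)

shot = math.sqrt(sky_e)               # 20.0 e- of sky shot noise
total = math.sqrt(sky_e + read_e**2)  # ~20.2 e- total
print(shot, total)  # read noise inflates the total by only ~1%
[/code]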

Last edited by Shiraz; 02-09-2014 at 02:57 PM.