06-03-2013, 08:14 PM
Shiraz (Ray)
Originally Posted by rally

A short exposure will simply get zero photons from such a weak signal source, or the shot noise will be the same as or similar to the signal.
The only way to capture this sort of detail is a very long exposure.

Hi Rally.

Thanks for raising this interesting offshoot from the original post. I did some basic calculations for a standard CCD (10 e- read noise) and the one you describe (0.5 e- read noise), with broadband imaging of NGC253 as the target, under dark skies with a 12 inch scope at 1 arcsec scale.

The results are very interesting and I am a bit hesitant to publish them because they will be ridiculed - but here goes anyway:
  • standard CCD: 30:1 SNR at roughly 13x5 minute exposures - headroom maybe 600,000:1
  • low read noise CCD: 30:1 SNR at roughly 4000 exposures of 1 second - headroom maybe 20,000,000:1 - could comfortably work faster than 1 Hz
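The arithmetic behind these two bullet points can be sketched with the standard CCD SNR equation. A minimal sketch follows - the signal and sky rates (1 and 2 e-/pixel/s) are purely illustrative numbers I have chosen, not the exact figures behind my estimates above. The point is that once per-sub read noise is negligible, 4000 x 1 s subs stack to about the same SNR as 13 x 5 min subs:

```python
import math

def stacked_snr(signal_rate, sky_rate, exposure_s, n_subs, read_noise_e):
    """SNR of n_subs co-added exposures for one pixel.

    signal_rate, sky_rate: electrons/pixel/second (illustrative values).
    read_noise_e: RMS read noise in electrons, added once per sub.
    """
    total_t = exposure_s * n_subs
    signal = signal_rate * total_t
    # Shot noise on signal + sky, plus read noise accumulated per sub.
    noise = math.sqrt(signal + sky_rate * total_t + n_subs * read_noise_e ** 2)
    return signal / noise

# Standard CCD: 13 x 5 min subs at 10 e- read noise.
snr_long = stacked_snr(1.0, 2.0, exposure_s=300, n_subs=13, read_noise_e=10)
# Low read noise CCD: 4000 x 1 s subs at 0.5 e- read noise.
snr_short = stacked_snr(1.0, 2.0, exposure_s=1, n_subs=4000, read_noise_e=0.5)
print(round(snr_long, 1), round(snr_short, 1))  # both come out near 30:1
```

With the 10 e- chip, the read-noise term (n_subs * 100) would swamp a 1 s sub strategy; with 0.5 e-, it is almost invisible, which is the whole trick.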

At exposures of 1 second or less, much of the seeing and most tracking errors would be defeated if a stacking program like Registax was used. Resolution would potentially be determined more by the scope than by the seeing, and the mount would not need to be anything special at all to keep a scope on target for 1 second periods.

In anticipation that the idea of detecting a few photons in each sub is not accepted, try the lucky imaging websites - a quote from one is: "Basden et al. [6] have further demonstrated that it is possible to achieve close to full quantum efficiency and photon counting operation even at signal levels of a few photons per pixel per frame, something that is actually quite a high signal rate in terms of photons per pixel per second, given that these devices can be operated at many tens or hundreds of frames per second." They are using scopes with 100x the aperture, but they are also running at about 100x the speed I propose.

I am looking at a little under 1 photon per pixel per second. And you don't have to worry about detecting fractional photons either - at this signal level it becomes a statistical problem: some pixels record exactly one photon, some record 2, and many record exactly none. Add 4000 such exposures together and you have the equivalent of a long-exposure result, provided the read noise is small enough. I have not looked at narrowband imaging yet.
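To see that statistical argument concretely, here is a small Poisson simulation. The rate of 0.9 photons/pixel/second is an assumed illustrative figure: most 1 second subs record 0 or 1 photons, yet the 4000-sub total converges on the long-exposure expectation of about 3600 photons.

```python
import math
import random

def poisson_sample(lam, rng):
    """Draw one Poisson count (Knuth's algorithm - fine for small lam)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(1)
rate = 0.9  # photons/pixel/second - assumed value for illustration
subs = [poisson_sample(rate * 1.0, rng) for _ in range(4000)]  # 4000 x 1 s subs

zeros_and_ones = sum(c <= 1 for c in subs)  # ~77% of subs catch 0 or 1 photons
total = sum(subs)  # equivalent long-exposure count, expectation 3600
```

Each individual sub looks hopeless, but the sum behaves exactly like one 4000 s exposure - which is why the only remaining question is whether the read noise per sub is small enough not to bury those single photons.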

There are already examples of high framerate imaging on the planetary imaging forum - some imagers use up to 200 frames per second to cut through seeing with bright targets - read noise is the killer though. It will be a major upheaval in how things are done if very low read noise chips become available to let the same thing be done for deep sky imaging.

Bit of a diversion from the 694, but still interesting. We are accustomed to thinking that we need really long exposures to go deep, but this is only because of read noise polluting low level parts of the image. If you can get rid of thermal noise (most chips largely do) and also read noise, there is absolutely nothing left but signal and you can then use many short exposures - or whatever else you would like to do. However, I suspect it will be a fair while before we have access to this type of technology. You can buy EMCCDs from Andor, E2V and others, but they cost a heap.

I understand that there will always be some hidden gotchas in new technology, but even if the eventual performance gets to 1/10 of what the simple calculations suggest, it will be game changing. Thanks for bringing it up.

regards Ray

Last edited by Shiraz; 06-03-2013 at 09:36 PM.