#22  28-03-2013, 11:30 PM
Shiraz (Ray)
Hi Richard

FWIW, my take on your original question:

The signal in a pixel from an extended object (eg nebula or galaxy) is:

S = B * A * pixangle * opticsefficiency * time * QE

Where B is the object surface brightness, A is the aperture area, pixangle is the solid angle subtended by a pixel, opticsefficiency is the optical throughput, time is the exposure time and QE is the sensor quantum efficiency.

If you keep pixangle the same for the two systems (by using different cameras) and assuming all else is equal, increasing the aperture area A will proportionally increase the pixel signal – nothing else changes much.
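To put rough numbers on that scaling, here is a quick Python sketch; the throughput, exposure and QE values are placeholders of my own and cancel out in the ratio, which only depends on aperture area:

import math

def pixel_signal(B, aperture_dia_m, pix_arcsec, optics_efficiency, time_s, QE):
    # S = B * A * pixangle * opticsefficiency * time * QE
    A = math.pi * (aperture_dia_m / 2) ** 2        # aperture area (m^2)
    pixangle = pix_arcsec ** 2                     # pixel solid angle (arcsec^2)
    return B * A * pixangle * optics_efficiency * time_s * QE

# same 1x1 arcsec pixel angle on both systems, everything else equal
s200 = pixel_signal(B=1.0, aperture_dia_m=0.200, pix_arcsec=1.0,
                    optics_efficiency=0.8, time_s=300, QE=0.5)
s300 = pixel_signal(B=1.0, aperture_dia_m=0.300, pix_arcsec=1.0,
                    optics_efficiency=0.8, time_s=300, QE=0.5)
print(s300 / s200)   # 2.25 = (300/200)^2, i.e. the ratio of aperture areas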

The resolution of the two systems will be largely determined by the atmospheric seeing. Ignoring tracking blur and charge diffusion, the total PSF for each system is the convolution of the atmospheric PSF and the optics PSF. Assuming Gaussian approximations apply to the PSFs and that seeing is specified in angular FWHM terms,

FWHMtotal = SQRT(FWHMseeing^2 + FWHMoptics^2)

Now, FWHMoptics = 1.03*lambda/aperturediameter (eg about 0.58 arcsec for the 200mm aperture at lambda = 550nm),

so for the 200mm system in 2 arcsec seeing, FWHMtotal = 2.08 arcsec
and for the 300mm system, FWHMtotal = 2.04 arcsec

ie, there is not much difference in FWHM of the combined PSFs – scope resolution is not a significant factor and both systems are basically seeing limited in 2 arcsec seeing. Since the pixels are 1x1 arcsec in each case, the sampling is also similar at about 2 pixels/FWHM (the ideal seems to be somewhere around 2.5-3, so the images will be slightly undersampled).
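To check those numbers, a short Python sketch (lambda = 550nm is my assumption, consistent with the 0.58 arcsec figure above):

import math

ARCSEC_PER_RAD = 206265.0
wavelength = 550e-9      # m, assumed imaging wavelength
seeing_fwhm = 2.0        # arcsec
pix_scale = 1.0          # arcsec per pixel

for dia in (0.200, 0.300):                                        # aperture diameters (m)
    fwhm_optics = 1.03 * wavelength / dia * ARCSEC_PER_RAD        # diffraction-limited FWHM
    fwhm_total = math.sqrt(seeing_fwhm ** 2 + fwhm_optics ** 2)   # Gaussian quadrature sum
    print(f"{dia*1000:.0f}mm: optics {fwhm_optics:.2f} arcsec, "
          f"total {fwhm_total:.2f} arcsec, "
          f"{fwhm_total / pix_scale:.1f} pixels/FWHM")

# 200mm: optics 0.58 arcsec, total 2.08 arcsec, 2.1 pixels/FWHM
# 300mm: optics 0.39 arcsec, total 2.04 arcsec, 2.0 pixels/FWHM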

The answer would be entirely different if you kept the same camera, since pixangle would then change with focal length. And of course this does not include any consideration of SNR or dynamic range.

regards Ray
