Thread: Spectroscopy
  #318
01-05-2009, 11:14 AM
sheeny (Al)

Not quite, Mark, but you've got the right result.

The dispersion formula for the SA is

D = (10000 x p)/(100 x d)

where D = dispersion in Å/pixel, p = pixel size in µm, d = distance from the SA to the CCD in mm (and the 100 is the l/mm for the SA).
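
If you want to plug your own numbers in, here's a quick Python sketch of that calculation (the function name and example values are mine, not from any particular package):

def dispersion_A_per_pixel(pixel_um, distance_mm, lines_per_mm=100):
    """Dispersion in Angstroms per pixel for a grating of lines_per_mm
    placed distance_mm in front of the CCD, with pixel_um pixels."""
    return (10000.0 * pixel_um) / (lines_per_mm * distance_mm)

# Example: 5.6 um pixels with the SA 99 mm from the CCD
print(dispersion_A_per_pixel(5.6, 99))  # ~5.66 A/pixel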

Resolution is determined by the Rayleigh criterion:

r = 1.22 x L x f/D

where r = radius of the Airy disc image of the star, f = scope focal length, D = diameter of scope (so f/D = FR of scope), and L = wavelength of the light.

For my scope (C8), the theoretical resolution (ignoring seeing and imperfections) is:

r = 9.4 µm for red light at f/10 (4 pixels diameter at 5.6 µm)
r = 5.9 µm for red light at f/6.3 (just over 2 pixels diameter)
r = 3.1 µm for red light at f/3.3 (just over 1 pixel diameter)
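
Here's the same calculation as a small Python sketch. The exact wavelength isn't stated above; a wavelength of about 0.77 µm (deep red) reproduces the figures listed, so I've used that, but treat it as my assumption rather than a stated value:

def airy_radius_um(focal_ratio, wavelength_um=0.77):
    """Rayleigh-criterion radius of the Airy disc: r = 1.22 * wavelength * (f/D)."""
    return 1.22 * wavelength_um * focal_ratio

for fr in (10, 6.3, 3.3):
    print(f"f/{fr}: Airy disc radius = {airy_radius_um(fr):.1f} um")
# prints 9.4, 5.9 and 3.1 um, matching the figures above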

The smaller r is, the easier it is to detect fainter lines, so the "purity" of the spectrum improves. Purity is a function of both resolution and dispersion. I hadn't heard the term purity until I read Kaler.

Obviously, as the FR gets smaller, this limits how far away the SA can be placed from the CCD. For an f/3.3 scope, a full-aperture light cone of, say, 30 mm diameter at the SA comes to focus just 99 mm further on, which limits the dispersion to 5.66 A/pixel (with 5.6 µm pixels). I'm not quite sure what would happen at d > 99 mm in that case; I assume focus could still be achieved, but the spectrum would be dimmed as not all the light from the star would make it through the SA.
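
A minimal sketch of that geometry check, assuming the 30 mm cone and 5.6 µm pixels used above:

cone_mm, focal_ratio, pixel_um = 30, 3.3, 5.6   # assumed values from the post
d_max = cone_mm * focal_ratio                   # cone converges to focus 99 mm past the SA
d_limit = (10000.0 * pixel_um) / (100 * d_max)  # dispersion at that distance, ~5.66 A/pixel
print(d_max, round(d_limit, 2))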



Al.