Old 24-09-2019, 11:13 PM
ericwbenson (Eric)
Hi,

The difference between OSC and mono is a recurrent subject on just about every astro msg board! So some simple math to demonstrate some of the differences:

Imagine 4 adjacent pixels that fit inside the star spot diameter (or nebular feature diameter) all getting the same flux because the object being imaged doesn't vary over such a small angular scale. Let's also imagine the object is white and keep the total imaging time for mono and OSC equal.

The mono imager with a color filter collects 100 photons in each pixel in 1 minute, i.e.
100 100
100 100
so a total of 400 photons in each of R, G and B over three minutes, ignoring the time for filter changes, downloads, etc.

The OSC unit cell is:
RG
GB
Each pixel collects 100 photons in 1 minute, and 300 photons in 3 minutes:
300 300
300 300

Comparing the total number of photons collected in 3 minutes for both cases:
Mono OSC
R 400 300
G 400 600
B 400 300
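
If you want to play with the numbers, here's a quick Python sketch of the same bookkeeping (the 100 photons/pixel/minute rate is just the toy figure used above):

```python
# Toy photon-count comparison for a 2x2 patch of uniformly lit pixels,
# 3 minutes total imaging time for both cameras.
RATE = 100      # photons per pixel per minute (illustrative figure)
MINUTES = 3
PIXELS = 4      # the 2x2 patch

# Mono: one minute per filter, and all 4 pixels see every filter.
mono = {c: RATE * 1 * PIXELS for c in "RGB"}

# OSC Bayer cell RGGB: each pixel integrates its own color the whole time.
bayer_counts = {"R": 1, "G": 2, "B": 1}   # pixels of each color in the cell
osc = {c: RATE * MINUTES * n for c, n in bayer_counts.items()}

print(mono)  # {'R': 400, 'G': 400, 'B': 400}
print(osc)   # {'R': 300, 'G': 600, 'B': 300}
```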

So OSC gives you a better green channel but worse red and blue channels. That might be fine for daylight scenes, but it is generally not what one wants in astrophotography. Almost all CCD/CMOS sensors have weaker QE in the blue and the red, so really the opposite (more red/blue relative to green) is desired, and you can't do anything about it: the OSC filters are permanent, so you'll just keep collecting more green than you want!

It gets worse: the fixed color filters in the Bayer array work by absorption rather than by reflection like the dielectric color filters in a filter wheel. The dielectric filters have a flatter, sharper passband, giving greater color differentiation and letting more light through where you want it (so higher effective QE for the equivalent spectral passband).

Another advantage of mono is a smaller impact from read noise, since you can make your pixels four times the area of the OSC pixels for the same digital resolution limit. Or conversely, for the same read-noise impact you get 2x the resolution.
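
To put a number on that: read noise from pixels combined in software adds in quadrature, so covering one mono-sized resolution element with four small OSC pixels costs four reads instead of one (the read-noise figure below is hypothetical):

```python
import math

READ_NOISE = 5.0   # e- per read, a hypothetical per-pixel figure

# Mono with pixels of 4x the area: one read per resolution element.
mono_noise = READ_NOISE

# OSC at the same resolution limit: 4 small pixels must be combined,
# so 4 independent reads add in quadrature.
osc_noise = math.sqrt(4) * READ_NOISE

print(mono_noise, osc_noise)  # 5.0 10.0
```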

The penalty for mono is 2 extra filter changes and downloads for every OSC exposure, so the individual exposures need to be long enough to swamp this extra time cost.
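
As a quick sanity check on that overhead (the filter-change and download times below are made-up figures; substitute your own gear's numbers):

```python
FILTER_CHANGE = 10   # seconds per change, hypothetical
DOWNLOAD = 5         # seconds per frame download, hypothetical

def mono_overhead_fraction(exposure_s):
    """Extra dead time for 3 mono frames (R, G, B) vs 1 OSC frame,
    as a fraction of the total exposure time."""
    extra = 2 * FILTER_CHANGE + 2 * DOWNLOAD   # 2 extra changes + 2 extra downloads
    return extra / (3 * exposure_s)

print(round(mono_overhead_fraction(60), 3))    # 1-minute subs: noticeable
print(round(mono_overhead_fraction(600), 3))   # 10-minute subs: negligible
```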

Of course, putting a narrowband filter over an OSC camera is a disaster for the collected signal, since you are now looking through two filters simultaneously all of the time. You still have to do everything a mono imager does (multiple filter changes and downloads) but with <1/4 of the Halpha and SII signals and about 1/2 of the OIII (the lines together fall on 3/4 of the pixels, with a non-optimized Bayer response on top).
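
The geometric part of that loss is easy to tabulate; note this counts only which Bayer pixels even see each line, before the further loss from the non-optimized dye passbands:

```python
# Fraction of pixels of each color in the RGGB Bayer cell.
bayer = {"R": 1 / 4, "G": 2 / 4, "B": 1 / 4}

# Which pixels each narrowband emission line lands on:
signal_fraction = {
    "Halpha": bayer["R"],   # 656 nm falls on the red pixels only
    "SII":    bayer["R"],   # 672 nm, likewise
    "OIII":   bayer["G"],   # ~500 nm lands mainly on the green pixels
}
print(signal_fraction)  # {'Halpha': 0.25, 'SII': 0.25, 'OIII': 0.5}
```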

Another problem with OSC is stacking. Before stacking, multiple frames must be aligned, which involves interpolating from a group of pixels in the source frame to a slightly shifted pixel in the destination frame. OSC raw frames must first be converted to normal images (i.e. debayered) before this can happen, and during the interpolation the noise of a pixel of one color can leak into the neighbouring pixel of another color (e.g. the weak blue pixels leak their noise into the brighter green pixels). Once the noise has been mixed between adjacent pixels, stacking cannot recover the true signal as easily as it can for mono.
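
You can see the mixing with a toy 1-D example: shift a row of alternating bright/dim pixels by half a pixel using linear interpolation, and every output sample becomes a 50/50 blend of the two colors:

```python
# A 1-D row of debayered values: bright green pixels interleaved with
# dim blue pixels (toy numbers, no noise shown; noise blends the same way).
row = [100.0, 10.0, 100.0, 10.0, 100.0, 10.0]

# Align with a half-pixel shift via linear interpolation of neighbours:
shifted = [(row[i] + row[i + 1]) / 2 for i in range(len(row) - 1)]
print(shifted)  # [55.0, 55.0, 55.0, 55.0, 55.0]
```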

The final major advantage of mono is the ability to take luminance frames. At full resolution and with at least 3x the signal of an individual R, G or B frame, the LRGB technique can substantially reduce the time needed for a nice color image of a deep sky object. My guess here is about a 2x time reduction for faint objects. NB: there is usually no benefit to LRGB for brighter objects, like star clusters, since your sensor is not signal starved per subexposure.
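
As a toy illustration of the luminance advantage (the rates are made-up figures; the real gain depends on your filters and sensor QE):

```python
# Hypothetical per-minute signal through one color filter vs luminance.
rgb_rate = 100           # photons/min through a single color filter (toy value)
lum_rate = 3 * rgb_rate  # luminance passes roughly the whole RGB band at once

# Time to reach a target signal in the detail-carrying channel:
target = 1200
t_rgb_only = target / rgb_rate   # detail from a single color channel
t_lrgb = target / lum_rate       # detail from a luminance frame instead
print(t_rgb_only, t_lrgb)        # 12.0 4.0 (minutes)
```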

Regards,
EB