
View Full Version here: Colour photos from LRGB frames


Robert9
23-12-2015, 08:21 PM
Hi,
A question on mono LRGB photography.
Colour filter arrays on the sensor of colour cameras appear to be arranged in blocks of 4 pixels, each block containing 2×G, 1×R and 1×B.
Why are there twice as many G as there are R and B? Is it related to the human eye being at maximum sensitivity in G? Or, alternatively, are the pixels less sensitive to G and therefore in need of more exposure? To reproduce this colour balance, should I be taking, for example, 200 frames of G and 100 frames each of R and B when making mono CCD photos with my mono camera?
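[As a side note, the 2:1:1 layout Robert describes is the common RGGB Bayer pattern. A minimal Python sketch, using a hypothetical 4×4 sensor mosaic, shows how the per-colour pixel counts fall out of tiling that 2×2 block:]

```python
import numpy as np

# Hypothetical 4x4 sensor tiled with the common RGGB Bayer pattern:
# each 2x2 block holds one R, two G and one B filter site.
pattern = np.array([["R", "G"],
                    ["G", "B"]])
mosaic = np.tile(pattern, (2, 2))  # 4x4 colour filter array

# Count the filter sites of each colour across the mosaic.
counts = {c: int((mosaic == c).sum()) for c in "RGB"}
print(counts)  # {'R': 4, 'G': 8, 'B': 4} - twice as many G sites
```

Whatever the sensor size, tiling this block always yields a 1:2:1 ratio of R:G:B sites.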
Robert

Atmos
23-12-2015, 08:39 PM
Most sensors are most sensitive around the yellow/green part of the spectrum, so it isn't a sensitivity issue. You are correct: it is basically a matter of trying to best simulate the way our eyes perceive light. OSC cameras and colour film have always been built around the idea of capturing an image whose colour is as close as possible to what we see with our eyes.

Robert9
24-12-2015, 09:58 AM
Colin,
Thanks for that info. In terms of mono LRGB astrophotography then, should one be taking twice as many green frames as red and blue? Or does one just rely on post-processing to achieve the correct (if there is such a thing :lol:) colour balance?
Robert

Atmos
24-12-2015, 10:01 AM
You don't need to bother with twice as many green frames, as you deal with it all when doing the white balance. With LRGB, more colour frames mostly just give a better SNR, but that is largely what the L is for anyway.
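[The white-balance step Atmos refers to amounts to applying a per-channel scale factor in post. A minimal Python sketch, with purely illustrative gain values (in practice they would be derived from a neutral reference such as a G2V star or grey card):]

```python
import numpy as np

# Hypothetical stacked R, G and B master frames (tiny arrays for illustration),
# representing a neutral grey target that came out with a colour cast.
r = np.full((2, 2), 100.0)
g = np.full((2, 2), 150.0)
b = np.full((2, 2),  80.0)

# White balance: scale each channel so the neutral target reads equal in all
# three. Gains are illustrative, not from any real calibration.
gains = {"R": 1.5, "G": 1.0, "B": 1.875}
r_bal = r * gains["R"]
g_bal = g * gains["G"]
b_bal = b * gains["B"]

print(r_bal[0, 0], g_bal[0, 0], b_bal[0, 0])  # 150.0 150.0 150.0
```

Because the balance is a simple multiplicative rescale, shooting extra green frames would change only the green channel's noise, not its final brightness.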

Robert9
24-12-2015, 12:01 PM
Great. Thanks Colin.