Quote:
Originally Posted by Peter.M
The DSLR is made to match the human eye on Earth, and unfortunately a sensor does not discriminate between NIR and visible photons. Anything with a temperature will emit black-body radiation in the NIR, and this is what the camera's filter is there to block (otherwise our photos would all look like they were taken from a COPS helicopter). Unfortunately the filter does not stop sharply where NIR becomes visible, and this causes low QE at the longer wavelengths.
The human eye perceives red and blue relatively poorly, as shown here:
http://en.wikipedia.org/wiki/File:Eyesensitivity.png
That just proves that if there is plenty of blue & red in the DSLR histogram, then those colours really are quite bright, because the eye under-responds to them.
In fact they are even brighter than the histogram suggests, as the Bayer matrix has 2 greens for every red or blue pixel, so red & blue are sampled at only half the rate of green.
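To make that 2:1 sampling concrete, here is a quick numpy sketch. The RGGB layout is just the common case (real sensors may use BGGR, GRBG, etc.), but every 2x2 Bayer cell has two greens:

Code:
import numpy as np

def bayer_counts(height, width):
    """Count the R, G and B samples in an RGGB mosaic of the given size."""
    # Build one 2x2 RGGB tile and repeat it across the sensor.
    tile = np.array([["R", "G"],
                     ["G", "B"]])
    mosaic = np.tile(tile, (height // 2, width // 2))
    return {c: int(np.count_nonzero(mosaic == c)) for c in "RGB"}

print(bayer_counts(4, 4))   # {'R': 4, 'G': 8, 'B': 4} -> green sampled twice as often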
I think the processing theory is correct if a DSLR palette is used & the dynamic range of all the colours is stretched to dig out that little bit more detail than the eye can see.
Also you can get a situation where, if the Ha narrowband is used for luminance, the colours end up all wrong.
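To see why, here is a rough sketch of a simple mean-based LRGB-style combine (just one common way of applying a luminance layer, and the pixel values are invented). A region that is bright in blue/OIII but dark in Ha gets crushed when the Ha frame sets the brightness:

Code:
import numpy as np

def lrgb_combine(rgb, lum, eps=1e-6):
    """Rescale each RGB pixel so its mean brightness matches the luminance frame."""
    current = rgb.mean(axis=-1, keepdims=True)
    return np.clip(rgb * lum[..., None] / (current + eps), 0.0, 1.0)

# One pixel from a blue reflection nebula: strong blue, almost no Ha emission.
rgb = np.array([[[0.2, 0.3, 0.8]]])       # clearly blue in broadband colour
broadband_l = np.array([[0.45]])          # broadband luminance ~ its real brightness
ha_l = np.array([[0.05]])                 # Ha frame: nearly dark at this spot

print(lrgb_combine(rgb, broadband_l))     # blue pixel stays bright
print(lrgb_combine(rgb, ha_l))            # same hue ratios, but crushed to near-black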
Certainly if we look at the Hubble Palette, the 3-colour narrowband photo is deliberately mapped into false colour to show detail that would otherwise be unseen.
In the same way, microwave & radio pictures can be built up with false colour.
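As a rough sketch of that false-colour idea, here is the standard SHO (Hubble Palette) mapping of SII, Ha and OIII onto the red, green and blue channels. Random arrays stand in for real calibrated frames:

Code:
import numpy as np

def hubble_palette(sii, ha, oiii):
    """Stack three narrowband frames into an RGB cube: SII->R, Ha->G, OIII->B."""
    return np.stack([sii, ha, oiii], axis=-1)

rng = np.random.default_rng(0)
sii, ha, oiii = (rng.random((4, 4)) for _ in range(3))
rgb = hubble_palette(sii, ha, oiii)
print(rgb.shape)   # (4, 4, 3): an ordinary RGB image built from non-visible-colour data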