#35
Old 24-06-2015, 03:59 PM
rustigsmed (Russell)
Quote:
Originally Posted by Shiraz
Hi Rusty.

The job of the Bayer filter is to absorb about 2/3 of the photons that hit it. Since the majority of the photons never make it to the underlying detector array, the sensor as a whole must have a QE below about 0.4, even before taking into account whatever the detectors themselves do. This is a huge hit to QE, but it is necessary to encode the colour data.

The Bayer filter also mucks up the luminance data, but debayering can fairly effectively disentangle lum from colour (essentially by smart guesswork), although there is still some loss of resolution and fidelity in the process (but not a whole lot).
Yep, thanks Ray.

I understand that not all pixels are being utilised, which is why I was visualising it as a resolution issue rather than a QE issue.
I was looking at it as though it were really three separate 'mini mono sensors' in one (stuck together), each with a filter in front: the blue and red each being 25% of the 'total' sensor size and the green being 50%.
Thinking of it this way, only photons of the matching colour would register on each one, just as a mono camera with an R, G or B (or narrowband) filter in front only receives those photons.
I.e. when a 'red photon' hits a mono camera with a blue filter, there isn't much difference from a DSLR, except that the DSLR is 'really' made up of three really tiny sensors. Am I going crazy?
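For what it's worth, the arithmetic behind both views (the roughly one-third overall throughput Shiraz mentions, and the 25/50/25 split in the 'mini sensor' picture) can be sketched in a few lines of Python. The numbers are idealised assumptions, not real filter curves:

```python
# Toy model of an RGGB Bayer mosaic seen as three interleaved
# "mini mono sensors". Idealised assumptions only: each filter
# passes exactly its own third of a spectrally flat light source,
# and the detector QE behind the filters is ignored.

coverage = {"R": 0.25, "G": 0.50, "B": 0.25}  # pixel-area share per colour
band = 1 / 3                                  # each band's share of incoming photons

# Every pixel passes only its own band, so across the whole sensor
# roughly one third of the incoming photons reach the detectors:
overall = sum(share * band for share in coverage.values())
print(f"fraction of photons reaching detectors: {overall:.2f}")  # ~0.33

# Each colour's photons are detected only on that colour's pixels,
# so each "mini sensor" covers this fraction of the full sensor area:
for colour, share in coverage.items():
    print(f"{colour}: {share:.0%} of the sensor area")
```

(Real Bayer passbands overlap and don't split the spectrum into neat thirds, so treat this as a back-of-envelope check, not a measurement.)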