Quote:
Originally Posted by RickS
It can't be... two thirds of the data is being "thrown away."
The data doesn't actually get thrown away; you are mathematically recreating data that wasn't captured.
For each physical pixel you create a composite coloured pixel from its neighbours:
R + (surrounding G, G, B) = composite coloured pixel
G + (surrounding R, G, B) = composite coloured pixel
B + (surrounding G, G, R) = composite coloured pixel
G + (surrounding R, G, B) = composite coloured pixel
In effect you are binning 2x2 (see the sketch below).
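Just to make that concrete, here is a rough Python/NumPy sketch of that superpixel (2x2 binning) approach. It is not any camera's actual pipeline, only an illustration; the array name `bayer` and an RGGB layout (red photosite in the top-left corner) are my assumptions.

Code:
import numpy as np

def superpixel_demosaic(bayer):
    """Collapse each 2x2 RGGB cell into one full-colour pixel (half resolution)."""
    r  = bayer[0::2, 0::2].astype(np.float32)   # red sites
    g1 = bayer[0::2, 1::2].astype(np.float32)   # first green site in each cell
    g2 = bayer[1::2, 0::2].astype(np.float32)   # second green site in each cell
    b  = bayer[1::2, 1::2].astype(np.float32)   # blue sites
    g  = (g1 + g2) / 2.0                        # average the two greens
    return np.dstack([r, g, b])                 # (H/2) x (W/2) x 3 colour image

# Example with a fake 4x4 raw frame
raw = np.arange(16, dtype=np.uint16).reshape(4, 4)
print(superpixel_demosaic(raw).shape)           # (2, 2, 3)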
From Wikipedia:
Bryce Bayer's patent (U.S. Patent No. 3,971,065) in 1976 called the green photosensors luminance-sensitive elements and the red and blue ones chrominance-sensitive elements. He used twice as many green elements as red or blue to mimic the physiology of the human eye. The retina has more rod cells than cone cells and rod cells are most sensitive to green light. These elements are referred to as sensor elements, sensels, pixel sensors, or simply pixels; sample values sensed by them, after interpolation, become image pixels.
The raw output of Bayer-filter cameras is referred to as a Bayer pattern image. Since each pixel is filtered to record only one of three colors, the data from each pixel cannot fully determine color on its own. To obtain a full-color image, various demosaicing algorithms can be used to interpolate a set of complete red, green, and blue values for each point.
Different algorithms requiring various amounts of computing power result in varying-quality final images. This can be done in-camera, producing a JPEG or TIFF image, or outside the camera using the raw data directly from the sensor.
The only time you throw data away is if you are doing narrowband imaging with an OSC. Typically that means Ha imaging, where you only use the red-filtered pixels and hence throw away all the green and blue pixels.
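For what it's worth, pulling just the red-filtered sites out of an RGGB mosaic for Ha work might look like the sketch below; again this is only an illustration under the same assumed layout, not taken from any particular capture or processing software.

Code:
import numpy as np

def extract_red_sites(bayer):
    """Keep only the red-filtered photosites of an RGGB mosaic (quarter resolution)."""
    return bayer[0::2, 0::2]   # one red site per 2x2 cell; the G and B sites are discarded

raw = np.zeros((8, 8), dtype=np.uint16)
print(extract_red_sites(raw).shape)   # (4, 4) -- only 1/4 of the photosites are used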