As someone who took their first shot of the Moon just two nights ago, I don't have any technical background in how colour balance is altered or how it differs between equipment, but I think there are some points that are fairly basic if you are talking about images as perceived by the human eye.
If you were to take an infrared image of an object, how would you get the colour balance correct? The only "reasonable" answer would be totally black, since that is all the eye would perceive.
If you take an image in the visible band with an imager that lacks sensitivity in one area, say red, should you push the red to the point that it makes up for the imbalance? The image, after all, is a "true representation" of what the imaging equipment saw (more or less).
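Just to make the "pushing the red" idea concrete, here is a minimal sketch in Python with NumPy of what a simple per-channel gain looks like. The sensitivity figure and gain values are invented purely for illustration, not taken from any real camera.

```python
import numpy as np

# Hypothetical example: an RGB image from a sensor that is only ~60% as
# sensitive in red as in green and blue. All numbers here are invented
# for illustration, not taken from any real camera.
rng = np.random.default_rng(0)
image = rng.uniform(0.0, 1.0, size=(4, 4, 3))   # float RGB values in [0, 1]
image[..., 0] *= 0.6                             # simulate the weak red response

# "Pushing the red": multiply each channel by a gain so the weak channel
# catches up. Whether 1/0.6 is the "correct" gain is exactly the
# subjective question discussed here.
gains = np.array([1.0 / 0.6, 1.0, 1.0])
balanced = np.clip(image * gains, 0.0, 1.0)

print("mean R,G,B before:", image.mean(axis=(0, 1)))
print("mean R,G,B after: ", balanced.mean(axis=(0, 1)))
```

Whether you scale the weak channel up, or leave the image as the camera recorded it, is the choice being debated above.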
Because we don't really perceive colour in most of these objects at the eyepiece, it is difficult to decide what is correct. I suppose we go by what we see in books, and therefore by what we expect to see.
I think that unless you are making an observation for scientific analysis, the correct colour balance is the one that produces a pleasing result. Unfortunately, that is all very subjective. Of course, if you took a shot of M42 and "deleted" the red channel, you would get a result that most people (including me) would call wrong. If it looks "natural", you can't be too far off. If it looks like you have been tripping out on LSD, you probably have it wrong!