long post

OK, have I got this right? Based on my reading to date:

If colour management is used:

1. The ICC profile embedded in an image file describes the colour space the data is encoded in - i.e. the colour characteristics of the sensor (and image-processing system) at the time the image was generated.
2. When the data is to be displayed (or printed), the image data is transformed into the colour space of the display (or printer) using the information in that device's ICC profile. The transform therefore involves two things: the characteristics of the display, which come from the display profile, and the gamut the data is encoded in, which must be carried over from the original file header (for a true representation of the original data, that source gamut should be as close as possible to the sensor's). The display profile is used to establish LUTs that translate between numerical image values and screen pixel brightness, so that what shows on the screen is a true representation of the image; the LUTs can reside in the monitor, in the PC display hardware/software, or possibly a bit of both. A sketch of the conversion follows below.
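
To make point 2 concrete, here is a minimal sketch in Python using Pillow's ImageCms module (a wrapper around littleCMS). The file names are placeholders, and the display profile is assumed to have come from a calibrator:

[CODE]
import io
from PIL import Image, ImageCms

im = Image.open("photo.jpg")  # placeholder file name

# Point 1: the source profile travels in the file header. Fall back to
# sRGB if the file carries no embedded profile.
icc_bytes = im.info.get("icc_profile")
if icc_bytes:
    src_profile = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
else:
    src_profile = ImageCms.createProfile("sRGB")

# The display profile characterises the monitor (ideally produced by a
# hardware calibrator). "my_display.icc" is a placeholder.
dst_profile = ImageCms.ImageCmsProfile("my_display.icc")

# Transform the image data from the source colour space into the
# display's colour space.
im_display = ImageCms.profileToProfile(im, src_profile, dst_profile)
[/CODE]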

This process means that the final display should be similar in dynamic range and colour to the original scene (within the limitations of the display technology).
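
As an aside on the LUTs mentioned in point 2: what such a LUT tabulates is a tone curve relating numerical values to light output. As a standard example, here is the sRGB transfer function in Python - just a sketch, not any particular display's curve:

[CODE]
def srgb_decode(encoded: float) -> float:
    """Map an sRGB-encoded value (0-1) to linear light (0-1).

    Standard sRGB transfer function: linear below a small threshold,
    a 2.4 power law above it.
    """
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

def srgb_encode(linear: float) -> float:
    """Inverse: map linear light back to an sRGB-encoded value."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

# An 8-bit LUT just tabulates the curve, e.g. encoded value -> linear drive:
lut = [srgb_decode(i / 255) for i in range(256)]
[/CODE]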

However:

Windows supports colour management but does not require it - some Windows software uses it, some doesn't. If colour management is not used, what you see on the screen is pot luck: it depends on how well the sensor's characteristics happen to align with those of the display system.

If colour management is used but the display does not have a properly defined ICC profile, a default profile may be assigned, based on an sRGB gamut and a "typical" LCD display. That can be a fair way from a true rendition of the colours - the only way to get a true profile for your display is to use a hardware calibrator.
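
On the default-profile point, Pillow can ask the OS which profile is assigned to the display (get_display_profile() is marked experimental and, as far as I know, only returns something useful on Windows), so you can check whether a real calibration or just a generic assumption is in play:

[CODE]
from PIL import ImageCms

# Ask the OS for the profile assigned to the current display; this can
# return None (e.g. on non-Windows platforms).
profile = ImageCms.get_display_profile()
if profile is None:
    # No real profile available - fall back to the generic sRGB
    # assumption discussed above.
    profile = ImageCms.ImageCmsProfile(ImageCms.createProfile("sRGB"))

print(ImageCms.getProfileDescription(profile))
[/CODE]
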
The sRGB gamut is widely used (notably on the internet) and is generally quite effective at reproducing most real colours on video displays. It was originally developed to describe typical CRT displays, and it covers less colour space than the NTSC or AdobeRGB gamuts (e.g. about 72% of NTSC). LCD screens without colour management may only very roughly approximate CRT characteristics - particularly some laptop displays, where power is limited and the unfiltered LED backlight is used as white regardless of its colour temperature. A generic sRGB assumption may therefore be of limited use.
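
That coverage figure is easy to sanity-check from the published CIE xy chromaticities of the primaries - a crude 2-D triangle-area comparison (ignoring luminance) lands at about 71%, in line with the usual 72% quote:

[CODE]
def triangle_area(pts):
    """Shoelace formula for the area of a chromaticity triangle."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# CIE xy chromaticities of the red, green and blue primaries.
srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
ntsc_1953 = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]

print(f"{triangle_area(srgb) / triangle_area(ntsc_1953):.0%}")  # ~71%
[/CODE]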

As far as I can tell, this means that what I see when I finish processing an image may be a fair bit different from what someone else sees when viewing it over the internet - the chances that the colour and brightness on my screen will exactly match those on any other screen are remote.

The only exception would be if my data were prepared on a properly calibrated system that supports colour management, saved with a standard gamut, and then viewed on another calibrated, colour-managed system using the same gamut. The most widely used gamut is sRGB, so that is the one to use where possible when transmitting images over the WWW.
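
For that last step, a minimal save-for-web sketch with Pillow: convert the pixel data to sRGB and embed the sRGB profile so a colour-managed viewer at the other end knows what it received (file names are placeholders):

[CODE]
import io
from PIL import Image, ImageCms

im = Image.open("processed.tif")  # placeholder file name

# Source profile from the file header if present, else assume sRGB.
icc = im.info.get("icc_profile")
src = ImageCms.ImageCmsProfile(io.BytesIO(icc)) if icc else ImageCms.createProfile("sRGB")

# Convert the pixel data into the sRGB gamut...
srgb = ImageCms.ImageCmsProfile(ImageCms.createProfile("sRGB"))
im_srgb = ImageCms.profileToProfile(im, src, srgb, outputMode="RGB")

# ...and embed the sRGB profile in the saved file.
im_srgb.save("for_web.jpg", icc_profile=srgb.tobytes(), quality=90)
[/CODE]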

I'd be very grateful if anyone who really understands this stuff would correct the above. Thanks, ray
