On Fri, 02 Feb 2001 22:16:33 -0800 Tom Christiansen (tomchr@softhome.net)
wrote:
> Thanks for the explanation. Now I just need to get some kind of idea of a
> good value for gamma. Windows defaults to 2.2. My scanner software defaults
> to 1.4. If I change the scanner software to gamma=2.2 images look WAY too
> bright... Why the difference?
Probably your monitor adjustment and calibration.
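For a rough feel for why a higher gamma setting brightens the scan, here's a
quick Python sketch (it assumes the scanner simply applies a 1/gamma power-law
encode to linear data - real scanner software may well do more than that):

    # Encode a linear 50% grey at two gamma settings, assuming a plain
    # power-law encode of value ** (1/gamma).
    for gamma in (1.4, 2.2):
        midtone = 0.5 ** (1.0 / gamma)
        print("gamma %.1f -> encoded midtone %.2f (8-bit %d)"
              % (gamma, midtone, round(midtone * 255)))
    # gamma 1.4 -> encoded midtone 0.61 (8-bit 155)
    # gamma 2.2 -> encoded midtone 0.73 (8-bit 186)

A jump of 30-odd levels in the midtones is exactly the 'WAY too bright' you
describe, unless the rest of the chain is actually expecting 2.2-encoded data.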
A 'good value' is a difficult question which has generated vast amounts of
argument. Unity gives equal precision to all values, which might seem a good
idea from the data's point of view (and Timo Autiokari's), but our perceptual
needs are more idiosyncratic: we are very sensitive to brightness variations in
the midtones and shadows, and much less so to differences in the highlights, so
precision in the former is usually more important. The optimum gamma tends to
vary with the type of image and with what matters to the photographer. Any
value is a trade-off.
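To put some numbers on the precision point (again just a sketch, assuming a
plain power-law encode and 8-bit storage; the helper name is mine, purely for
illustration):

    # Count how many of the 256 8-bit codes describe the darkest 1/8th of
    # the linear range at a given encoding gamma.
    def codes_in_shadows(gamma, threshold=1.0 / 8.0):
        count = 0
        for code in range(256):
            linear = (code / 255.0) ** gamma   # decode back to linear light
            if linear <= threshold:
                count += 1
        return count

    for gamma in (1.0, 1.8, 2.2):
        print("gamma %.1f: %3d codes cover the darkest 1/8th of linear range"
              % (gamma, codes_in_shadows(gamma)))
    # gamma 1.0:  32 codes cover the darkest 1/8th of linear range
    # gamma 1.8:  81 codes cover the darkest 1/8th of linear range
    # gamma 2.2: 100 codes cover the darkest 1/8th of linear range

i.e. the higher the encoding gamma, the more of your 256 levels are spent where
the eye is fussiest, at the cost of coarser steps in the highlights.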
To be more practical: Macs use a system gamma of 1.8 and are historically the
de facto standard in imaging, so 1.8 is a 'good value' if you are working with
designers who will be using Macs. However, Windows machines are far more
numerous; if you are aiming at a cross-platform medium such as the WWW, 2.2 is
probably the safer assumption.
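If you do have to move material between the two, the conversion itself is
simple enough in principle. A minimal sketch, assuming plain power-law
encodings and no colour management getting in the way (the regamma() helper is
just my name for it):

    # Re-target an 8-bit value from one encoding gamma to another,
    # e.g. Mac 1.8 to PC/WWW 2.2: decode to linear, then re-encode.
    def regamma(value, gamma_from=1.8, gamma_to=2.2):
        linear = (value / 255.0) ** gamma_from
        return int(round(linear ** (1.0 / gamma_to) * 255))

    print(regamma(128))   # a 1.8-encoded midtone of 128 comes out around 145

In practice a colour-managed workflow does this (and more) for you, but it
shows why the same file looks darker on a PC than on a Mac if nothing converts
it.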
There's some stuff on my site about all this, and it's well worth reading Prof
Charles Poynton's Gamma FAQ (link at my site). But don't be too taken in by it:
he is talking only about video systems and optimal presentation, whereas we
also have to worry about preserving data precision through repeated editing
operations.
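A trivial illustration of that last point (nothing to do with Poynton's maths,
just 8-bit rounding): apply a small gamma tweak and its inverse a few times and
see how many distinct levels survive.

    # Round-trip all 256 levels through a gamma edit and its inverse a few
    # times, quantising to 8 bits each time, and count what's left distinct.
    levels = list(range(256))
    for _ in range(5):
        levels = [int(round((v / 255.0) ** (1.0 / 1.2) * 255)) for v in levels]
        levels = [int(round((v / 255.0) ** 1.2 * 255)) for v in levels]
    print("%d of 256 levels remain distinct" % len(set(levels)))
    # prints fewer than 256 - some levels have merged for good

That sort of cumulative loss is what I mean by worrying about data precision
through repeated edits.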
Regards
Tony Sleep
http://www.halftone.co.uk - Online portfolio & exhibit; + film scanner info &
comparisons