Hi all,
I was reading the book Perfect Exposure today (Hicks & Schultz) and read the
section on film testing. It struck me that for people like myself who have got
rid of the wet darkroom, a method of calibrating a personal film speed and
development could be built around scanning negs.
What it suggests as a simple test is shooting some frames of black velvet, an
18% grey card and a sheet of matt photographic paper (fixed and washed),
bracketed around your meter reading. Then:
"the white paper should be detectable lighter than a piece of fogged and
developed film, and the black velvet should be a good bit darker than the
film base plus fog. If you have a densitometer, you can take actual
readings. The black velvet should be at least 0.10 log units darker than
fb+f and the white paper should have a density of around 1.1"
Now ideally I'd see a piece of software scan in a blank frame to get the fb+f
value, and a totally overexposed frame to get a maximum density value. At
that point a frame of the velvet/18%/paper target can be scanned in and some
small sample areas of each tone identified.
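A rough sketch of what I have in mind, in Python with NumPy and PIL. Everything
here is my own guesswork, not from the book: the file names and patch
coordinates are placeholders, I'm assuming linear 16-bit greyscale scans with
all scanner corrections switched off, and the no-film reference scan for the
zero-density point is my addition.

import numpy as np
from PIL import Image

def mean_value(path, box=None):
    # Mean raw pixel value of a scan, optionally within a (left, top, right, bottom) box
    img = Image.open(path)
    if box:
        img = img.crop(box)
    return float(np.asarray(img, dtype=np.float64).mean())

def density(value, clear_value):
    # Transmission density relative to a scan of the empty light path (assumption of mine)
    return float(np.log10(clear_value / value))

clear = mean_value("empty_holder.tif")                    # no film in the light path
fbf   = density(mean_value("blank_frame.tif"), clear)     # film base + fog
dmax  = density(mean_value("overexposed_frame.tif"), clear)  # fully exposed frame

# Small patches of velvet / 18% grey / paper, picked by eye from a preview scan
patches = {
    "velvet": (100, 100, 160, 160),
    "grey18": (300, 100, 360, 160),
    "paper":  (500, 100, 560, 160),
}
for name, box in patches.items():
    d = density(mean_value("test_frame_0.tif", box), clear)
    print(name, "density:", round(d, 2))
print("fb+f:", round(fbf, 2), " Dmax:", round(dmax, 2))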
The software can then scan in the rest of the frames and determine which one
has the greatest contrast without blowing out either end. Another test could
quantify the 'graininess' of each patch by examining the distribution of
values in the area, so different developers or under- and over-development
could be studied.
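Picking the frame might then look something like this. Again only a sketch: it
reuses the helpers and values from the fragment above, the 0.10 margins are
just borrowed from the book's densitometer figure, and standard deviation is
my stand-in for "graininess".

def patch_stats(path, box, clear_value):
    # Mean density of a patch plus the spread of values, used as a graininess proxy
    vals = np.asarray(Image.open(path).crop(box), dtype=np.float64)
    dens = np.log10(clear_value / np.maximum(vals, 1.0))
    return dens.mean(), dens.std()

frames = ["test_frame_%d.tif" % i for i in range(7)]   # the bracketed exposures
best, best_range = None, 0.0
for frame in frames:
    velvet_d, velvet_g = patch_stats(frame, patches["velvet"], clear)
    paper_d,  paper_g  = patch_stats(frame, patches["paper"],  clear)
    print(frame, "graininess velvet/paper:", round(velvet_g, 4), round(paper_g, 4))
    # Skip frames where either end is lost: velvet sitting on fb+f,
    # or paper pushed up against Dmax; then keep the widest spread
    if velvet_d > fbf + 0.10 and paper_d < dmax - 0.10:
        if paper_d - velvet_d > best_range:
            best, best_range = frame, paper_d - velvet_d
print("best frame:", best, "density range:", round(best_range, 2))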
When the best frame has been identified, the 18% grey section could be used
for a rudimentary gamma curve… or, even better, shoot one of those Kodak
20-odd-step monochrome grey scales in the frame to get a more accurate version.
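For the step wedge, something like this could turn the patch densities into a
crude curve. Still just a sketch: I'm assuming a 21-step tablet in 0.15 density
increments, and step_boxes is a list of patch coordinates picked by hand from a
preview scan, ordered from the clearest tablet step to the densest.

# Relative log exposure for each step (each step passes 0.15 log less light)
rel_log_e = np.array([-0.15 * i for i in range(21)])
step_dens = np.array([patch_stats("step_wedge_frame.tif", box, clear)[0]
                      for box in step_boxes])

# Straight-line fit through the middle steps gives an average gradient,
# a rough stand-in for gamma
mid = slice(5, 16)
gradient, intercept = np.polyfit(rel_log_e[mid], step_dens[mid], 1)
print("approximate average gradient:", round(float(gradient), 2))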
I imagine auto-exposure on the scanner could be a problem with this; it would
need to be locked or switched off so the readings are comparable between frames.
Any comments?
Regards
Ned