Filmscanners mailing list archive (filmscanners@halftone.co.uk)
Re: filmscanners: Sharpening scanned images for printing
To Austin:
I think Harvey's point is that there may come a situation where someone
wants a sharp scan of a blurry image. Why not, it's art! ;-)
Austin wrote:
> You must be referring to color. I only talk about B&W, and there is no
> "inherent flaw" in scanning B&W, if you do not scan B&W in RGB. The
> "inherent flaw" you speak of is simply bloom and smear, which isn't really a
> "flaw" but a characteristic of how CCDs respond to different wavelengths of
> light.
I think on one level there can be no doubt a CCD film scanner (I don't know
enough about drum scanners to say) will lose sharpness, color or B&W. It has
to; it's an analog generational loss. Light passes through the film and gets
scattered, then it passes through a lens which may introduce flare,
diffraction, and aberration, then it hits the CCD, which is prone to blur,
smear, and blooming, and finally the electronics introduce noise. It's
analog, so the question can really only be "how much" is lost, not "if".
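If you want to put a rough number on the "how much" (and this is just an
illustration with made-up figures, not measurements of any real scanner),
one common way is the modulation transfer function: at a given spatial
frequency the system response is roughly the product of the per-stage
responses, each of which is at most 1.0, so every analog stage can only
take contrast away. A quick Python sketch:

# Rough MTF illustration at one spatial frequency -- the stage values
# below are made up purely for illustration, not measurements.
film, lens, ccd, electronics = 0.80, 0.90, 0.70, 0.95
system = film * lens * ccd * electronics
print(round(system, 2))  # 0.48 -- lower than the weakest single stage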
The whole question about sharpness occurs to me as "how sharp is sharp"? Is
a file at the right degree of sharpness when a print from it is as sharp as
a traditional darkroom print? Is that a contact print or an enlarged print?
Cold light head or point source? Should the image at 100% magnification on
screen look as sharp as the film through a 100x microscope? Or, is the right
sharpness as sharp as one can make it in Photoshop before offensive
artifacting occurs? Even if that makes it appear sharper than the original?
On screen or in print? What output: film recorder, offset press, or Epson?
If the output process softens an image, is it fair to oversharpen in
anticipation?
These are rhetorical questions which I pose purely to establish that
sharpening is a choice, a variable, not a rule.
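Just to make the "variable, not a rule" point concrete, here is a minimal
sketch of the classic unsharp mask in Python with NumPy/SciPy (nothing
specific to any scanner software, and the radius/amount defaults below are
arbitrary assumptions):

import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, radius=1.0, amount=0.5):
    # Add back a scaled copy of the detail (original minus a blurred
    # copy). Radius and amount are the knobs -- there is no single
    # "correct" setting, only what suits the image and the output.
    blurred = gaussian_filter(image.astype(float), sigma=radius)
    detail = image - blurred
    return np.clip(image + amount * detail, 0, 255).astype(np.uint8)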
To Harvey, who wrote:
>> Then why do (real) hi bit scans require less sharpening than low
>> bit scans?
Harvey, is it possible that by and large (certainly more so in the past than
today) the higher bit scanners have been the higher quality scanners? I mean
high bit used to come at a steep price, and from quality components. It
still does for "real" bit depth as you put it, by which I think you mean
extended dynamic range.
To Austin who wrote:
> I don't know (nor do I believe at this point in time, at least for B&W) that
> they do. Perhaps you can explain why you believe they do. For B&W, it is
> entirely counterintuitive that they would.
I also don't know if it's true or not. I think you are right that contrast
is a condition for the perception of sharpness, and I don't know that low
bit depth or high bit depth has any particular bearing on that.
> Clearly, a pure monotone image
> has the highest level of sharpness one can have, and adding more tones, just
> makes things less sharp.
First a quibble with a word, just for the sake of clarity: a monotone image
would be one shade, which has no contrast, so I know you mean a two-tone
image. What you say could be correct, but only if those two tones differ
more in luminosity than the extreme (darkest and lightest) tones of a
multi-tone image, and that is not guaranteed to be the case. IOW, you can
have a two-tone image of two very closely spaced gray values, which would
have a low contrast appearance when compared to an image with 100 tones
which span from white to black.
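A quick worked example (arbitrary 8-bit values, and Michelson contrast is
only one of several possible definitions -- I'm using it purely for
illustration):

# 8-bit gray values: 0 = black, 255 = white. Numbers are made up.
two_tone   = [120, 135]               # two closely spaced grays
multi_tone = list(range(0, 256, 26))  # ten tones from black toward white

def michelson(tones):
    lo, hi = min(tones), max(tones)
    return (hi - lo) / (hi + lo)

print(round(michelson(two_tone), 2))    # 0.06 -- low contrast
print(round(michelson(multi_tone), 2))  # 1.0  -- full-range contrast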
I think what Harvey may be suggesting is that a scanner capable of a wider
dynamic range, by virtue of not being limited by bit depth, can give you a
wider range of clean tones than a low bit depth scanner, and perhaps through
that extended range you get a greater sense of overall contrast. If that is
indeed what he means, I disagree, because it is contrast on a micro level
(contrast between pixels in direct proximity to each other) which gives the
sense of sharpness, more than contrast on a macro level (tones across the
entirety of the image), and I don't know that bit depth or dynamic range has
any direct bearing on micro contrast at all.
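To illustrate what I mean by micro versus macro contrast (made-up pixel
values, and these are only two of many ways one could measure it):

import numpy as np

# A gentle ramp: full tonal range (macro) but tiny neighbour-to-neighbour
# jumps (micro). A hard edge: smaller range, but one abrupt jump.
soft_ramp = np.linspace(0, 255, 64)
hard_edge = np.concatenate([np.full(32, 40.0), np.full(32, 215.0)])

def macro_contrast(row):
    return row.max() - row.min()       # tones across the whole row

def micro_contrast(row):
    return np.abs(np.diff(row)).max()  # largest jump between neighbours

for name, row in (("soft ramp", soft_ramp), ("hard edge", hard_edge)):
    print(name, macro_contrast(row), round(micro_contrast(row), 1))
# soft ramp 255.0 4.0   -- big range, but no crisp edge
# hard edge 175.0 175.0 -- smaller range, yet it looks "sharp"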
> What I have said is that people who sharpen might want to look at the rest
> of the process to find the source of why they sharpen...if the image is
> fuzzy on the film, it'll be fuzzy on the scan. Not the grain, but the
> image.
True.
> Most people don't sharpen grain, they sharpen the image.
Everybody's different. That's what makes the world a richer place. ;-)
Todd