Filmscanners mailing list archive (filmscanners@halftone.co.uk)
[filmscanners] Re: Understanding dpi
Yikes... someone asks a relatively simple question, and the theory book
gets thrown at them. If I were Bill, who originally posed this
question, I would have run out of here with blood coming out of my eyes.
We sometimes say that not having enough knowledge is a dangerous
thing, but heck, sometimes having too much knowledge is deadlier! ;-)
Brad, you've got it, you understand it, and all the rest of this stuff
being tossed at you right now is minutiae and hair-splitting over
terminology. Some people I know have a much less civil term for it.
The unfortunate problem is the simple and easily misused term "dots".
As noted, it is often interchanged with terms like pixels, or locations,
or samples, or values, or...
People who have a strong printing background, especially those who come
from an offset or web press upbringing, get very bent out of shape when
someone brandishes the word "dot", because in printing a "dot" can mean a
whole set of things: a dot of one color of ink, a dot made up of several
elements in a screen or matrix, a rosette, etc., etc.
Since I see no way that people who are relatively new to scanning and
printing are going to fully understand or change their nomenclature to
be more accurate, I would ask that we agree here: when discussing
scanning (which really has NO dots anyway), if someone uses the terms
dots or dpi, they don't mean dots in one of the many printing senses;
they mean a pixel, or element, or sample point. And be done with it.
If, on the other hand, they are specifically speaking of printed dots,
in reference to the larger resolutions of things like inkjet
printers (which claim 2400 or 5600 or whatever "dpi"), then the
discussion can, and maybe should, go into the differences between a
colored dot of ink and the resolutions involved in creating the
perception of color that might be contained in something like a "cell".
In the scanning field, I see no advantage in muddying the waters with
explanations of screens, cells, and halftoning just because someone
perhaps incorrectly used the ubiquitous abbreviation "dpi" when they
meant ppi, for example. Jeez, talk about confounding the issue.
If we can all agree that, although it is an incorrect usage (one that no
one is EVER going to remedy), when someone refers to dpi and a scanner
they mean ppi, then all of this unnecessary confusion could be avoided.
So, yes, Brad: assuming the two scanners have the same basic optics
and electronics, and none of those factors limits the recording of
the resolution of the higher-resolution unit, then your
understanding of the basic theory behind scanning resolution, and how
it would manifest in the output within a printing environment, is valid.
In making this statement, I have neglected several factors, some of
which I will now add so someone doesn't give me grief:
1) sampling accuracy lessens by more than the basic assumption suggests
when one is sampling irregularly shaped, positioned, and sized objects
like grain or dye clouds (oh please, let us not get into what's grain
and what's dye clouds again!). This is often discussed as grain aliasing.
Basically, this is because the larger the area being sampled "under" any
one "recording element", the more the information is "averaged", since
the sensor is unable to resolve differences of shape and color when they
all fall under that one sensor element. Each sensor element is a one-pixel
"eye". It can only "see" the information under it as one color
(made up of the average R, G, and B values) and will report it as a square
regardless of the shapes under it (assuming square sensor elements).
I could show you this more clearly if I could do it graphically.
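Failing graphics, here is a rough sketch in Python of what I mean (a
hypothetical illustration with made-up values, not any scanner's actual
processing): several very differently colored sub-areas all fall under
one sensor element, and the element can only report their average.

```python
import numpy as np

# Hypothetical 4x4 patch of "film" whose sub-areas are strongly colored
# (think grain or dye clouds), all lying under ONE square sensor element.
# Each triple is an RGB value.
patch = np.array([
    [[255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 0]],
    [[0, 0, 255], [255, 0, 0], [255, 255, 0], [0, 255, 0]],
    [[0, 255, 0], [255, 255, 0], [255, 0, 0], [0, 0, 255]],
    [[255, 255, 0], [0, 0, 255], [0, 255, 0], [255, 0, 0]],
], dtype=float)

# The one-pixel "eye" reports a single averaged RGB value for the whole
# patch; the shapes and positions of the sub-areas are lost entirely.
reported = patch.reshape(-1, 3).mean(axis=0)
print(reported)  # one muddy average: [127.5, 127.5, 63.75]
```

However wildly the colors vary under the element, the output is always
that single square of one averaged color.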
2) how much of the area of the image is actually sampled under
each element matters as well. If I had an area one cm square, and
each sensor element were exactly 1 mm x 1 mm with no border, a full 100
samples of the area would give me 100 "average readings" covering the
full area.
If I have a one cm square, and each sensor element is only 0.1 mm x 0.1 mm,
and it takes the same 100 samples evenly spaced over the one cm
square, I have only sampled 1/100th of the total area. Although each
sample itself is a more accurate rendition of the R, G, B values directly
under it (much less averaging), the process has completely ignored 99% of
the total surface area, and the accuracy of the sampled readings
relative to the actual surface may be quite poor.
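A rough numerical sketch of that coverage difference (the grid sizes
follow the one-cm-square example above; the random "surface" is my
assumption, standing in for real film detail):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1 cm x 1 cm surface discretized at 0.1 mm resolution
# (100 x 100 sub-areas), random brightness standing in for grain.
surface = rng.random((100, 100))

# Case 1: 100 sensor elements of 1 mm x 1 mm with no gaps. Each reading
# averages a 10x10 block, and together they cover the whole surface.
full_cover = surface.reshape(10, 10, 10, 10).mean(axis=(1, 3))

# Case 2: 100 tiny 0.1 mm x 0.1 mm elements on the same 1 mm grid. Each
# reads one sub-area exactly, but 99% of the surface is never measured.
sparse = surface[::10, ::10]

# The full-coverage readings average out to the true surface mean; the
# sparse point samples can drift from it, since most area was skipped.
print(full_cover.mean(), sparse.mean(), surface.mean())
```

Each individual sparse reading is more "accurate" for its tiny spot, yet
the set of readings can misrepresent the surface as a whole.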
If you use an image manipulation program like Photoshop, select a
one-pixel eyedropper, and randomly sample points within an
area that appears gray at screen resolution, each sample may prove very
different from the "gray" you are seeing. You may end up with red, or
green, or blue, or yellow, or anything in between; although accurate for
that exact pixel location, it gives a very poor rendition of the average
gray you are seeing.
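The point-sample versus area-average distinction is easy to demonstrate
(a hypothetical sketch with random pixels, not Photoshop's actual
sampling code):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "gray-looking" region: each pixel is a strongly colored
# random RGB value, but at screen resolution they blend toward gray.
region = rng.integers(0, 256, size=(64, 64, 3))

area_average = region.reshape(-1, 3).mean(axis=0)  # what the eye sees
point_sample = region[10, 20]                      # 1-pixel eyedropper

print(area_average)  # close to neutral gray, around [128, 128, 128]
print(point_sample)  # could be anything: red, green, blue, yellow, ...
```

The single-pixel reading is perfectly accurate for that one location and
still tells you almost nothing about the gray the area presents overall.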
So, yes, there are surrounding issues, and no, the answer is rarely as
cut and dried as it may appear. But as Einstein allegedly said: if you
can't explain the concept to a child, you probably don't fully understand
it yourself (believe me, I don't, but I try...).
I need some air ;-)
Art
----------------------------------------------------------------------------------------
Unsubscribe by mail to listserver@halftone.co.uk, with 'unsubscribe
filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title or
body