Um, what info are you looking for exactly? I don't know a lot about optical scanners, but I can tell you the basics of what I know, and if you need anything more in-depth, I may know that too.

Anyhow, the scanner builds a grid, and each box in the grid gets a value — for a 24-bit scan, that's 24 bits per box. Once all the boxes are filled in (with all 0s in 24-bit RGB being pure black and all 1s being pure white), it saves the result as a .bmp file.

The CCD is a light-intensity detector that measures how bright each point of the image is. Resolution is how dense that bitmap is (as in, how many boxes total in the grid); bit depth is how many bits are used per box/pixel.

You've got to remember that every color reflects light at a certain wavelength and intensity. The CCD picks these up, and from that the scanner can determine what bit pattern the pixel needs in order to represent that color. Quality usually depends on the quality of the CCD's light receptors and on how many bits the scanner uses — the higher the bit count, the higher the color count. Algorithms in the scanner's chip logic then interpret the CCD receptor data into a displayable image.
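To make the "24 bits per box" and resolution-vs-bit-depth ideas concrete, here's a minimal Python sketch. The function names (`pack_rgb`, `bitmap_size_bytes`) are just illustrative, not any real scanner API — it only shows how three 8-bit channels pack into one 24-bit pixel, and how grid size and bit depth together determine the raw data size:

```python
def pack_rgb(r, g, b):
    """Pack three 8-bit channels (0-255 each) into one 24-bit pixel value."""
    return (r << 16) | (g << 8) | b

def bitmap_size_bytes(width, height, bit_depth=24):
    """Raw (uncompressed) pixel-data size for a width x height grid."""
    return width * height * bit_depth // 8

black = pack_rgb(0, 0, 0)        # all 0s -> pure black
white = pack_rgb(255, 255, 255)  # all 1s -> pure white (0xFFFFFF)

# A 100 x 100 grid at 24-bit depth needs 30,000 bytes of pixel data
size = bitmap_size_bytes(100, 100)
```

Doubling the resolution (boxes per inch) quadruples the pixel count, while raising bit depth from 24 to 48 bits only doubles the size — which is roughly why high-resolution scans eat disk space so fast.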
If you argue with an idiot he will drag you down to his level and beat you with experience.
I am not a fast writer.
I am not a slow writer.
I am a half-fast writer.