(Please CC me)
I have what must be a very common problem, but I can't seem to solve it myself. I prefer to use Linux if possible.

I want to scan photos to preserve them, so I'd like high-quality files; if practical, they can be as large as necessary to keep all the information. I also want to convert these images into various smaller versions suitable for different purposes, such as quick viewing on a computer screen, and thumbnails. I am having some difficulty at each stage. I have tried xv, ImageMagick, netpbm, and GIMP; GIMP in particular shows the problem of very large but low-quality JPEG files that I describe at the end of this message.

I have access to an HP scanner attached to a machine running NT. The installed program does not reveal what resolution I am scanning at, only the "magnification". When I look at the resulting image with, say, xv, I find that in addition to being larger than I can view on the screen, the image really is scanned at a higher resolution (xv reports the width and height in pixels). (Of course, I would prefer that the file size not balloon just because I ask for higher resolution, but that does not seem to be an option.) By cropping and enlarging, I can see roughly how much "magnification" I need to capture more or less all the information. For a studio photo, I believe I need the maximum "magnification"; even a good print from a cheap camera seems to keep revealing more detail as I increase the scanner resolution to its maximum. The resulting files run from about 70MB to 300MB (24-bit TIFF), depending on the size of the original photo. Am I misreading the results? Do I really need such large files? (My own arithmetic on the file sizes is in the P.S. below.) Does anyone have information, or a link, on what resolution is needed for an optimum scan of a photo? I can't find anything on the web. I do find that these large TIFFs can be compressed by about a factor of 2 to 3 with bzip2.

I also want to convert the scans to something viewable on a screen or printable on cheap paper. JPEG compression appears to be a good choice for photos; I have some basic orientation now on the primary uses of the various compression formats and the differences between file formats and compression algorithms. However, the different programs give very different results when converting to JPEG. I have tried each of xv, ImageMagick, netpbm, and GIMP with several "quality" settings and other parameters (I typically use the "integer" setting); the P.P.S. below shows the kind of commands I mean. I would like a quickly loading JPEG of, say, 50KB to 300KB (I don't really know). But by the time I lower the quality setting enough that the file size is down to 3MB to 5MB, which still takes a long time to load, the image is seriously degraded. It looks much worse than many 20KB JPEGs that I have seen. I suppose I could make multiple scans, some at lower resolution, and convert those instead.

These experiments are time-consuming and I am working somewhat in the dark. Are there web documents or books that discuss how to make high-quality scans of photos, and how to make good-looking lossy images from them for casual viewing? I would appreciate any information.

Thanks!
John
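
P.S. Here is my own back-of-the-envelope arithmetic on the raw file sizes, in case I am misreading something. An uncompressed 24-bit scan takes 3 bytes per pixel, and the pixel dimensions are just the print dimensions multiplied by the scan resolution. If I assume the scanner is running at 1200 dpi at maximum "magnification" (only a guess on my part, since the software shows nothing but "magnification"):

    4 x 6 inch print:   (4*1200) x (6*1200)  =  4800 x  7200 pixels x 3 bytes = ~104 MB
    8 x 10 inch print:  (8*1200) x (10*1200) =  9600 x 12000 pixels x 3 bytes = ~346 MB

That would at least be consistent with the 70MB to 300MB TIFFs I am getting, but I would be happy to learn that the assumption is wrong.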
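
P.P.S. To make the conversion question concrete, these are the kinds of commands I have been experimenting with (the filenames and settings here are only illustrative, not exactly what I ran):

    # ImageMagick: scale down to roughly screen size and recompress as JPEG
    convert scan.tif -resize 1024x1024 -quality 85 scan-screen.jpg

    # ImageMagick: small thumbnail
    convert scan.tif -thumbnail 160x160 scan-thumb.jpg

    # netpbm + libjpeg: the same idea as a pipeline
    tifftopnm scan.tif | pnmscale -xsize 1024 | cjpeg -quality 85 > scan-screen.jpg

What I can't tell is which choices matter most for getting a small but decent-looking JPEG: scaling the image down before compressing, the "quality" setting itself, or the program doing the conversion.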