The Tao of Resolution
crhoadhouse (98 pencils) | Sun, 2006-01-29 21:07
Short version:
For print work you want your images at 300 dpi (dots per inch).
Lineart: 1200 dpi.
For web design you want to look at the pixel dimensions.
Long version: Are you seated comfortably? Then let's begin...
The topic of resolution has long been a stumbling block for many designers. "How do I know my image will look good on such and such device?", "At what resolution should I scan a line art image to prevent jaggies?", "Why is your office so messy?" are questions I hear over and over from my clients and students. So let's have at it.
Whenever you scan an image, take a digital photo, or create artwork in Photoshop, you're using a grid of pixels to represent an image.
Open an image in Photoshop and zoom in really close, and you will see that the image is made up of individual cells called pixels. Viewed as a whole, these pixels make up your image. Simple enough, right?
The overall quality of the image depends on the device it is viewed on or output from. An image viewed onscreen can look perfectly acceptable, but when the same image is printed on a color printer the results are disappointing. The individual pixels that make up the image are easily seen, producing the stair-stepped edges often referred to as "jaggies". Devices such as printers have a higher output resolution than monitors (1200–4800 dpi vs. 72–96 ppi for monitors), making the individual pixels easier to see.
So how do we guarantee that the image that looks so pretty on screen will look good when we print it? We need to understand what happens when we print an image. As I mentioned before, your image is made up of pixels. The printing device has an imaging resolution: how many spots of toner or ink it can pack into an inch. Common office laser printers vary from 300 to 1200 dpi; inkjets from 1200 to 5000+ dpi; and imagesetters and platesetters (used in the offset printing process) from 1200 to 5000+ dpi.
Does that mean we need an image with a resolution of 5000 dpi to have it look good on one of these devices? No; the output resolution of printing devices doesn't relate one to one with image pixels. Printing is a binary operation. By that I mean that a b/w laser printer, for example, only has black toner. It doesn't have any shades of gray. It must fool the eye into thinking there are shades of gray by using the white of the paper and the black of the toner, a process called "halftoning". If you look really close at an image in a newspaper you will see it is made up of little dots that vary in size. Light areas of the image have small dots, allowing more of the paper to show through, and dark areas have bigger dots, showing more of the black toner. Our eyes are easily tricked, so at a normal viewing distance those halftone dots and the paper blend together to give the appearance of a grayscale image. For color printing most printers use cyan, magenta, yellow and black halftones to give the appearance of a full spectrum of color. A cyan dot that is close to a yellow dot blurs in our vision to green, and so forth.
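The varying-dot-size trick is hard to show in plain text, but a close digital cousin, ordered dithering, is easy to sketch. This is a minimal illustration, not how a real printer RIP works: it varies the count of black pixels per cell rather than the size of a dot, but the principle is the same, trading gray levels for a spatial pattern of pure black and white.

```python
# Simplified digital analogue of halftoning: 2x2 ordered (Bayer) dithering.
# Darker grays trip more of the per-cell thresholds, so they get more
# black pixels, just as a darker area gets bigger halftone dots.

BAYER_2X2 = [[0, 2],
             [3, 1]]

def dither(gray_rows):
    """Convert a grid of 0-255 gray values to '#' (black) / '.' (white)."""
    out = []
    for y, row in enumerate(gray_rows):
        line = ""
        for x, gray in enumerate(row):
            # Spread the four thresholds evenly across 0-255.
            threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) / 4 * 256
            line += "#" if gray < threshold else "."
        out.append(line)
    return out

# A dark patch (gray 64) comes out mostly black pixels;
# a light patch (gray 192) comes out mostly white.
dark = dither([[64] * 4] * 4)
light = dither([[192] * 4] * 4)
```

At normal size, squinting at the `#` and `.` pattern blurs it back into an apparent shade of gray, which is exactly the trick halftoning plays on the eye.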
When it is sent to a printing device the image is halftoned. Halftones come in various densities. For newsprint the halftone will be fairly coarse, say around 65 to 85 lpi. LPI stands for "lines per inch": basically how many rows of halftone dots are used per inch to represent the image. Magazines and books use a higher lpi, from 133 lpi all the way up to 300 lpi. The paper stock being printed on dictates what lpi you can use. Newspapers, for example, use cheap pulp paper, and ink tends to absorb into it and spread. So if you used a high lpi the halftone dots would bleed into each other and make a dark image with little detail.
In general, to ensure that your image will not look pixelated when printed you want an image resolution that is twice the line screen (dpi = 2 × lpi). This rule of thumb derives from the Nyquist theorem. Wikipedia states it thus: "When sampling a bandlimited signal (e.g., through analog to digital conversion) the sampling frequency must be greater than twice the signal's bandwidth in order to be able to reconstruct the original perfectly from the sampled version." So when scanning (digitizing) a photo you want to sample it at twice the frequency of the line screen to ensure that it will reproduce faithfully.
So, for example, you have an image that is going to be reproduced in a magazine at 150 lpi. Using the formula you determine that the image should be scanned at 300 dpi. If it was going to be in a newspaper at 85 lpi you could scan it at 170 dpi (170 = 2 × 85). Too little resolution in your images is noticeable, but having more will not negatively impact the quality of your image reproduction. In other words, a 300 dpi image printed at 85 lpi won't look any different than a 170 dpi image printed at the same lpi. The main downside to having an image at a higher resolution than needed is that the file size will be larger. This could be an issue if you have many images making up a layout, but in general I feel it's better to have too much than too little.
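The rule of thumb above boils down to a one-liner. The function name here is just for illustration:

```python
# Rule of thumb from the post: scan resolution = 2 x line screen (Nyquist).
def scan_dpi(lpi):
    """Minimum scan resolution (dpi) for a given halftone line screen (lpi)."""
    return 2 * lpi

magazine_dpi = scan_dpi(150)   # 150 lpi magazine -> 300 dpi scan
newspaper_dpi = scan_dpi(85)   # 85 lpi newspaper -> 170 dpi scan
```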
We now have grayscale and color images covered as far as print output. Lineart is a different animal. Lineart simply refers to any image that is strictly black and white, such as a scan of a signature, a comic strip, a pen drawing, etc. These images need to be at a higher resolution. Why? Because there are no gray tones, the printer won't create a halftone pattern. Halftones not only fool our eyes into seeing tones that aren't there, they also make the image appear smoother. This has to do with antialiased edges, which is a whole other screed that I'll save for another post.
To get the best quality out of a lineart image we want to scan or create it at a resolution of 1200 dpi. This matches the resolution of most midrange output devices. We could for example make the lineart 5000 dpi for output on a high resolution device, but our eyes wouldn't see any difference between the 5000 and the 1200 dpi image.
Computer monitors have a low spatial resolution (the number of pixels packed into an inch) but a high tonal resolution (how many colors each screen pixel can represent). The low spatial resolution tends to blur the pixels together, so we needn't worry about high image resolutions to get proper reproduction. The high tonal resolution means that we don't need halftoning to fool the eye into thinking the color is there. Each pixel on a computer monitor can display any of 16.7 million colors (24-bit color).
When creating images for use on screen, the old rule of thumb was to make them 72 dpi. This stemmed from old Macintosh 14'' monitors that had a fixed resolution of 72 pixels per inch. Now we have monitors and video cards that can display higher pixel densities, such as 120 ppi. So what I would look at is the pixel dimensions, and use Photoshop's "Save for Web" feature to preview the images in a web browser to make sure they are the proper size for your particular need.
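Thinking in pixel dimensions rather than dpi is easy to sketch. `pixel_dimensions` is a hypothetical helper, not a Photoshop feature; it just shows why the same physical size needs different pixel counts on displays of different density:

```python
# Hypothetical helper: pixel dimensions needed for an image to appear at a
# given physical size on a display of a given pixel density (ppi).
def pixel_dimensions(width_in, height_in, ppi):
    """Return (width_px, height_px) for a target physical size and density."""
    return (round(width_in * ppi), round(height_in * ppi))

# The same 4 x 3 inch image on a classic 72 ppi display vs. a 120 ppi display:
classic = pixel_dimensions(4, 3, 72)    # old-Mac-era density
dense = pixel_dimensions(4, 3, 120)     # higher-density modern display
```

The physical inches drop out of web work entirely; what the browser actually sees is the pixel count, which is why previewing at final pixel dimensions beats chasing a dpi number.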
Hopefully this has shed a little light on the mysteries of resolution. Oh yeah, and my office is messy because I'm a slob. Thank you and good night.