What is the difference between DPI (dots per inch) and PPI (pixels per inch)?


There are lots of other questions on Graphic Design which partly cover this, e.g. "What DPI should be used for what situations?"

However, I have become frustrated at the number of questions and answers which confuse the two terms. I think understanding the difference is important.

So here's a place to answer this question well and clear up the confusion!

4/13/2017 12:46:00 PM

A pixel (the word was originally coined, iirc, by IBM and derives from "picture element") is the smallest indivisible unit of information in a digital image. Pixels may be displayed or printed, but you can't divide them into smaller pieces to get more information. The number of channels and bits per channel determines how subtle the information in a pixel can be, but the basic fact is that 1 pixel is the smallest increment of information in an image. If you do video, you know that pixels don't have to be square -- they are non-square in all older video formats. Square or not, a pixel is still the smallest unit of a picture.

An inch (okay, so you know this already -- bear with me) is a unit of linear measurement on a surface, which could be a screen or a piece of paper.

A dot is, well, a dot. It can be a dot on a screen, or it can be a dot produced by a printhead. Like pixels, dots are atomic. They're either there, or they're not. How much fine detail a screen can display depends on how close the dots are (what they used to call "dot pitch" in the old CRT days). How small the dots are from an inkjet, a laser printer or an imagesetter determines how much fine detail it can reproduce.

Dots per inch is fairly easy. A screen has so many dots (each comprising R, G and B elements) per inch of screen. It's the same on paper. A 1200 dpi printer can lay down 1200 dots in one linear inch. In describing screen detail or printer output, dots per inch is the correct term.

PPI is where the confusion comes in. An image has so many pixels. Its metadata contains an output size in inches, cm, mm, M&Ms, whatever. The "per inch" comes from dividing the width in pixels by the output width stated in the metadata. So the same image with different metadata may be 72 ppi, 150 ppi or 8000 ppi. The image information is the same; all that's changed is the metadata.
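The arithmetic can be sketched in a couple of lines (the pixel width and metadata widths below are illustrative numbers, not any particular image):

```python
# Sketch: ppi is derived from pixel width and the metadata's physical
# size -- it is not a property of the pixels themselves.

def ppi(width_px: int, metadata_width_inches: float) -> float:
    """Pixels per inch implied by the image's physical-size metadata."""
    return width_px / metadata_width_inches

# The same 1440-pixel-wide image with three different metadata sizes:
print(ppi(1440, 20.0))   # 72.0 ppi
print(ppi(1440, 9.6))    # ~150 ppi
print(ppi(1440, 0.18))   # ~8000 ppi
```

Only the second argument (the metadata) changes between calls; the 1440 pixels never do.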

A quick and easy demo that somewhat illustrates the point is to make some marks on a piece of elastic, say five to an inch. Stretch the elastic to twice its length. The number of marks hasn't changed, even though the "marks per inch" is now 2.5.

You can see this in Photoshop if you turn off Resample Image and change the size. The ppi value changes to reflect how small the pixels must be reproduced in order to hit the measurement value in inches/cm/mm etc. Note that in this case the Pixels fields are disabled. You can't change those values unless you resample.
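What Photoshop does with Resample Image off can be modeled as a tiny function (a hypothetical helper, not Photoshop's actual code): the pixel count is returned unchanged, and only the implied ppi is recomputed.

```python
# Sketch of "Resample Image off": changing the print size only changes
# the implied ppi; the pixel dimensions stay locked.

def resize_without_resample(width_px: int, new_width_inches: float):
    new_ppi = width_px / new_width_inches
    return width_px, new_ppi   # pixel count unchanged, ppi recomputed

px, density = resize_without_resample(3000, 10.0)
print(px, density)   # 3000 pixels, 300.0 ppi
px, density = resize_without_resample(3000, 20.0)
print(px, density)   # still 3000 pixels, now 150.0 ppi
```

Doubling the print width halves the ppi, exactly like stretching the elastic.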

Mass confusion set in when image pixels were mapped to screen dots in web browsers. A 200 pixel image shows up as 200 pixels in a browser. How large it is, measured with a ruler, depends on the dots per inch of the screen. The image metadata might say it's 200 ppi or 72 ppi or 1 ppi; it will still occupy exactly 200 screen dots. The world remains fixated on "72 ppi for the web," so the question "what's the right resolution for web images?" keeps coming up, and the correct answer, "it doesn't matter," keeps being supplied ad nauseam.
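The "it doesn't matter" claim can be made concrete with a sketch (assuming, for illustration, a 96-dot-per-inch screen; the function and its names are made up for this demo):

```python
# Sketch: a browser maps image pixels to screen dots directly; the
# ppi value in the image metadata never enters the calculation.

def physical_width_inches(image_px: int, screen_dpi: float,
                          metadata_ppi: float) -> float:
    # metadata_ppi is deliberately ignored -- browsers don't use it
    return image_px / screen_dpi

for claimed_ppi in (1, 72, 200):
    print(physical_width_inches(200, 96.0, claimed_ppi))  # ~2.08 in every time
```

Whatever ppi the metadata claims, the 200-pixel image lands on the same 200 screen dots.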

If you're still with me, there's one last step that brings the two together.

A 720-pixel-wide image at 10 physical inches wide has a resolution of 72 pixels per inch. If you print it on a 1200 dpi printer, there will be 1200 dots per inch on the paper, but the image is still 72 pixels per inch. That's why it looks like crap. On the other hand, a 7200-pixel-wide image printed at 1 inch wide will exceed the resolution of our 1200 dpi printer. Photoshop (let's say) and the printer driver decide which pixels to throw away and which to actually print. Some of the printed dots will be averaged among adjacent image pixels, but, regardless, some of the image information has to be thrown away. The output will be 1200 dpi, but the effective resolution of the printed image will have been reduced to at most 1200 ppi by the software.
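Both printing scenarios reduce to one rule of thumb, which can be sketched as a minimal model (ignoring halftoning and driver details, which the real pipeline adds on top):

```python
# Sketch: the effective resolution of a printed image is capped by both
# the image's pixel density at the chosen size and the printer's dpi.

def effective_ppi(width_px: int, print_width_inches: float,
                  printer_dpi: float) -> float:
    image_ppi = width_px / print_width_inches
    return min(image_ppi, printer_dpi)

print(effective_ppi(720, 10.0, 1200.0))   # 72.0  -- the printer can't add detail
print(effective_ppi(7200, 1.0, 1200.0))   # 1200.0 -- excess pixels get discarded
```

In the first case the image is the bottleneck; in the second, the printer is.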

2/21/2012 1:04:00 AM

Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow