OK… let me just start by saying that, other than to the 0.00001% of you out there who will get this, I’m sorry. I’m soooo sorry. You can skip this one. But this has been irking me. In fact it’s been nagging at me for months now. I knew it could be figured out simply, but I just never got around to it. I know that your eye can really only see so much quality. I’ve gone over this a little before, but it’s come up a few times at work, and I’ve mentioned that there was a formula to figure it out. Except I didn’t know what it was. And every time I’ve searched for it I’ve come up empty. So I did what any self-respecting nerd would do and figured it out. Here’s basically the simplified version of what I got (all measured in inches):
52.35 / (distance to object × 0.01745) ≤ (square root of (image pixel width² + image pixel height²)) / image diagonal in inches ≤ 83.2365 / (distance to object × 0.01745)
So what does that actually mean? Well let me start at the beginning.
This all came about because we were trying to figure out an image for a billboard. Billboards sit quite a ways away, and in reality chances are good you’ll never see one up close the way you would a poster, an ad in a magazine, or anything on TV. As such it could probably be of really, really low quality (and I’m talking really low here) and no one would ever know, because from that far away they’d never be able to tell the difference. Only when you’re standing up close to it would you ever notice anything was off. All because our eyes don’t work quite the way most people think they do.
A person’s eye doesn’t see in a resolution per se. We measure everything these days in 1080p, or monitor resolutions, or DPI (or PPI, which is basically the same thing)… but in reality that doesn’t actually matter to our eyes. At different distances our eyes can make out less and less information. So basically, if something is up close it needs a much higher resolution for us not to see all the little bits and pieces that make it up. The further away something is, the lower quality it can be without us noticing.
What this means is, as made “famous” by Steve Jobs when Apple introduced the Retina iPhone, roughly anything over 300 PPI held about 10 – 12 inches from the eye becomes extraneous. There’s been some debate over this, since not everyone has 20/20 vision and the numbers are a little loose, but for all intents and purposes (or at least good enough for me) it works. (At the end I’ll do the “perfect vision” math.)
So we have a baseline for figuring out the minimum pixels per inch needed at a specific distance for an image to resolve into something non-pixelated for the average human eye. Well, kinda. Apparently the formula isn’t quite that straightforward. A perfect eye has a resolution of about 0.6 arc minutes, but the average eye is more like 1 arc minute. The formula works in pixels per degree of visual angle: multiply the distance to the image by the pixels per inch, then by twice the tangent of π/360 (π/360 being half a degree in radians). To simplify things (and lose a little accuracy… but again, close enough for practical work), 2 × tan(π/360) can be approximated as π/180, or roughly 0.01745.
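As a quick sanity check on that simplification, here’s a minimal Python sketch (nothing in it beyond the math in the paragraph above):

```python
import math

# One degree of visual angle at distance d spans d * 2 * tan(pi/360)
# inches: two half-degree right triangles back to back.
exact = 2 * math.tan(math.pi / 360)

# The simplified version: one degree in radians, pi/180.
approx = math.pi / 180

print(round(exact, 5), round(approx, 5))  # both ~0.01745
```

The two constants agree to about five decimal places, which is why the 0.01745 shortcut is close enough for practical work.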
OK. So the distance to an image times 0.01745 tells you how many inches one degree of visual angle covers at that distance. Multiply that by the image’s PPI and you get pixels per degree, and Apple’s baseline of 300 PPI at 10 inches works out to 300 × 10 × 0.01745 = 52.35 pixels per degree for the average eye. Since this is all about figuring out if an image is of high enough quality, and we are using PPI as the standard, you can rewrite this as 52.35 / (distance to object in inches × 0.01745)… aka part one of the formula.
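Part one drops neatly into a small Python function (the function name is mine; the 52.35 constant is the average-eye pixels-per-degree threshold from above):

```python
import math

DEG = math.pi / 180  # ~0.01745: inches per degree, per inch of distance

def needed_ppi(distance_in, pixels_per_degree=52.35):
    """Minimum PPI at `distance_in` inches before the average eye starts
    resolving individual pixels. Pass 83.2365 for the 'perfect eye' case."""
    return pixels_per_degree / (distance_in * DEG)

print(round(needed_ppi(10)))  # 300, matching the Retina iPhone claim
```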
OK, so we know what PPI is needed. Now to work out the PPI of the actual image. This one is a bit more straightforward, but it still needs math. To figure out the image resolution, you first need the diagonal resolution: the square root of the width squared plus the height squared. Then take that and divide it by the diagonal measurement (in inches, since that’s what I’ve done everything in so far).
Or more simply put: (square root of (image pixel width squared + image pixel height squared)) / image measured diagonally in inches
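In code that’s one line (again Python; the function name is mine):

```python
import math

def image_ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of an image or screen, from its pixel
    dimensions and its diagonal measured in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(image_ppi(2048, 1536, 9.7), 1))  # ~263.9 (3rd-gen iPad)
```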
And that’s it for the average person. You just need to make sure the PPI of the image is the same or greater than the needed PPI based on the distance to the object.
But what’s this? You’ve got perfect vision? Well, here’s where that last bit of the formula comes in. The numbers change a bit but the formula mostly holds. Just one number changes: the 52.35 becomes 83.2365. (That comes from the “perfect vision” figure of 477 PPI at 10 inches: 477 × 10 × 0.01745 = 83.2365.) All the same math applies.
The reason this comes at the end is that most people are going to fall somewhere between “normal” and “perfect”. Anything you design should also fall between these two values, or at least use them as baselines. In a much simplified version:
Normal vision PPI ≤ your image’s PPI ≤ Perfect PPI
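Putting both halves together, here is a sketch of the whole check in Python (the constants are the two pixels-per-degree thresholds derived above; the function names are mine):

```python
import math

DEG = math.pi / 180    # ~0.01745
NORMAL_PPD = 52.35     # average-eye threshold, pixels per degree
PERFECT_PPD = 83.2365  # perfect-vision threshold

def ppi_range(distance_in):
    """(normal, perfect) PPI needed at a viewing distance in inches."""
    return (NORMAL_PPD / (distance_in * DEG),
            PERFECT_PPD / (distance_in * DEG))

def sharp_enough(width_px, height_px, diagonal_in, distance_in):
    """True if the image's PPI meets at least the normal-vision bar."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    return ppi >= ppi_range(distance_in)[0]

print(sharp_enough(2048, 1536, 9.7, 15))  # the iPad at 15 inches: True
```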
Ok so some “real world” examples:
- The iPhone 4 has a DPI of 326. Using the formulas, to be something the human eye can’t make out from about 10″ away it would need to be between 300 DPI (average) and 477 DPI (perfect). So it clears the average-vision bar, though not the perfect-vision one.
- The iPads (the 3rd & 4th gen ones) have a resolution of 2048 x 1536 with a 9.7″ screen, which works out to 264 DPI (OK, actually 263.9, but close enough). Held at 15″ from your eye, the range is 200 – 318. Again, it fits the range expected for a “retina” level device.
- My TV (and pretty much most TVs these days) is 1920 x 1080, or more commonly referred to as 1080p. Each TV will vary in DPI as the size goes up while the resolution stays the same. My TV is 50″ (diagonally), so the math says it has a DPI of 44.06. My couch is 16 feet (192 inches) from the TV, so from there I can only resolve between 15.6 (average) and 24.8 (perfect) PPI. Not even close to the 1080p my TV is pushing out. (If curious, the other post has a way to punch in your TV size and choose 1080p or 720p to see your custom distance.)
- From what the internet tells me, the average billboard design tends to be between 20 and 30 DPI and is seen at a distance of over 50 feet. At 50 feet it would only need to be between 5 and 8 DPI to resolve. Apparently we like our billboards well above what our eyes can actually see.
- Most laptop screens people use these days are 1440 x 900 on a 15″ (diagonal) screen, which works out to approximately 113 DPI. The average distance to a laptop screen tends to be about 18″, so the range is 166 – 265. In this case, not “retina”. However, Apple’s 15″ Retina screens run 2880 x 1800, which puts the PPI at about 220, comfortably inside the range.
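To tie the examples off, the same few lines of Python reproduce a couple of the numbers above (same constants as before, nothing new assumed):

```python
import math

DEG = math.pi / 180

def needed_ppi(distance_in, pixels_per_degree):
    # PPI required at a given viewing distance (inches)
    return pixels_per_degree / (distance_in * DEG)

# Billboard seen from 50 feet (600 inches): ~5 to ~8 DPI is all you need.
print(round(needed_ppi(600, 52.35)), round(needed_ppi(600, 83.2365)))

# 50-inch 1080p TV from 16 feet (192 inches): the TV's ~44 PPI is well
# beyond the ~15.6 to ~24.8 PPI the eye can use at that distance.
print(round(math.hypot(1920, 1080) / 50, 2))
print(round(needed_ppi(192, 52.35), 1), round(needed_ppi(192, 83.2365), 1))
```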
So what does this mean? Well, for anyone that’s actually read this far: first, congratulations. And second, it means you should now have some basic idea of what resolution you actually need for your images, so you don’t waste time and effort making something higher quality than it will ever be seen at.
If you’re interested in a spreadsheet version of the formula, click this for the PPI Excel file.
Also, here’s where I got most of my information to piece this all together:
- http://ipad.about.com/b/2012/03/08/apple-redefines-retina-display-for-the-ipad.htm
- http://en.wikipedia.org/wiki/Retina_Display
- http://blogs.discovermagazine.com/badastronomy/2010/06/10/resolving-the-iphone-resolution
- http://graphicdesign.stackexchange.com/questions/3651/what-dpi-ppi-should-a-4×6-meter-outdoor-billboard-be-done-at
- http://www.apple.com/macbook-pro/specs-retina/
- http://en.wikipedia.org/wiki/Display_resolution
- http://en.wikipedia.org/wiki/Pixel_density
Note: If my math is off, please oh please tell me and I’ll fix it. This is what I’ve managed to put together from different sources, and I’m definitely not a mathematician, just a geeky guy who is trying to figure out this issue. I try to get my sources right, and my math close enough for practical purposes. Helpful criticism is always appreciated.