Image to print confusion

Hi all... Noob here and this is probably a stupid question but.

If 12 megapixels is 4000 x 3000 pixels and a 4k TV is 3840 x 2160 pixels (about 8 megapixels) and the image on my 42" 4k TV is razor sharp, why can I not enlarge a 12 megapixel photo to a print the size of my TV?

Thanks

Regards
 
Hi and welcome to TP

Regarding your question, this is my understanding, so FWIW:

You are confusing the pixel dimensions of a TV with the pixel requirements of a print.

If you want a print @300dpi (this is the optimal print resolution for many printers) you will need a considerably higher pixel count!

I have no idea (without calculating it) what the width x height in inches of a 42-inch TV is, but a print with a 42" longest dimension would need 12,600 pixels along that edge to print @300dpi.
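For anyone who wants to check the arithmetic, a quick Python sketch (the 42" and 300dpi figures are just examples, not anything printer-specific):

    # Pixels needed along one edge to print at a target resolution
    def pixels_needed(print_inches, ppi=300):
        return print_inches * ppi

    print(pixels_needed(42))        # 12600 pixels for a 42" edge at 300 ppi
    print(pixels_needed(42, 150))   # 6300 pixels if 150 ppi is acceptable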

HTH:)

PS I use Gigapixel AI to enlarge my files as/if required to create my prints.
 
Welcome to the site. :)

Just had a look, and the dimensions of a 42" TV screen are 38.1 inches x 24.9 inches. Interestingly, they say the ideal viewing distance for a 42-inch TV is around 6 to 7 feet. That's possibly further than most people may view a print of that size from, dependent on placement. It says the pixels per inch of a 4k 42" TV is 104.9, far less than the usual 300 dpi printing resolution.

On large prints/screens, mostly viewed from a distance, lowering the dpi is not unusual. Dividing 4000 by 38 gives 105.2, so very similar to the 42" screen resolution. You could print a 12Mp image of 38.1 inches x 24.9 inches at 105 dpi, and side by side with the TV at 6 to 7 feet, the images should look pretty similar, for image sharpness anyway.
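If it helps, the same sums in a quick Python sketch; this assumes a true 16:9 panel and uses the active screen area rather than the overall TV dimensions, which is why the figures come out slightly smaller than those quoted above:

    import math

    # Physical size and pixel density of a 16:9 panel from its diagonal
    def panel_ppi(diagonal_in, h_pixels=3840, v_pixels=2160):
        aspect = h_pixels / v_pixels                        # 16:9 for a 4k TV
        height = diagonal_in / math.sqrt(1 + aspect ** 2)   # ~20.6" for a 42" panel
        width = height * aspect                             # ~36.6"
        return width, height, h_pixels / width

    width, height, ppi = panel_ppi(42)
    print(round(width, 1), round(height, 1), round(ppi, 1))  # ~36.6 x 20.6 at ~104.9 ppi

    # A 12Mp (4000px wide) image printed at the same width lands in the same ballpark
    print(round(4000 / width, 1))                            # ~109 ppi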

The TV should be the brighter and more saturated image, because it is a backlit image, compared to a reflective print, unless adjustments are made to compensate.

If I have any of the above wrong, someone please correct me. :)
 
That's pretty much exactly how I figured it and came to the conclusion that it's down to viewing distance... With that logic, to get to 300+ dpi I'd need 36+ mp which could then be viewed at 2ft, right? Or, 48mp at 1.5ft right?

Sooooo with my little Mini 3 Pro in 48mp mode, theoretically I could print the size of a 43" TV and have a nice sharp image even if viewed from 18 inches away, right? Incidentally, 12mp is 4K...

Why can't TV and monitor manufacturers, and photo and video equipment manufacturers, and cinema etc just get together and get a bloody unified standard sorted???

I've looked at aspect ratios and the difference between cinema and TV and photos etc and it's a massive bloody ball ache. Anyways I digress...

Is my logic sound?


Thanks again everyone
 
I think it is subjective and also I'd be surprised if it is a linear relationship, but I've never checked this. My limit is usually around 150 pixels per inch for the size of prints I do - up to A3+

I am usually starting from the viewpoint of the size of my image in pixels and then dividing that by 300 pixels per inch for best resolution, or 200, or 150 depending on what I need it for (note pixels per inch is actually different from the dots per inch that a printer delivers).

If you look at billboards, their resolution is not good, but you never view them from close up, and far away they look fine. If it is a larger print, the chances are you are viewing it from further away, so can get away with a lower resolution. What is acceptable to one person may not be acceptable to another.
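That workflow in a quick Python sketch (the 4000 x 3000 image is just an example):

    # Maximum print size for a given image at a few common target resolutions
    def print_size(px_wide, px_high, ppi):
        return px_wide / ppi, px_high / ppi

    for ppi in (300, 200, 150):
        w, h = print_size(4000, 3000, ppi)
        print(f"{ppi} ppi -> {w:.1f} x {h:.1f} inches")
    # 300 ppi -> 13.3 x 10.0 inches
    # 200 ppi -> 20.0 x 15.0 inches
    # 150 ppi -> 26.7 x 20.0 inches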
 
"note pixels per inch is actually different from the dots per inch"
This is something I can't seem to get my head around.

I can't see how 100 dpi is different to 100 ppi.... If I divide an inch into 100 parts, the parts are the same size regardless of what I call them. Therefore I cannot see why 100 ppi on a TV is considered good but for a print, 100 dpi is considered poor. I get that distance plays a part in it but...
 
Riiiiight... Now this site seems to be working in a way that fits with my logic. It considers a 36" x 27" print, from a 12mp image (110ppi) to be great quality when viewed from 3' and superb when viewed from 6'


Take a look at the image analyser on this site. Quite a handy little tool I reckon.

 
This is something I can't seem to get my head around.

I can't see how 100 dpi is different to 100 ppi.... If I divide an inch into 100 parts, the parts are the same size regardless of what I call them. Therefore I cannot see why 100 ppi on a TV is considered good but for a print, 100 dpi is considered poor. I get that distance plays a part in it but...
I suspect it is in part due to the difference between a static image (print) and a moving image (TV) - with a print, we can look at the different parts of it, follow a leading line into the image, focus on various points of it, then return to the image as a whole. We can spend 5 minutes on what would be a single 1/25s frame in a TV show.
With a TV picture, it is constantly changing, so we follow the point of maximum attention - the small details are less significant - and our brain filters out small imperfections and 'fills in the gaps'.
 
That's pretty much exactly how I figured it and came to the conclusion that it's down to viewing distance... With that logic, to get to 300+ dpi I'd need 36+ mp which could then be viewed at 2ft, right? Or, 48mp at 1.5ft right?

Sooooo with my little Mini 3 Pro in 48mp mode, theoretically I could print the size of a 43" TV and have a nice sharp image even if viewed from 18 inches away, right? Incidentally, 12mp is 4K...
Had to look up what a Mini 3 Pro was, but to muddy the water a bit, not all 48Mp sensors are equal. Not sure what camera sensors you have available - camera, drone, phone etc - but try taking a picture of something with a lot of detail with each sensor, and the resolution/sensor size will have a big influence on the detail and sharpness from each sensor depending on viewing distance/magnification. I happen to have a phone with a 48Mp sensor, and I don't use it, because the quality, for me, is not really there for anything beyond displaying on the phone's screen.

I also have a compact camera with a 20Mp sensor, and a crop sensor DSLR also with a 20Mp sensor, and the latter gives the best image quality because of the size of the sensor, the better lenses used, and the better light gathering capability. The DSLR is the one to use if I think I may need to print/display large, or just want the best version of the scene.
Why can't TV and monitor manufacturers, and photo and video equipment manufacturers, and cinema etc just get together and get a bloody unified standard sorted???

I've looked at aspect ratios and the difference between cinema and TV and photos etc and it's a massive bloody ball ache. Anyways I digress...

Is my logic sound?


Thanks again everyone
There are, in effect, dpi/ppi 'standards' depending on the media used, and the expected viewing distance of the different media. For small to medium prints, 300dpi seems to be the usual for images that can be handheld, or viewed up close. As prints get larger, and people are not viewing so close, the dpi can be lowered if needed. Monitors are viewed from a certain distance, and TVs from further and further away as the screen gets larger, so the ppi can change to take that into account.
 
Hi all... Noob here and this is probably a stupid question but.

If 12 megapixels is 4000 x 3000 pixels and a 4k TV is 3840 x 2160 pixels (about 8 megapixels) and the image on my 42" 4k TV is razor sharp, why can I not enlarge a 12 megapixel photo to a print the size of my TV?

Thanks

Regards

You can. TVs are viewed from a distance at which they look sharp. The closer you move to the TV, the more pixelated the image becomes.

Let's assume that a 42" TV is 38.1" wide and 24.9" high (a number found online, but it may vary from TV to TV).

You have an image 4000 pixels wide, which means you need to print that image at approx 105ppi (4000 / 38.1). Just like your TV, the print would appear more pixelated close up but perfectly sharp when viewed from a distance.
 
This is something I can't seem to get my head around.

I can't see how 100 dpi is different to 100 ppi.... If I divide an inch into 100 parts, the parts are the same size regardless of what I call them. Therefore I cannot see why 100 ppi on a TV is considered good but for a print, 100 dpi is considered poor. I get that distance plays a part in it but...

It's very simple.

DPI is dots per inch. It is how many dots of ink the printer lays down on the printed page per inch. It is a printer resolution. A photo printer will often have a resolution of something like 4800dpi.

PPI is pixels per inch. Pixels are the measurement of a digital image.

People often confuse them or use dpi when they should be using ppi. E.g. @Box Brownie and @redhed17 both referred to dpi in their posts above when they should have said ppi. Just accept that when somebody refers to dpi, they usually mean ppi unless they are referring to the resolution of the printer itself.

Generally you don't need to take much notice of dpi when printing an image. Just select the photo quality or highest quality setting and the printer will deal with that.

If you want to view an image close up then you are generally looking to print at 300ppi (Canon printers) or 360ppi (Epson printers). Therefore in order to print your 4000px image at 300ppi you are limited to a 13.33" wide print.

There is nothing to stop you printing at a lower resolution though. At 200ppi you can print a 20" wide print. At 150ppi you can print a 26.6" wide print. The larger you go the lower the resolution of the print which may be visible when viewed close up, but larger images are generally viewed from further away.

Think of a large billboard as you drive down the motorway. These can be printed as low as 10ppi, yes just 10, but from the distance you view it from the image is perfectly acceptable.
 
I found this which explains it in a lot more detail than I did.

 
Riiiiight... Now this site seems to be working in a way that fits with my logic. It considers a 36" x 27" print, from a 12mp image (110ppi) to be great quality when viewed from 3' and superb when viewed from 6'
It seems your logic is right...

The way I think of it is (PPI)
300+ is for extremely critical viewing (i.e. with a printer's loupe)
200-300 is suitable for critical viewing (i.e. at distances less than the standard)
100-200 is suitable for casual viewing (at standard distance or greater)

FWIW, the CoC standard for image sharpness requires no more than 2MP for any size image when viewed from a distance equal to the image diagonal (i.e. 8x10 viewed at 12"). That's why none of the depth of field/acceptable sharpness calculators consider sensor resolution... every camera has more than enough to meet the requirement.
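If you want to see where a figure like that comes from, here's a rough Python sketch using the common convention that the acceptable circle of confusion is the image diagonal divided by 1500; on that assumption a 3:2 frame only needs around a megapixel of genuinely distinct detail at that viewing distance, comfortably under the 2MP quoted:

    import math

    # Rough sketch: detail demanded by the usual CoC convention (diagonal / 1500)
    # for a viewer at a distance roughly equal to the image diagonal.
    def distinct_spots(aspect_w=3, aspect_h=2, coc_divisor=1500):
        diag = math.hypot(aspect_w, aspect_h)
        across = coc_divisor * aspect_w / diag    # resolvable spots across the width
        down = coc_divisor * aspect_h / diag      # resolvable spots down the height
        return across * down

    print(round(distinct_spots() / 1e6, 2))       # ~1.04 million spots for a 3:2 frame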

The thing that everyone misses is that neither PPI nor DPI are a particularly good indication of the actual image resolution. E.g. if I post an image here on the forum the 1024px limit means the image can contain no more than 2MP of resolution. But it actually probably contains far less resolution/detail, because not everything in the image is/was pixel level sharp. If the image looks questionable I can make it smaller on my screen, which lowers the image's displayed resolution, because I am still seeing it at 122 PPI (that is what my monitor is set at).

Most images start to fail well before you reach the pixel resolution limit... where the image is printed/viewed large enough that the square shape of the pixels becomes visible. If pixel resolution is actually the limiting factor you can fix that by resampling; breaking the image up into more squares so they don't print visibly. Otherwise you need to determine the maximum print/display size by the physical size at which it starts to fail; not PPI (nor DPI).

People often confuse them or use dpi when they should be using ppi. E.g. @Box Brownie and @redhed17 both referred to dpi in their posts above when they should have said ppi. Just accept that when somebody refers to dpi, they usually mean ppi unless they are referring to the resolution of the printer itself.
Back when digital imaging was first starting it was done by scanning images into the digital format. The DPI setting of the scanner becomes the PPI resolution of the image... and that's why the 300 PPI/DPI thing first began. And in that application, saying you always need 300 minimum for decent quality makes some sense.
 
Yes, and for all practical purposes aren't they (ppi & dpi) synonymous in regard to resolution?

Then the context IMO becomes important: ppi is all about the digital presentation, whereas by the very nature of print technology it can only ever be dpi (none too sure how dpi relates to Contone printing... but that, I surmise, is a whole other aspect, compared to inkjet printing, to discuss/learn about?)
 
Ok so my conclusion is that it's all about viewing distance.

From a 48mp image I can print something as large as a 43" TV screen and it'll look superb from 3 ft to the average person... i.e. not a professional who's pixel peeping... i.e. if someone buys a print from me that size and hangs it in a pub or a hotel foyer or office waiting room etc it'll look superb.

That's good enough for me... For now...

Thanks everyone for your input and advice. It's very much appreciated, and thank you all for the welcome.
 
Don't forget the SIZE of the dots too. Just give it a try I reckon.
 
I read an article once about how good the human eye is at resolving detail. It was quite sciency and isn't available any more for me to link, but the crux of it was that for a human with 20/20 vision...

At a standard reading distance (for example 12") the eye can only resolve about 300 ppi.

At a standard monitor distance (for example 30") the eye can resolve about 115ppi.

At a TV viewing distance (for example 6 feet) the eye can resolve no better than 50ppi.

Looking at a cinema screen distance (about 40 feet) the eye can't do any better than 7ppi.
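Those figures are close to what you get from the common rule of thumb that 20/20 vision resolves about one arcminute, so here's a quick Python check of that assumption:

    import math

    # Resolvable pixels per inch vs viewing distance, assuming the eye
    # resolves roughly one arcminute (a common 20/20 rule of thumb).
    def resolvable_ppi(distance_inches):
        one_arcmin = math.radians(1 / 60)
        return 1 / (distance_inches * math.tan(one_arcmin))

    for d in (12, 30, 72, 480):                   # reading, monitor, TV, cinema distances
        print(d, round(resolvable_ppi(d)))
    # 12" -> ~286, 30" -> ~115, 72" -> ~48, 480" -> ~7 (close to the figures above)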
 
Just wondering. Do you really want to print a 48" photo? My printer is supposed to do 13"x38". Haven't tried it but 12"x24" comes out pretty nice! Gonna try a 12"x36" one of these days just to see what I get. If I don't like it I have the option of throwing it away! How big a printer do you have anyway?
 
The simple answer is you can, as long as you only look at the print from a similar distance and apply a healthy dose of sharpening and ideally interpolation. In a home setting this is usually not the case, i.e. you tend to be much closer at times, but something like a large billboard should work absolutely fine. You may find your life becomes a lot easier if you limit yourself to A1 or even better A2.
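For what it's worth, a quick Python sketch of the long-edge resolution at those sizes, assuming the 48Mp file is roughly 8000 x 6000 pixels and ignoring any aspect-ratio cropping:

    # Long-edge ppi of an (assumed) 8000px-wide image at A-series paper sizes
    paper_long_edge_in = {"A3+": 19.0, "A2": 23.4, "A1": 33.1}  # approximate inches

    for name, inches in paper_long_edge_in.items():
        print(f"{name}: {8000 / inches:.0f} ppi")
    # A3+: ~421 ppi, A2: ~342 ppi, A1: ~242 ppi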
 
Ok so my conclusion is that it's all about viewing distance.

From a 48mp image I can print something as large as a 43" TV screen and it'll look superb from 3 ft to the average person... i.e. not a professional who's pixel peeping... i.e. if someone buys a print from me that size and hangs it in a pub or a hotel foyer or office waiting room etc it'll look superb.

That's good enough for me... For now...

Thanks everyone for your input and advice. It's very much appreciated, and thank you all for the welcome.
You're missing a very important point.

A 4k movie or TV show is shot not just with a 4k sensor, but with a very sophisticated lens focussed on a much larger sensor.

Your camera has a lens that's possibly not quite as sharp as would be needed to take full advantage of a 4k sensor, and a sensor that's smaller than those used in broadcast recordings.

But the maths is fine - a 4k image from a decent camera would happily print a good image at TV sizes.
 