
Image resolution vs. display resolution - quality?



I've never been quite clear on whether it's necessary to have higher resolution on an image than the display being used. It seems like quality increases if the image has about 2X the number of pixels as the display, but why shouldn't it be OK when the pixels are exactly equal? Or, is it?

Everything I've ever read on image sizing says that the image size should match the (maximum) screen resolution on which it might be displayed. I've never read anything that suggests that "quality increases if the image has about 2X the number of pixels as the display". But I've never done any experiments. It is true that on some social media platforms (Facebook) the quality is sometimes better if the image is slightly larger than the standard display size.

 

My laptop screen resolution is 1920 x 1080 ("HD 1080p"). Some screens have higher resolutions: 2560 x 1440, 3200 x 1800 or even 3840 x 2160 ("4K").

 

For posting to Photo.net, etc., I usually resize to about 1800 pixels on the long side, so 1800x1200 (3:2) or 1600x1200 (4:3).
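For anyone who wants to script it, here's a minimal sketch of that kind of resize using Pillow in Python; the file names are placeholders and the 1800-pixel long edge is just the figure above:

```python
from PIL import Image

def resize_long_side(src_path, dst_path, long_side=1800):
    """Scale an image so its longer edge is `long_side` pixels, keeping the aspect ratio."""
    with Image.open(src_path) as im:
        scale = long_side / max(im.size)
        if scale < 1:  # only shrink; leave smaller images untouched
            im = im.resize((round(im.width * scale), round(im.height * scale)), Image.LANCZOS)
        im.save(dst_path, quality=92)

# a 6000x4000 (3:2) frame becomes 1800x1200; a 2400x1800 (4:3) frame becomes 1800x1350
resize_long_side("original.jpg", "web_1800.jpg")
```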

 

Mike

 

 


I always display images in the display's native resolution if at all possible. Anything else and you are at the mercy of the upscaling/downscaling algorithms used by the software/hardware.

 

My television, for example, has a particularly poor downscaling implementation, guaranteed to produce stairstepping on diagonal lines. So I output images in 1920x1080 if they're going to be displayed that way. Or rather, I should say I output images with a maximum size of 1920x1080; I don't crop to suit the display.

 

For clients, I output 800x600, 1368x720, 1920x1080, and whatever the full image size is after cropping (for printing), unless they ask for something specific. I'll probably drop 720p for 4K soon, though a lot of big displays are still 720p.
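A rough sketch of that export step with Pillow in Python, assuming a fit-inside-the-box resize with no cropping (the size list mirrors the ones above; the file names are placeholders):

```python
from PIL import Image

# Bounding boxes from the list above; each export is fitted inside its box
# (aspect ratio preserved, no cropping), matching the "max size" approach.
CLIENT_SIZES = [(800, 600), (1368, 720), (1920, 1080)]

def export_client_sizes(src_path):
    for max_w, max_h in CLIENT_SIZES:
        with Image.open(src_path) as im:
            im.thumbnail((max_w, max_h), Image.LANCZOS)  # shrinks to fit, never enlarges
            im.convert("RGB").save(f"export_{max_w}x{max_h}.jpg", quality=92)

export_client_sizes("cropped_master.tif")
```

Image.thumbnail only ever shrinks, so a source smaller than a given box is left alone rather than upscaled.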

 

I'd rather spend the extra CPU cycles to resize than have my images look like poo because of a poor display implementation.


The simplistic answer is that if images are displayed pixel for pixel on the monitor, the results can't get any sharper. It's a little more complicated than that, though. While pixel-for-pixel display might be okay for a website like Photo.net, when editing or viewing it's better to have images presented at a convenient size so you can better see the contents. Secondly, most digital cameras produce images too large to see in their entirety on some monitors. Monitors are getting better too, so that individual pixels (each a cluster of three dots) are well below the threshold of human vision. In general, you don't want to see individual pixels, except in special circumstances.

 

Case in point: I'm working on a 27", 5K iMac and shooting with 24 MP or 42 MP (Sony) cameras. In either case, the image must be downsized to fit on the screen. Most of the time the usable area is defined by the editing or viewing program, e.g., Photoshop or Lightroom, and the image must be reduced to well below its maximum size. Secondly, if I view the image at 1:1 magnification (e.g., for dust spotting or critical evaluation), the pixels are still too small for me to see. As a practical matter, I need to magnify the image pixels 3:1 before I can see granularity. That said, I can't see any artifacts produced as a result of this resampling process.

 

Printing is another issue. A typical inkjet printer, at 600 dpi, has three or four times the resolution of my 5K screen. If you print a 24 MP image larger than 8"x10", you may have to resample the image so that individual pixels can't be seen on close examination. Upsizing an image requires compromises between increasing the pixel count (e.g., to 300 ppi at the print size) and how well details in the image are interpolated at the new size. There are several algorithms for resizing, including bicubic (e.g., Photoshop/Lightroom) and fractal (e.g., ImagePrint). You can also do the resampling in several steps, which is recommended by some in the Photo.net community. I'm not sure it makes a difference until you approach mural size. Besides, it takes extraordinary effort and lenses of the highest quality to be "pixel sharp" at 24 MP, much less at 40+.
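For anyone who wants the arithmetic, a rough Python sketch: the ppi figures assume a 6000x4000 (24 MP) frame, and the step-resampling helper is just one way to do the "several steps" idea with Pillow's bicubic filter, not anyone's specific recipe:

```python
from PIL import Image

# Effective print resolution of a 24 MP frame, assumed 6000x4000 (3:2):
for w_in, h_in in [(12, 8), (18, 12), (36, 24)]:
    ppi = min(6000 / w_in, 4000 / h_in)
    print(f'{w_in}x{h_in}": {ppi:.0f} ppi')   # 500, 333, 167 ppi

# One way to resample "in several steps": enlarge ~10% per pass with bicubic
# interpolation until the long edge nears the target, then land exactly on it.
def upsize_in_steps(im, target_long_edge, step=1.10):
    while max(im.size) * step < target_long_edge:
        im = im.resize((round(im.width * step), round(im.height * step)), Image.BICUBIC)
    scale = target_long_edge / max(im.size)
    return im.resize((round(im.width * scale), round(im.height * scale)), Image.BICUBIC)
```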


When I create 4K video slideshows for my 75" 4K UHDTV, the program (Adobe Premiere Elements) lets me create a 3840x2160 frame size to match the screen's 4K pixel count (about 8 MP). I also create HD 2K at 1920x1080 in case I want to give the file to someone who doesn't have a 4K TV, only an older 1080p 2K set. Of course, both 2K and 4K files can be played on a 1920 monitor and both look the same there. I've also uploaded the 4K version to YouTube, which downsamples to lower resolutions depending on the viewer's bandwidth and computer system or TV.


Video reproduction is a special case. While all 4K video has roughly the same number of pixels, about 8 MP (DCI is 4096x2160 rather than UHD's 3840x2160), not all 4K is the same. The difference in clarity between a Super-35 cinema camera and a typical 1/3" to 1" sensor camera is striking. Just as telling is the difference between long-GOP and all-intra (I-frame-only) compression (I'm experimenting with uncompressed raw at the moment). There is somewhat less difference between interlaced and progressive video, even on a professional monitor with twice the bandwidth of a living-room television. While 4K is fairly typical for computer monitors, it's still well below the resolution of an inkjet printer. Considering the relatively slow shutter speeds used for video (blur is more watchable than stroboscopic effects), 4K video doesn't look that great to me on close examination.

Of course, both 2K and 4K files can be played on a 1920 monitor and both look the same there.

 

As per my post above, that depends on the algorithms used to resize the image. I don't know what my TV uses, but the manufacturer obviously decided to skimp on CPU cycles. There's no way of knowing what any given image-viewing application will use without digging through the specs, preferences, or code.
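A quick way to see how much the filter matters is to downscale the same frame two ways and compare; a Pillow sketch (the file name is a placeholder):

```python
from PIL import Image

# Downscale the same frame with two filters; the filter choice is exactly
# the kind of thing a TV or viewer application decides for you.
with Image.open("test_frame.jpg") as im:
    target = (1920, round(im.height * 1920 / im.width))  # fit a 1920-wide screen
    im.resize(target, Image.NEAREST).save("nearest.png")  # crude; expect jagged diagonals
    im.resize(target, Image.LANCZOS).save("lanczos.png")  # slower, much cleaner
```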


A 6 TB drive costs about $190 (server quality) and has enough capacity to hold all of my images twice over. Those images will be uncompressed raw until some politician decides otherwise (then I'll look for a new politician). In fact, they now reside on a 20 TB RAID, which I am incrementally updating by replacing its 4 TB drives with 6 TB drives as the need arises.
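As a rough back-of-envelope (the ~80 MB per file is only my ballpark for an uncompressed 42 MP raw, not a measured figure):

```python
# Ballpark only: assume ~80 MB for an uncompressed 42 MP raw file.
RAW_MB = 80
DRIVE_TB = 6
print(f"~{DRIVE_TB * 1_000_000 / RAW_MB:,.0f} raw files per {DRIVE_TB} TB drive")  # ~75,000
```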

 

Memory is cheap. Compression is forever.
