Does size matter?

Discussion in 'Digital Darkroom' started by Jean-Claude, Apr 16, 2019.

  1. Hello all,

    I have a 23" Eizo screen with full HD resolution (1920x1080), about 2 million pixels.

    I have a D610, a 24-million-pixel camera.

    1) How can those 24 million pixels fit on a 2-million-pixel screen?

    I have a smart Samsung 55" TV, also full HD. I copy my Photoshop-retouched pictures from my PC onto a USB stick, which I then plug into the back of my TV for a nice slideshow (with music!).

    The quality of the photos remains the same, due to the identical full-HD resolution of both screens, but the pictures are nicer because they are shown on a bigger screen (the TV).

    Now, say I buy a D810 camera, which delivers 36 MP.

    2) Will I see any quality difference (in sharpness, for example) on either screen due to the higher resolution/definition?

    I am lost with screen sizes, resolutions and camera definition.

    I don't make any digital prints, only black and white in my big darkroom.
     
  2. digitaldog (Andrew Rodney)

    To keep it simple: one image pixel isn't necessarily mapped to one printer dot or one screen pixel. For example, you may have a printer that can output 1440 dots per inch, but one only needs to send it, say, 240-300 pixels per inch. Or you may have a display and, more importantly, software like Photoshop that lets you zoom in and out of the image. At 100% (also known as 1:1), one image pixel is displayed with one screen pixel, so a high-resolution image appears greatly "zoomed in" on screen. Zooming out to see the entire image requires subsampling all the pixels, and that view is less accurate due to the subsampling. So the bottom-line answer is: it depends.
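
    For anyone who wants to put numbers on that, here is a minimal Python sketch of the zoom arithmetic, using the figures from this thread (a 24 MP D610 file on a full-HD screen). It is only an illustration, not what Photoshop actually runs:

        # Zoom arithmetic for a 24 MP image on a full-HD screen.
        image_w, image_h = 6000, 4000      # 24 MP image (e.g. a D610 file)
        screen_w, screen_h = 1920, 1080    # full-HD display

        # At 100% (1:1), one image pixel maps to one screen pixel,
        # so only a crop of the image is visible at once:
        visible = (screen_w * screen_h) / (image_w * image_h)
        print(f"Visible at 100%: {visible:.1%} of the image")   # ~8.6%

        # To show the whole image, it must be scaled down (subsampled):
        fit = min(screen_w / image_w, screen_h / image_h)
        print(f"Fit-to-screen scale: {fit:.2f} "
              f"(~{1 / fit**2:.0f} image pixels per screen pixel)")   # ~14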
     
  3. I am very bad with computers.

    Let me put my question like this: does more camera resolution (definition?) increase the viewing quality on my two different screens? Or put another way: can one see, on a full-HD screen, the difference between a 16 MP picture and a 100 MP picture?
     
  4. A computer monitor will give sharper images and more accurate color than a TV of the same size and resolution.

    Monitors and television sets have different standards for color. A TV set interprets color signals according to NTSC standards, not the RGB used by monitors. NTSC is often described as "Never The Same Color twice." Much of the difference is in the way TV video is processed to fit into the available bandwidth and signal structure. There's a reason you never see red checkerboard patterns in your favorite sitcoms.

    High-resolution images are resampled by the graphics drivers to fit the resolution of the screen. You have the option of displaying pixel-for-pixel in Photoshop (or Lightroom), for example, but possibly not the entire image.

    I'm using a 27" 5K iMac at the moment, which can display most of a 24 MP image pixel-for-pixel, and it does look sharper than a 2K (HD) monitor. In fact, I have to magnify images 3:1 before pixels in the image are large enough to discern. 42 MP images (7952 pixels wide) have to be squeezed a bit more to fit.
     
  5. If the screen resolution is the limiting factor, you probably won't see any difference between a 16 MP image and 100 MP.

    Larger screens with more resolution also have practical benefits. For example, tool bars take relatively less space on a larger display with more resolution, leaving more room for the image. Text is much cleaner, and remains legible in a smaller typeface.

    For what it's worth, HD (2K) is hardly the benchmark for displays anymore. 4K or higher is the new expectation.
     
  6. So if I understand you right, since the extra camera resolution can't be seen on screen, resolution higher than 24 MP is, in my case, only an argument for printing. Resolution matters for printing, not for on-screen viewing. Correct?

    One gets more image quality and sharpness, for example, printing a 40x50 cm image (or larger) from a 40 MP picture than from a 24 MP picture. Correct?
     
  7. My TV monitor is 55" versus 23" for my PC.
     
  8. In a photo editing program like Photoshop or Lightroom, you have the option of viewing an image 1:1, where one pixel in the image occupies one pixel on the screen. The higher the screen resolution, the more of the image you can view at one time. Call it "pixel peeping" or not, that's the way you can see as much detail as possible.
    The more resolution in the image/camera, the greater the size you can print without resampling. You can resample to print larger than the native resolution would allow. That doesn't make it sharper, but does make the image pixels too small to see.

    The average resolution of an unaided human eye is about 300 ppi at a distance of 10", or about 6000 pixels across the width of a 23" screen (16:9 aspect ratio). An HD display is 1920 x 1080 pixels, well within the ability of the eye to resolve (you can see individual pixels). A 55" HD display has the same resolution, but you would normally view it from a greater distance. A 24 MP image is 6000 x 4000 pixels, so the HD monitor is the limiting factor.
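
    Here is a minimal Python sketch of that arithmetic, using the rule of thumb above (about 300 ppi resolvable at 10", falling off linearly with distance); it is an approximation, not a precise standard:

        # Screen ppi versus the eye's resolving limit at a given distance.
        def screen_ppi(px_wide, diag_in, aspect=(16, 9)):
            """Pixels per inch, from width in pixels and diagonal in inches."""
            w, h = aspect
            width_in = diag_in * w / (w * w + h * h) ** 0.5
            return px_wide / width_in

        def eye_limit_ppi(distance_in):
            """Finest pixel pitch the eye can distinguish at this distance."""
            return 300 * 10 / distance_in

        print(screen_ppi(1920, 23))   # ~96 ppi for a 23" full-HD monitor
        print(eye_limit_ppi(10))      # 300 ppi: the eye out-resolves it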

    My iMac is 5120 x 2880 pixels (5K), and I normally view it from about 15". Even with corrected vision, individual pixels cannot be seen at that distance. The iMac's resolution is still the limiting factor for a 24 MP camera, as well as for a 42 MP image from a Sony A7Riii (7952 x 5304 pixels).

    For some reason, your eye is less forgiving of pixelation in a print than on a monitor. A print from an HD frame at 300 dpi is only about 6.4" wide, compared to 20" for a 24 MP camera and 26.5" for 42 MP.
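
    The print-size figures are just pixel width divided by print resolution; a quick check in the same vein:

        # Largest print width without resampling: pixels / ppi.
        for label, px in [("HD frame", 1920), ("24 MP", 6000), ("42 MP", 7952)]:
            print(f"{label}: {px / 300:.1f} in wide at 300 ppi")
        # HD frame: 6.4 in, 24 MP: 20.0 in, 42 MP: 26.5 in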
     
  9. Thank you. I know that I can zoom to 100% in PS.

    My questioning started with my video lessons from LinkedIn, very well made and explained by Adobe professionals, by the way. You can download their exercise files, and there I saw sharper and clearer images than I had ever seen before. I decided I want the same quality.

    What can I do: buy a higher-resolution camera body (36 MP) and/or higher-quality lenses? Mine date from the film era. They suit my D610 well but do not deliver pictures as crisp as the ones from the Adobe guys BEFORE any retouching. Those are just amazing on my screen. I want the same!

    (Remember, I just want to view my pictures, not print them)
     
  10. What camera, lens? That's too much to cover in a brief response. You need to do your homework. Read reviews and look at samples from recognized professionals.

    Any camera with 24 MP or more is a good place to start. I suggest a mirrorless camera, because the lenses are often better corrected than lenses for a DSLR, which must allow nearly 2" for mirror clearance. Some lenses, particularly Zeiss, have bright color and good acutance (micro contrast). In particular, mirrorless lenses are sharper in the corners than lenses designed primarily for film.

    Sony A7Rii + Zeiss Loxia 50/2, 1/50 s at f/5.6, ISO 100
     
  11. Amazing, fabulous!!

    One of the camera bodies is a big Canon with an L lens.
     
  12. Photos that "pop" need to be focused well and camera shake eliminated. Some DSLRs have "live view", with the mirror up, viewing the image from the actual sensor. Mirrorless cameras are 100% live view. Both cases offer precise manual focusing, much better than on a finder screen. Camera shake can be eliminated using a tripod, and greatly reduced using image stabilization in the lens and/or the camera.

    Another way to create "pop" is with selective focus. Some lenses have smoother out of focus areas than others. The Loxia above, a Planar design, is pretty good, as are some DSLR lenses. The only way to tell is from sample images or personal experience.
     
  13. I've always been puzzled by this. Why do you need much less resolution on a monitor than a hard copy?
     
  14. I can only speculate...

    • Whatcha' going to do about it? It [the monitor] is what it is.
    • You can put your eye up to a print more easily than to a monitor (e.g., with a magnifier) to pixel-peep. I zoom the monitor in order to read fine print or see fine details. It's neither practical nor useful to move closer than normal viewing distance.
    • Printer resolution is often high enough to resolve individual pixels in an image with less than 300 ppi resolution, showing aliasing and staircasing of diagonals. Graphics drivers, by contrast, often apply anti-aliasing when scaling images for the screen (a rough demo follows this list).
    • When using a computer, many elements in the software are vector based. You can zoom in on an image to the point where individual pixels can be distinguished, while text remains pixel-sharp.
    • An iMac 27" 5K (5120 x 2880) Retina display has a resolution of 216 ppi, which is often considered adequate for inkjet prints. It's hard to see the difference between screen and print in that respect.
    • A display has more contrast than a print (cf. transparency vs. print in film) and high acutance, making images look crisper than a print in hand. Images displayed on a 10" iPad Air (2224 x 1668) are like viewing an 8x10" transparency, and it's a lot easier to carry than a portfolio.
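
    To make the anti-aliasing point concrete, here is a rough Pillow (PIL) demo of naive subsampling versus a filtered resize. It is a generic sketch, not what any particular graphics driver does:

        # Downsample a fine checkerboard: the worst case for aliasing.
        from PIL import Image

        side = 1000
        src = Image.new("L", (side, side))
        src.putdata([255 * ((x + y) % 2) for y in range(side) for x in range(side)])

        # Nearest-neighbour keeps isolated pixels: the fine pattern
        # "aliases" into a coarse pattern that isn't in the scene.
        naive = src.resize((200, 200), Image.NEAREST)

        # A Lanczos resize averages neighbours first (anti-aliasing),
        # giving the smooth mid-grey the eye would actually see.
        filtered = src.resize((200, 200), Image.LANCZOS)

        naive.save("naive.png")
        filtered.save("filtered.png")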
     
  15. digitaldog (Andrew Rodney)

    What standards? If they all followed standards to produce the same color, we wouldn't need to calibrate them and use software that allows a vast number of calibration aim points (targets), like the myriad of white points or backlight intensities (cd/m^2). And yet we do need these differing settings for calibration, and many of us calibrate the same displays differently.
     
  16. With television, color is an afterthought, woven into the raster between the lines. The calibration process, such as it is, is completely different than for a computer monitor.
     
  17. digitaldog (Andrew Rodney)

    Completely different and, again, sans a standard, or they would all behave the same. Walk into any electronics store with dozens of differing models, ALL getting the identical RGB signal, and one can clearly see they are all over the map in output behavior.
     
  18. Television sets don't take an RGB signal. It's something different, like Composite (BNC) or Component (3x RCA). At home I use HDMI or SDI, but those aren't generally suitable for distribution, and are beyond the pay grade of department stores. Video calibration involves a blue screen and a simple bar chart for hue, saturation and luminance. I haven't tried to match it with an RGB-calibrated monitor, and I don't think it's worth my time. As long as faces don't look like Martians, I'm good.

    Monitors, scanners and printers are another matter.
     
    Last edited: Apr 17, 2019
  19. digitaldog (Andrew Rodney)

    Of course they do.
    Don't digress from that comment about standards either. :rolleyes:
     
    Last edited: Apr 17, 2019
  20. I concede that some HD sets have RGB inputs, presumably for use with computers, and can be calibrated as such. That's probably not how signals are distributed in a store. A composite signal via BNC connectors is more likely.

    The "stamdards" I refer to are for broadcast, not receiving. I record HD video to meet the ITU709 color standard.. That's as good as I can do. I have no control over what others do with that video, any more than if I distribute still images from a calibrated monitor. TV sets have a lot of latitude for adjustment (or mis-adjustment). It doesn't surprise me when no two sets look alike in a store.

    A 55" HD television (1920x1080) has a screen width of about 48", which gives a resolution of about 40 ppi. You would have to sit about 6' from the screen before individual pixels become too small to discern. A 4K set of the same size would have about 80 ppi, which you could use from 3' or so, but you'd be turning your head a lot. Even 27" can be a strain after an hour or so. I take frequent breaks.
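
    Checking those figures with the same ~300 ppi at 10" rule of thumb used earlier in the thread:

        # 55" screen: ppi, and the distance at which pixels blend together.
        def ppi(px_wide, diag_in, aspect=(16, 9)):
            w, h = aspect
            return px_wide / (diag_in * w / (w * w + h * h) ** 0.5)

        for name, px in [("HD", 1920), ("4K", 3840)]:
            p = ppi(px, 55)
            print(f'55" {name}: {p:.0f} ppi, blends at ~{3000 / p / 12:.1f} ft')
        # 55" HD: 40 ppi, ~6.2 ft;  55" 4K: 80 ppi, ~3.1 ft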
     
