
Hello all,

 

I have a 23-inch Eizo screen with Full HD resolution (1920x1080), which is about 2 million pixels.

 

I have a Nikon D610, a 24-megapixel camera.

 

1) How can those 24 megapixels fit on a 2-megapixel screen?

 

I also have a 55-inch Samsung smart TV, also Full HD. I copy my Photoshop-retouched pictures from my PC onto a USB stick, which I then plug into the back of the TV for a nice slideshow (with music!).

 

The quality of the photos stays the same because both screens have the same Full HD resolution, but the pictures look nicer on the bigger screen (the TV).

 

Now, say I buy a D810, which delivers 36 megapixels.

 

2) Will I see any quality difference (in sharpness, for example) on either screen due to the higher resolution?

 

I am lost among screen sizes, screen resolutions and camera resolution.

 

I don't make any digital prints; I only print black and white in my darkroom.


To keep it simple, one image pixel isn't necessarily used to print or view that pixel. For example, you may have a printer that can output 1440 dots per inch, but you only need to send it roughly 240-300 pixels per inch. Or you may have a display and, more importantly, software like Photoshop that lets you zoom in and out of the image. At 100% (also known as 1:1), one image pixel is displayed with one screen pixel, so a high-resolution image appears greatly "zoomed in" on screen. Zooming out to see the entire image requires subsampling all the pixels, and that view is less accurate because of the subsampling. So the bottom-line answer is: it depends.
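To put numbers on the case in the original post (a roughly 6000 x 4000 pixel, 24 MP file on a 1920x1080 screen), here is a minimal arithmetic sketch; the figures are rounded, and the actual subsampling is left to the viewing software:

```python
# A minimal sketch (plain Python) of how a ~24 MP image maps onto a
# Full HD screen when "fit to screen", versus at 100% (1:1).

image_w, image_h = 6000, 4000      # ~24 MP frame, as discussed above
screen_w, screen_h = 1920, 1080    # Full HD display

# Fit-to-screen: scale so the whole image fits, preserving aspect ratio.
scale = min(screen_w / image_w, screen_h / image_h)
shown_w, shown_h = int(image_w * scale), int(image_h * scale)
print(f"Fit to screen: {shown_w} x {shown_h} "
      f"(~{shown_w * shown_h / 1e6:.1f} MP of the 24 MP actually reach the screen)")

# Roughly how many image pixels are averaged into each screen pixel:
print(f"~{1 / scale**2:.0f} image pixels per screen pixel when zoomed out")

# At 100% (1:1), only a screen-sized crop of the image is visible:
visible = (screen_w * screen_h) / (image_w * image_h)
print(f"At 100% you see about {visible:.0%} of the image at a time")
```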

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)



 

I am not very good with computers.

 

Let me put my question like this: does more camera resolution increase the viewing quality on my two different screens? Or, put another way: can one see, on a Full HD screen, the difference between a 16-megapixel picture and a 100-megapixel picture?


A monitor will give sharper images and more accurate color than a TV of the same size and resolution.

 

Monitors and television sets have different standards for color. A TV set interprets color signals according to NTSC standards, not the RGB used by computer monitors. NTSC is often described as "Never The Same Color twice." Much of the difference comes from the way TV video is processed to fit the available bandwidth and signal structure. There's a reason you never see red checkerboard patterns in your favorite sitcoms.

 

High-resolution images are resampled by the graphics drivers to fit the resolution of the screen. You have the option of displaying pixel-for-pixel in Photoshop (or Lightroom), for example, but then possibly not the entire image.

 

I'm using a 27" 5K iMac at the moment, which will display a 24 MP image with room to spare, and it does look sharper than a 2K (HD) monitor. In fact, I have to magnify images 3:1 before pixels in the image are large enough to discern. 42 MP (7K) images still have to be squeezed a bit to fit.



If the screen resolution is the limiting factor, you probably won't see any difference between a 16 MP image and 100 MP.

 

Larger screens with more resolution also make for a more practical workspace. For example, toolbars take up relatively less space on a larger, higher-resolution display, leaving more room for the image. Text is much cleaner, and remains legible at a smaller type size.

 

For what it's worth, HD (2K) is hardly the benchmark for displays anymore; 4K or higher is the new expectation.



 

So, if I understand you right: since the extra camera resolution can't be seen on screen, anything above 24 megapixels in my case is only an argument for printing. Resolution matters for printing, not for on-screen viewing? Correct?

 

One gets more image quality and sharpness, for example, when printing a 40x50 cm image (or larger) from a 40-megapixel picture than from a 24-megapixel picture. Correct?


A monitor will give sharper images and more accurate color than a TV of the same size and resolution.

My TV is 55 inches versus my 23-inch PC monitor, so they are not the same size.


So, if I understand you right: since the extra camera resolution can't be seen on screen, anything above 24 megapixels in my case is only an argument for printing. Resolution matters for printing, not for on-screen viewing? Correct?

In a photo editing program like Photoshop or Lightroom, you have the option of viewing an image 1:1, where one pixel in the image occupies one pixel on the screen. The higher the screen resolution, the more of the image you can view at one time. Call it "pixel peeping" or not, that's the way you can see as much detail as possible.

One gets more image quality and sharpness, for example, when printing a 40x50 cm image (or larger) from a 40-megapixel picture than from a 24-megapixel picture. Correct?

The more resolution in the image/camera, the greater the size you can print without resampling. You can resample to print larger than the native resolution would allow. That doesn't make it sharper, but does make the image pixels too small to see.

 

The average resolution of an unaided human eye is about 300 ppi at a distance of 10", or roughly 6000 pixels across the width of a 23" 16:9 screen. An HD display is 1920 x 1080 pixels, well within the ability of the eye to resolve (you can see individual pixels). A 55" HD display has the same pixel count, but you would normally view it from a greater distance. A 24 MP image is 6000 x 4000 pixels, so the HD monitor is the limiting factor.

 

My iMac is 5120 x 2880 pixels (5K), and I normally view it from about 15". Even with corrected vision, individual pixels cannot be seen at that distance. The iMac's resolution is still the limiting factor for a 24 MP image, as well as for a 42 MP image from a Sony A7R III (7952 x 5304 pixels).

 

For some reason, your eye is less forgiving of pixelation in a print than on a monitor. A print at HD resolution and 300 dpi is only 6" wide, compared to 20" for a 24 MP camera and 26" for 42 MP.
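For the curious, here is a back-of-the-envelope sketch of the print arithmetic quoted above, plus the 40x50 cm question asked earlier, assuming a 300 ppi print target:

```python
# Print sizes at an assumed 300 ppi target, plus the pixel count needed
# for a 40 x 50 cm print.

PPI = 300
CM_PER_INCH = 2.54

# Print width you get from a given image width without resampling:
for label, width_px in [("Full HD frame (1920 px wide)", 1920),
                        ("24 MP image (6000 px wide)", 6000),
                        ("42 MP image (7952 px wide)", 7952)]:
    print(f"{label}: {width_px / PPI:.1f} inches wide at {PPI} ppi")

# Pixels needed for a 40 x 50 cm print at the same target:
w_px = round(50 / CM_PER_INCH * PPI)   # long side
h_px = round(40 / CM_PER_INCH * PPI)   # short side
print(f"A 40 x 50 cm print at {PPI} ppi needs about {w_px} x {h_px} px "
      f"(~{w_px * h_px / 1e6:.0f} MP)")
```

That gives roughly 6.4", 20" and 26.5" for the widths, and about 28 MP for the 40x50 cm print at this assumed target, so 24 MP is close and 36-42 MP covers it comfortably.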



 

Thank you. I know that I can zoom to 100% in Photoshop.

 

My questioning started with my video lessons from LinkedIn, which are very well made and explained by Adobe professionals, by the way. You can download their exercise files, and there I saw images sharper and clearer than I had ever seen before. I decided I wanted the same quality.

 

What can I do: buy a higher-resolution camera body (36 megapixels) and/or higher-quality lenses? Mine date from the film era. They suit my D610 well, but they do not deliver pictures as crisp as the ones from the Adobe instructors, even BEFORE any retouching. Theirs are just amazing on my screen. I want the same!

 

(Remember, I just want to view my pictures, not print them)


What camera, lens? That's too much to cover in a brief response. You need to do your homework. Read reviews and look at samples from recognized professionals.

 

Any camera with 24 MP or more is a good place to start. I suggest a mirrorless camera, because the lenses are often better corrected than lenses for a DSLR, which must allow nearly 2" for mirror clearance. Some lenses, particularly Zeiss, have bright color and good acutance (micro contrast). In particular, mirrorless lenses are sharper in the corners than lenses designed primarily for film.

 

Sony A7R II + Zeiss Loxia 50/2, 1/50 s at f/5.6, ISO 100

[attached sample image]



 

Amazing, fabulous!!

 

One of their camera bodies is a big Canon with an L lens.


Photos that "pop" need to be well focused, with camera shake eliminated. Some DSLRs have "live view," which raises the mirror and shows the image from the actual sensor. Mirrorless cameras are 100% live view. Both cases offer precise manual focusing, much better than on a finder screen. Camera shake can be eliminated using a tripod, and greatly reduced using image stabilization in the lens and/or the camera.

 

Another way to create "pop" is with selective focus. Some lenses have smoother out of focus areas than others. The Loxia above, a Planar design, is pretty good, as are some DSLR lenses. The only way to tell is from sample images or personal experience.


 

For some reason, your eye is less forgiving of pixelation in a print than on a monitor. A print at HD resolution and 300 dpi is only 6" wide, compared to 20" for a 24 MP camera and 26" for 42 MP.

 

I've always been puzzled by this. Why do you need much less resolution on a monitor than a hard copy?


I've always been puzzled by this. Why do you need much less resolution on a monitor than a hard copy?

I can only speculate...

 

  • Whatcha' going to do about it? It [the monitor] is what it is.
  • You can put your eye up to a print more easily than to a monitor (e.g., with a magnifier) to pixel-peep. I zoom the monitor in order to read fine print or see fine details. It's neither practical nor useful to move closer than normal viewing distance.
  • Printer resolution is often high enough to resolve the individual pixels of an image with less than 300 ppi resolution, showing aliasing and staircasing on diagonals. Graphics drivers, by contrast, routinely apply anti-aliasing.
  • When using a computer, many elements in the software are vector based. You can zoom in on an image to the point where individual pixels can be distinguished, while text remains pixel-sharp.
  • A 27" 5K (5120 x 2880) iMac Retina display has a resolution of about 218 ppi, which is often considered adequate for inkjet prints, so it's hard to see a difference between screen and print in that respect (see the sketch after this list).
  • A display has more contrast than a print (compare a transparency with a print in film) and high acutance, making images look crisper than a print in hand. Images displayed on a 10" iPad Air (2224 x 1668) are like viewing an 8x10" transparency, and it's a lot easier to carry than a portfolio.
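Since most of the numbers in this thread come down to pixel density, here is a minimal sketch of how such figures are derived from a screen's diagonal and pixel dimensions (the diagonals are nominal, and the 10.5" figure for the iPad Air above is an assumption on my part):

```python
# Pixel density (ppi) from screen diagonal and pixel dimensions.
import math

def ppi(diagonal_inches, width_px, height_px):
    """Pixels per inch along the diagonal of a rectangular display."""
    return math.hypot(width_px, height_px) / diagonal_inches

displays = [
    ('23" Full HD monitor', 23.0, 1920, 1080),
    ('55" Full HD TV', 55.0, 1920, 1080),
    ('27" 5K iMac', 27.0, 5120, 2880),
    ('10.5" iPad Air', 10.5, 2224, 1668),
]
for name, diag, w, h in displays:
    print(f"{name}: about {ppi(diag, w, h):.0f} ppi")
```

That works out to roughly 96 ppi for the 23" monitor, 40 ppi for the 55" TV, 218 ppi for the 5K iMac and 265 ppi for the iPad.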



Monitors and television sets have different standards for color.

What standards? If they all followed standards that produced the same color, we wouldn't need to calibrate them, and we wouldn't need software that allows a vast number of calibration aim points (targets): the myriad white points, or backlight intensities (cd/m^2). And yet we do need these differing settings for calibration, and many of us calibrate the same displays differently.

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


With television, color is an afterthought, woven into the raster between the lines. The calibration process, such as it is, is completely different than for a computer monitor.

Completely different and, again, sans a standard, or they would all behave the same. Walk into any electronics store with dozens of differing models, ALL getting the identical RGB signal, and one can clearly see they are all over the map in output behavior.

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


Television sets don't take an RGB signal. It's something different, like composite (BNC) or component (3x RCA). At home I use HDMI or SDI, but those aren't generally suitable for distribution, and they're beyond the pay grade of department stores. Video calibration involves a blue screen and a simple bar chart for hue, saturation and luminance. I haven't tried to match it with an RGB-calibrated monitor, and I don't think it's worth my time. As long as faces don't look like Martians, I'm good.

 

Monitors, scanners and printers are another matter.


I concede that some HD sets have RGB inputs, presumably for use with computers, and can be calibrated as such. That's probably not how signals are distributed in a store. A composite signal via BNC connectors is more likely.

 

The "stamdards" I refer to are for broadcast, not receiving. I record HD video to meet the ITU709 color standard.. That's as good as I can do. I have no control over what others do with that video, any more than if I distribute still images from a calibrated monitor. TV sets have a lot of latitude for adjustment (or mis-adjustment). It doesn't surprise me when no two sets look alike in a store.

 

A 55" HD television (1920x1080) has a screen width of about 49", which gives a resolution of about 50 ppi. You would have to sit abut 5' from the screen before individual pixels would be too small to discern. A 4K set would have 200 ppi resolution, which you could use from 2' or so, but you'd be turning your head a lot. Even 27" can be a strain after an hour or so. I take frequent breaks.


Exactly. Perhaps we have different concepts of what constitutes a "standard" and "calibration."

 

Calibration consists of comparing the device under test with a standard of known accuracy traceable to a fundamental principle, whether a reference standard (e.g., the standard kilogram), a transfer standard, or an industry consensus (e.g., ISO film speed). Calibration, per se, is not adjustment, just a comparison to a measurable degree of accuracy. Adjustments are made to bring the device under test into compliance with the standard. If necessary, the device under test is compared to that standard during or after these adjustments, leaving it in compliance before returning it to service.

 

Obviously, a device which is not calibrated and adjusted does not necessarily conform to the standard. Adjustments are made only if the device under test falls outside the tolerance for its intended use. In my experience (metrology), transfer (intermediate) standards are seldom adjusted pursuant to calibration against a higher standard; if they needed regular adjustments, that would affect every calibration performed with that instrument. TV sets and monitors do not rise to the level of a standard, by that definition. A color calibration device (e.g., an X-Rite ColorMunki) is a transfer standard, along with a color patch chart projected on the TV or monitor and its accompanying data file.

 

In this context, whether a device conforms to manufacturing standards (tolerances) is misleading. Basically, it only means that the device can be adjusted to conform to a consensus standard (e.g., for color). It's not certain that a consumer TV set even meets that loose definition.


Calibration is simply putting a device into a desired behavior and keeping it in that behavior; it doesn't have to have anything to do with "standards" or "accuracy".

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


Calibration is the comparison between the device under test and a traceable standard. Adjustment is the process to bring that device within acceptable limits with respect to that standard. In principle, if the device is within tolerance, no adjustments are required, and adjusting anyway may adversely affect the long-term accuracy. Acceptable tolerances are established based on process requirements (e.g., color accuracy) over a reasonable time interval. That interval is based on the process requirements, including economics, and the stability of the instrument under test.

 

This definition may be different from "common knowledge," but it is scientifically accurate. The most common confusion is between calibration and adjustment (twiddling, to insert a non-scientific term).

 

Color calibration procedures usually offer a comparison before and after adjustments are made, before committing to the new profile. Some software offers a comparison of the before and after profiles in graphical form. If the instrument (monitor) drifts over time, or if the cost of out-of-tolerance (OOT) product (images or video) is high, it must be calibrated and adjusted (if necessary) more frequently. If it is consistently out of tolerance, it may not be suitable for its intended use.


Calibration is the comparison between the device under test and a traceable standard.

No, it's not, at least not necessarily, and often not in practice. The facts show this idea is false.

 

I have a SpectraView display. It allows multiple calibrations internally using its hardware, software and a measuring device (a colorimeter). I can and do calibrate for differing papers on differing printers to match them visually, and that requires differing calibrations: differing white point settings (the papers' whites differ), differing contrast ratios (matte vs. glossy). There is NO standard to set. One produces the calibration for a goal, a desired result: a visual match to the print. This has absolutely nothing to do with accuracy either. To define accuracy you need a reference (what you are gauging accuracy against) and measurements of what you hope to compare, to produce an accuracy metric (in this discussion, that's called deltaE). You cannot measure a print and compare its accuracy to a display! You can produce a visual match, which again has absolutely nothing to do with color accuracy. If you wish to understand what color accuracy is, there's a video for that:

 

Delta-E and color accuracy

In this 7-minute video I cover: what Delta-E is and how we use it to evaluate color differences, and color accuracy: what it really means and how we measure it using ColorThink Pro and BabelColor CT&A. This is an edited subset of a video covering RGB working spaces from raw data (sRGB urban legend, Part 1).

High-resolution version: http://digitaldog.net/files/Delta-E%20and%20Color%20Accuracy%20Video.mp4

 

Again, calibration can have absolutely nothing to do with any provided standards or with color accuracy. Epson printers are calibrated at the factory to an undefined behavior. There is no way for the user to calibrate them; you can profile the behavior. You might, with some 3rd-party product, linearize them, which is one part of a full calibration process. But again, out of the box, you cannot calibrate them, nor need to. The same is true, in this discussion, of displays. Out of the box, their behavior all differs, often hugely. Just the difference between the same make and model of display producing 85 cd/m^2 vs. 150 cd/m^2 is massive. Neither is by default calibrated to any standard. Then we have the huge differences in calibration for contrast ratio. You can buy a display with a 1200:1 ratio, and you can calibrate it for 300:1 or 150:1 or 1000:1 IF your calibration software (and ideally hardware) provides that control over calibration. Again, NO standard. It is what it is: a calibration for a desired result. You recalibrate to reproduce that behavior in the future because displays are non-stable devices. You can then compare how that display calibration differed from the last time you calibrated it.

 

So your idea that displays follow some standard of color or accuracy is simply not correct, as noted, if you view them (or TVs) all getting the same signal. If you want to match them to something else, or to each other, you might get close if you match to the lowest common denominator: you can't calibrate a display that can't reach 250 cd/m^2 to one that can, or a display that can't reach 1100:1 to a display that can. You can't make (calibrate) an sRGB-gamut display match a wide-gamut display.

The REASON good display calibration software offers huge granularity in setting white point, cd/m^2, contrast ratio and even the TRC is that no one set of values is necessarily ideal for every calibration goal. Which, again, has absolutely nothing to do with color accuracy. If you attempt to calibrate four displays to match each other, and you measure X number of color patches on the first as a reference (in Lab), then and only then can you measure the others and produce an accuracy report of how well those others match the first.

Accuracy takes TWO sets of measurements! Without both, it's like one hand clapping; it doesn't work at all.
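To make that concrete, here is a minimal sketch, with made-up patch values, of comparing two sets of Lab measurements; it uses the simple CIE76 formula, while real reports often use deltaE 2000:

```python
# Minimal sketch: deltaE (CIE76) between two sets of Lab measurements.
# Accuracy can only be reported once a second set of measurements exists
# to compare against the reference set.
import math

def delta_e_76(lab1, lab2):
    """Euclidean distance in Lab space (CIE76 deltaE)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical patch measurements (L*, a*, b*) from two displays.
reference = [(50.0, 0.0, 0.0), (70.0, 20.0, -10.0), (30.0, -15.0, 25.0)]
measured  = [(50.5, 1.0, -0.5), (69.0, 22.0, -9.0), (31.0, -14.0, 27.0)]

errors = [delta_e_76(r, m) for r, m in zip(reference, measured)]
print(f"average deltaE: {sum(errors) / len(errors):.2f}, "
      f"max deltaE: {max(errors):.2f}")
```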

 

Color calibration procedures usually offer a comparison before and after adjustments are made, before committing to the new profile.

The profile defines device behavior, calibrated or otherwise. It doesn't calibrate anything. What you're comparing visually (and again, this has nothing to do with accuracy) is two previews of two calibration behaviors. Nothing more.

And you're missing the massive difference between color accuracy (what you want, based on a reference, versus what you get) and trending (what you measured today vs. what you measured 2 months ago, 4 months ago, etc.). NOT the same. Both can be reported in deltaE (color distance, thus difference), but they tell us completely different things about what we measured.

 

Yes, there is indeed some confusion about calibration here; hopefully the colorimetric facts and the video clear it up. :rolleyes:

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)
