Who can explain the relationship.....

Discussion in 'Casual Photo Conversations' started by fotolopithecus, Mar 4, 2018.

  1. Between pixel density and resolution. I ask because I've been told that the greater the pixel density, the greater the ability a sensor has to resolve fine detail. That would appear on its face not to be true, since a Nikon D810 has more resolving power than a Nikon D7200, which has greater pixel density, and by a fair amount. If the D7200, for example, were scaled up to full frame, it would have roughly 50 MP density-wise. We need this issue resolved once and for all, because the effect of it appears to be that nobody knows nothing. ;)
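The "scaled up to full frame" arithmetic can be checked in a few lines; the sensor dimensions and pixel counts below are the commonly published figures, so treat them as approximations:

```python
# Pixel density comparison, assuming published specs:
#   D7200 (DX): 24.2 MP on a 23.5 x 15.6 mm sensor
#   D810 (FX):  36.3 MP on a 35.9 x 24.0 mm sensor

def density_mp_per_mm2(megapixels, width_mm, height_mm):
    """Pixel density in megapixels per square millimetre."""
    return megapixels / (width_mm * height_mm)

dx_density = density_mp_per_mm2(24.2, 23.5, 15.6)
fx_density = density_mp_per_mm2(36.3, 35.9, 24.0)

# Megapixels a full-frame sensor would carry at the D7200's pixel density
equivalent_mp = dx_density * (35.9 * 24.0)

print(dx_density > fx_density)   # True: the DX sensor is denser
print(round(equivalent_mp, 1))   # about 57 MP, in the ballpark of "roughly 50" above
```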
  2. Digital resolution is usually measured by the total number of pixels on a side, rather than pixels/inch. An image with the same field of view will require more enlargement for the same size print from an APS-C camera (e.g., D7200) than a full frame camera (D810). Unlike the D7200, the D810 does not have an anti-aliasing filter, which would reduce the effective resolution by about 30%.

    The sensor is not the only thing that affects resolution; the lens matters as well. For a lens to have an insignificant effect on resolution, it would need 3x-4x the resolution dictated by the pixel spacing. Few lenses achieve that goal, and hence they degrade the theoretical resolution. If the lens and sensor had equal resolution, measured independently, the combined resolution would be half that value. Since the D7200 and D810 use essentially the same lenses, those lenses would perform better on a full frame camera.

    I think Nikon has picked up the pace on lens quality, along with price increases. My Nikon lenses date to 2001 or earlier. I can compare them directly to other lenses, since they fit on a Sony FF camera using simple adapters. Their performance is well below that of lenses designed specifically for Sony, which is why I no longer use them. Of the lenses I own there is one exception. The manual 55/2.8 Micro-Nikkor will actually resolve images at the pixel level on an A7Rii (42 mp, no AA filter).
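The "combined resolution would be half" claim above follows from a common reciprocal rule of thumb for cascading a lens and a sensor; a minimal sketch, assuming resolutions measured in line pairs per mm:

```python
def combined_resolution(lens_lpmm, sensor_lpmm):
    """System resolution under the reciprocal approximation:
    1/R_system = 1/R_lens + 1/R_sensor (a common rule of thumb)."""
    return 1.0 / (1.0 / lens_lpmm + 1.0 / sensor_lpmm)

# Equal lens and sensor resolution -> the system resolves half of either
print(combined_resolution(100, 100))

# A lens 4x better than the sensor still costs something,
# but the sensor now dominates the result
print(combined_resolution(400, 100))
```

Other models add the terms in quadrature (1/R^2 = 1/R_lens^2 + 1/R_sensor^2); which approximation fits best depends on the MTF curves involved.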
  3. Resolution is not related to pixel density, except when comparing within the same format. For instance, at the same density, a large sensor will have more resolution with the proportionally correct lens than a small one, because the total field will have more pixels. So as a stand-alone factor, pixel density doesn't mean much.

    What does depend on pixel density alone is noise, because smaller pixels (that is, pixels more densely placed in any format) have more noise in marginal situations than larger ones, regardless of format size. That's why a 16 MP full frame camera is better than a 16 MP Four Thirds camera. This mainly shows up at high ISO speeds, because the sensor is being pushed towards its limit.

    Whether the lens can do the job is a third factor, which also is unrelated to pixel density, in that this is not a sensor issue, it's a lens issue. Lens inadequacy is not the sensor's problem.
  4. Here's the situation. We have two perfectly sensible explanations which don't agree. Add to that, I think I remember a Tony Northrup video in which he claims small pixels don't inherently produce more noise, because the area occupied by one large pixel will gather the same light as two smaller ones in the same space.
  5. Pixel density is the number of pixels per square cm (or inch, or mm), and so is relative to the sensor size. Resolution is the total number of pixels available, and hence absolute. Sensor size times pixel density yields resolution.

    What the side effects of both are is a whole other discussion, and depends on many more variables, like technological advancements made in sensor development and production, the quality and design of the processing circuits, and quite a lot more (and lenses add even more variables). So, while the relation between pixel density and resolution is drop-dead easy, the relative effects are not, and in reality the effect of all this on real-world photos is pretty limited. But if you're into pixel-peeping, I guess it's massively interesting in some way.
  6. [I say this with tongue firmly planted in cheek relative to what fotolopithecus recently said in the "Likes" thread]

    What’s the difference again between photography and audiophilia?

  7. "I've been told that the greater the pixel density the greater the ability a sensor has to resolve fine detail." Ah, the myth of the power to resolve fine detail! Covering the same area of a scene, the more pixels, the more detail can be captured, and pixel density on the sensor has nothing to do with it.

    I had an exhaustive, ridiculous "discussion" on this subject with AstroImager aka Paul Lefevre on the popphoto forum. The forum is no longer active, but the past threads can still be looked at. You may or may not want to check it out, but be warned, your head may start to spin!
  8. Which argument ignores any sensitivity threshold issues, should they exist, which I suspect they do. Also, I doubt very much that two or four pixels can occupy the same area as one without any loss of total area to the spaces that divide them, especially with focusing microlenses and sensor depth cluttering up the picture.

    May I draw your attention to figure 4, here: How to Evaluate Camera Sensitivity

    Of course, I would still maintain that Ed's solution is about lenses as much as about sensors, which was not really part of the OP's question.
  9. audiophilia sounds rude :)
  10. I take your point, but in my case I'm interested in both, because when I'm not taking pictures I like to be thinking, or talking, about all things photographic. For an audiophile it's pretty much just thinking about components - preamps, amps, speakers, turntables - and in what arrangements of them to find the Holy Grail of true musicality. No actual artistic input of your own there.
  11. Unlike the D7200, the D810 does not have an anti-aliasing filter, which would reduce the effective resolution by about 30%.

    The D7200 has no AA filter. Also, where does the nicely round 30% come from???

    The OP might also research Thom Hogan's discussions of pixel density and resolution.
  12. I remember it, and although Paul was very knowledgeable, and helpful, after a time I wondered if he had it right on this. Nonetheless, let's not go down that road again.
  13. w-o-o-o-o-f-er!
    Last edited: Mar 4, 2018
  14. Oh Pith!
    Strictly speaking, "resolution" is measured per unit length or unit area, for example 100 pixels per inch, or 10,000 pixels per square inch. The total number of pixels in an image is really not the resolution, but I don't have a good term for it. In the TV industry they call it "definition".
    Using those terms, the D7200 has higher resolution but the D810 has higher definition.
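The resolution-versus-definition distinction can be put in numbers; the pixel counts and sensor widths below are published specs, taken here as assumptions:

```python
# D7200: 6000 x 4000 px on a 23.5 mm wide sensor
# D810:  7360 x 4912 px on a 35.9 mm wide sensor

def pixels_per_mm(pixels_wide, sensor_width_mm):
    """Linear pixel density along the sensor width."""
    return pixels_wide / sensor_width_mm

d7200_resolution = pixels_per_mm(6000, 23.5)  # "resolution" in this terminology
d810_resolution = pixels_per_mm(7360, 35.9)

d7200_definition = 6000 * 4000                # "definition": total pixel count
d810_definition = 7360 * 4912

print(d7200_resolution > d810_resolution)     # True: D7200 has higher density
print(d810_definition > d7200_definition)     # True: D810 has more total pixels
```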
  15. Actually, lens resolution is intimately connected to pixel density. The closer pixels are spaced, the more resolution is required from the lens.
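One way to quantify that connection is the Nyquist limit: a sensor with pixel pitch p cannot record line pairs finer than 1/(2p), so closer pixel spacing demands more from the lens. A sketch, with pitches approximated from the published specs:

```python
def nyquist_lp_per_mm(pitch_um):
    """Finest line-pair frequency (lp/mm) a sensor of the given pitch can record."""
    pitch_mm = pitch_um / 1000.0
    return 1.0 / (2.0 * pitch_mm)

# Approximate pitches: sensor width / horizontal pixel count
d7200_pitch_um = 23.5 / 6000 * 1000   # ~3.9 um
d810_pitch_um = 35.9 / 7360 * 1000    # ~4.9 um

# The denser D7200 asks for more lp/mm from the lens than the D810 does
print(nyquist_lp_per_mm(d7200_pitch_um) > nyquist_lp_per_mm(d810_pitch_um))  # True
```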
  16. It would be true if you compared apples to apples, i.e., stuck to one sensor size. There were and are manufacturers that put the same sensor base (= pixel density) into both APS-C and FF bodies. There the bigger sensor should have more resolution, with a different lens used than on the smaller camera. That's basically the same as in your Nikon example, where the difference between DX and FX is bigger than the one between the pixel densities.
    As far as I recall that video, he intended to point out room for tech progress, like why a Sony A7R(#) is able to outperform an EOS 5D (1st version), and why low-light capability doesn't have to mean huge pixels.
  17. People who don't understand something often seem to have this view; perhaps it's from not being able to recognize the people who do know. On the internet there are so many conflicting voices, how can one know which ones are right and which ones are blowing smoke?

    Regarding your real topic, if you asked me, in person, this question, the first thing I'd say is what do you mean by "resolution?" Do you mean the finest detail that can be resolved at the sensor, or the total amount of scene detail, etc. Or something else? Two significantly different things that sort of masquerade under the same name - until you start clarifying the definition.

    Even when you think you have nailed down the definition - say it is the finest detail that can be seen at the imager - there can be ambiguity. I was involved in testing the "resolving power" of a lens/film system back when I was a punk kid (perhaps my late twenties?). At the place where I worked we were following the ANSI standards of the day, using the good old 3-bar resolution targets. You photograph them, process the film, and evaluate under a microscope. You are looking to find the smallest set where 3 separate bars can be seen on the film. But... as they get tinier and the film image gets really gritty, there are 3 bars, but a blob connects two of them at one end... so are they really 3 bars, or only 2? The ANSI guidelines, as I recall, were that you must use your judgment, and if you think there is a greater than 50% likelihood that 3 bars are resolved, then you consider it so. If several people rate a film, you get different ratings. So it turns out to be a bit of a statistical rating.

    If you tried this same method on a digital camera, you'd probably find more ambiguity. Instead of finding the point where 3 bars are no longer quite resolved, you may find that 2, or perhaps 4 crisp bars show up. They seem quite clearly resolved. But also quite clearly, the original test chart has 3 bars so we seem to have spurious resolution, something of a little white lie. So different test methods are used for digital cameras. And when the test methods have to be different, clearly you are not testing for exactly the same thing, right?

    There is a lot more to be said on the general topic, but this is probably enough to show how there can be some ambiguity in the answers you get. I think that most of the dissent you get is a result of people coming into it from different viewpoints - another way of saying different definitions of the issues. I can go on and on, but I'm afraid everyone's eyes will glaze over; this is the reality of the modern photo.net.

    PS, a last comment: a couple of years ago I saw a film vs digital comparison where one of the test subjects was a landscape scene - off in the distance was a field dotted with red flowers, tiny in the image. The film and digital had broadly similar "resolution," meaning ability to see similar scene detail. But... in the digital image something like half the red flowers were missing. This is something I've known COULD happen, but thought it would have to be a contrived situation. It's basically a result of the somewhat sparse red sampling of a "Bayer filter-equipped" digital camera. Only 1/4 of the sensor pixels can actually "see" red (the rest of the red color is interpolated from the surrounding area). But when the red details are too small for the sensor to see a pattern, and the flowers happen to miss a red-sensitive pixel, well, they disappear. The film didn't have this problem, since it has full color sensing everywhere. Had there not been a film image, the testers might have missed this, probably thinking, "I recall there being more flowers, but I guess not." Just an artifact that can slip under the radar of standard resolution testing.
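The disappearing-flowers effect follows directly from the Bayer geometry: in a standard RGGB mosaic only one site in four samples red, so a red speck smaller than the mosaic period can land entirely on green and blue sites. A toy count:

```python
def bayer_color(row, col):
    """Color sampled at (row, col) in a standard RGGB Bayer pattern."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# Tally the color filters over a 100 x 100 patch of sensor
counts = {"R": 0, "G": 0, "B": 0}
for r in range(100):
    for c in range(100):
        counts[bayer_color(r, c)] += 1

total = 100 * 100
print(counts["R"] / total)  # 0.25: only a quarter of the sites see red
print(counts["G"] / total)  # 0.5:  green is sampled twice as often
```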
  18. Well, no doubt everyone can see now why it's hard to get a clear understanding of this, and it seems to be because there's really not a consensus on it. Some seem to be saying the same things... sort of. What we need is someone who designs camera sensors to explain this in a simple way. Here's a question I have. All things being equal, if you take a DX sensor of 16 MP, and an FX sensor of 16 MP (which I think would mean the FX sensor's pixels would have to be larger), which sensor would be able to resolve more fine detail? Now my intuitive answer would be that the smaller pixels would be able to resolve fine detail better, but maybe not.
  19. You're still not being exactly clear on what it means to "resolve more fine detail." I'm guessing that you mean as seen on the subject, but the setup is not clear. If you change lenses to match the framing, and if the lens is not the limiting factor, then both 16 megapixel sensors should resolve roughly the same amount of fine subject detail.

    If you say that the same lens is used at the same subject distance, and that the lens is not the limiting factor, then the smaller sensor should resolve more detail. (The tradeoff is that you are not covering the full subject anymore, compared to the larger sensor.)

    This is assuming that AA filters in each camera don't cause further image degradation.
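The same-lens, same-distance case can be put in numbers: the subject projects at the same physical size onto either sensor, so the detail captured scales with linear pixel density. The sensor figures below are published specs, and the 10 mm subject span is an arbitrary assumption:

```python
subject_span_mm = 10.0  # assumed width of the subject's projection on the sensor

# Pixels landing on the subject = projected span x linear pixel density
d7200_px_on_subject = subject_span_mm * (6000 / 23.5)
d810_px_on_subject = subject_span_mm * (7360 / 35.9)

# Same lens, same distance: the denser sensor puts more pixels on the subject,
# at the cost of covering a smaller total field
print(d7200_px_on_subject > d810_px_on_subject)  # True
```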
  20. Yes, framing being the same, and lens being the same, which I guess would make it a full frame lens. Assume the lens is perfectly the same on both cameras. I'm trying to get at what the sensors themselves can do irrespective of lenses.
