Who can explain the relationship.....



Between pixel density and resolution. I ask because I've been told that the greater the pixel density, the greater the ability a sensor has to resolve fine detail. That would appear on its face not to be true, since a Nikon D810 has more resolving power than a Nikon D7200, which has greater pixel density, and by a fair amount. If the D7200, for example, were scaled up to full frame, it would have roughly 50 MP density-wise. We need this issue resolved once and for all, because the effect of it appears to be that nobody knows nothing. ;)
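A quick sanity check of that 50 MP figure, using the commonly published sensor dimensions (the numbers here are assumptions, not from this thread):

```python
# Back-of-the-envelope check: what megapixel count would a full-frame sensor
# need to match the D7200's pixel density? Sensor dimensions are the commonly
# published ones (23.5 x 15.6 mm DX, 35.9 x 24.0 mm FX) and are assumptions.

def ff_equivalent_mp(mp, width_mm, height_mm, ff_width=35.9, ff_height=24.0):
    """Megapixels a full-frame sensor would need at this pixel density."""
    density = mp / (width_mm * height_mm)  # MP per square mm
    return density * ff_width * ff_height

print(round(ff_equivalent_mp(24.2, 23.5, 15.6), 1))  # D7200 -> 56.9
```

So scaling the D7200's density up to full frame lands near 57 MP, in the same ballpark as the rough figure above.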

Digital resolution is usually measured by the total number of pixels on a side, rather than pixels/inch. An image with the same field of view will require more enlargement for the same size print from an APS-C camera (e.g., D7200) than a full frame camera (D810). Unlike the D7200, the D810 does not have an anti-aliasing filter, which would reduce the effective resolution by about 30%.

 

The sensor is not the only thing that affects resolution; the lens matters too. For a lens to have an insignificant effect on resolution, it would need 3x-4x the resolution implied by the pixel pitch. Few lenses achieve that goal, hence they degrade the theoretical resolution. If the lens and sensor had equal resolution, measured independently, the combined resolution would be half that value. Since the D7200 and D810 use essentially the same lenses, those lenses would perform better on the full frame camera.
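The "half that value" claim follows from a common rule of thumb that lens and sensor resolutions combine reciprocally. A minimal sketch, with illustrative lp/mm figures:

```python
# A common rule of thumb: lens and sensor resolutions (in lp/mm) combine
# reciprocally, 1/R_system = 1/R_lens + 1/R_sensor. Figures are illustrative.

def system_resolution(r_lens, r_sensor):
    return 1.0 / (1.0 / r_lens + 1.0 / r_sensor)

print(round(system_resolution(100, 100), 1))  # equal lens and sensor -> 50.0 (half)
print(round(system_resolution(300, 100), 1))  # lens 3x the sensor -> 75.0 (25% loss)
```

Note that even a lens with 3x the sensor's resolution still costs the system a quarter of its theoretical resolution under this rule.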

 

I think Nikon has picked up the pace on lens quality, along with price increases. My Nikon lenses date to 2001 or earlier. I can compare them directly to other lenses, since they fit on a Sony FF camera using simple adapters. Their performance is well below that of lenses designed specifically for Sony, which is why I no longer use them. Of the lenses I own there is one exception. The manual 55/2.8 Micro-Nikkor will actually resolve images at the pixel level on an A7Rii (42 mp, no AA filter).


Resolution is not related to pixel density, except when comparing within the same format. For instance, at the same density, a large sensor paired with the proportionally correct lens will have more resolution than a small one, because the total field will contain more pixels. So as a stand-alone factor, pixel density doesn't mean much.

 

What does relate to pixel density alone is noise, because smaller pixels (that is, more densely packed, in any format) have more noise in marginal situations than larger ones, regardless of format size. That's why a 16 MP full frame camera is better than a 16 MP Four Thirds camera. This mainly shows up at high ISO speeds, because the sensor is being pressed toward its limit.
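A sketch of why larger pixels show less noise per pixel, assuming photon shot noise is the dominant source (the photon counts are illustrative, not measured camera data):

```python
import math

# Per-pixel SNR under photon shot noise alone is sqrt(N) for N photons
# collected, so a pixel with 4x the area (2x the pitch) doubles its SNR at
# the same exposure. Photon counts are illustrative.

def shot_noise_snr(photons):
    return math.sqrt(photons)

small = shot_noise_snr(1000)    # small pixel
large = shot_noise_snr(4000)    # pixel with 4x the collecting area
print(round(large / small, 2))  # -> 2.0
```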

 

Whether the lens can do the job is a third factor, which also is unrelated to pixel density, in that this is not a sensor issue, it's a lens issue. Lens inadequacy is not the sensor's problem.


Here's the situation: we have two perfectly sensible explanations that don't agree. Add to that, I think I remember a Tony Northrup video where he claims small pixels don't inherently produce more noise, because the area occupied by one large pixel will gather the same light as two smaller ones in the same space.

Pixel density is pixels per square cm (or inch, or mm), so it is relative to the sensor size. Resolution is the total number of pixels available, and hence absolute. Sensor size times pixel density yields resolution.
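That relation can be sketched directly; the density value below is an assumption, chosen to be roughly D7200-class, and the sensor dimensions are the commonly published ones:

```python
# The relation stated above: sensor area x pixel density = total pixels.
# The density figure is an assumption, roughly D7200-class.

def total_mp(area_mm2, density_mp_per_mm2):
    return area_mm2 * density_mp_per_mm2

density = 0.066  # MP per square mm
print(round(total_mp(23.5 * 15.6, density), 1))  # APS-C -> 24.2 MP
print(round(total_mp(35.9 * 24.0, density), 1))  # full frame, same density -> 56.9 MP
```

Same density, bigger sensor, more total pixels: density alone tells you nothing about the total.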

 

What the side effects of each are is a whole other discussion, and depends on many more variables: technological advances in sensor development and production, the quality and design of the processing circuits, and quite a lot more (and lenses add even more variables). So while the relation between pixel density and resolution is drop-dead easy, the relative effects are not, and in reality the impact of all this on real-world photos is pretty limited. But if you're into pixel-peeping, I guess it's massively interesting in some way.


"I've been told that the greater the pixel density the greater the ability a sensor has to resolve fine detail." Ah, the myth of the power to resolve fine detail! Covering the same area of a scene, the more pixels, the more detail can be captured; pixel density on the sensor has nothing to do with it.

 

I had an exhaustive, ridiculous "discussion" on this subject with AstroImager aka Paul Lefevre on the popphoto forum. The forum is no longer active, but the past threads can still be looked at. You may or may not want to check it out, but be warned, your head may start to spin!


. . . Tony Northrup video where he claims small pixels don't inherently produce more noise because the area occupied by one large pixel will gather the same light as two smaller ones in the same space.

 

That argument ignores any sensitivity-threshold issues, should they exist, which I suspect they do. Also, there's the problem that I doubt very much that two or four pixels in the same area as one large pixel can gather light without any loss of total area to the gaps that divide them, especially with focusing microlenses and sensor depth complicating the problem.
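The caveat can be made concrete. Summing two half-area pixels matches one large pixel under shot noise alone, but each small pixel contributes its own read noise, which adds in quadrature. A sketch with illustrative electron counts (not measured camera data):

```python
import math

# Sketch of the binning argument and its caveat. Two half-area pixels summed
# collect the same total light as one large pixel (same shot-noise SNR), but
# each contributes its own read noise. Electron counts are illustrative.

def snr(signal_e, read_noise_e, n_pixels=1):
    total_signal = signal_e * n_pixels
    shot_var = total_signal                  # shot noise variance equals signal
    read_var = n_pixels * read_noise_e ** 2  # read noise adds per pixel
    return total_signal / math.sqrt(shot_var + read_var)

print(round(snr(2000, 3.0), 1))              # one large pixel -> 44.6
print(round(snr(1000, 3.0, n_pixels=2), 1))  # two small pixels summed -> 44.5
```

With zero read noise the two cases are identical, which is essentially Northrup's point; real read noise (and fill-factor losses, not modeled here) is where the small-pixel penalty creeps in.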

 

May I draw your attention to figure 4, here: How to Evaluate Camera Sensitivity

 

Of course, I would still maintain that Ed's solution is about lenses as much as about sensors, which was not really part of the OP's question.


[i say this with tongue firmly planted in cheek relative to what fotolopithecus recently said in the "Likes" thread]

 

What’s the difference again between photography and audiophilia?

 

:eek:

 

I take your point, but in my case I'm interested in both, because when I'm not taking pictures I like to be thinking, or talking, about all things photographic. For an audiophile it's pretty much just thinking about components: preamps, amps, speakers, turntables, and in what arrangement of them to find the Holy Grail of true musicality. No actual artistic input of your own there.


Unlike the D7200, the D810 does not have an anti-aliasing filter, which would reduce the effective resolution by about 30%.

The D7200 has no AA filter. Also where does the nicely round 30% come from???

 

The OP might also research Thom Hogan's discussions of pixel density and resolution.


"I've been told that the greater the pixel density the greater the ability a sensor has to resolve fine detail." Ah, the myth of the power to resolve fine detail! Covering the same area of a scene, the more pixels, the more detail can be captured; pixel density on the sensor has nothing to do with it.

 

I had an exhaustive, ridiculous "discussion" on this subject with AstroImager aka Paul Lefevre on the popphoto forum. The forum is no longer active, but the past threads can still be looked at. You may or may not want to check it out, but be warned, your head may start to spin!

 

I remember it, and although Paul was very knowledgeable, and helpful, after a time I wondered if he had it right on this. Nonetheless, let's not go down that road again.


Oh Pith!

Strictly speaking, "resolution" is measured per unit of length or area, for example 100 pixels per inch, or 10,000 pixels per square inch. The total number of pixels in an image is really not the resolution, but I don't have a good term for it. In the TV industry they call it "definition".

Using those terms, the D7200 has higher resolution but the D810 has higher definition.
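In those terms, with published image widths and sensor widths (treated here as assumptions):

```python
# "Resolution" as pixels per unit length vs. "definition" as total pixel count,
# using published image widths and sensor widths (assumptions, not from the thread).

d7200_px, d7200_mm = 6000, 23.5  # D7200: 6000 px across a 23.5 mm sensor
d810_px, d810_mm = 7360, 35.9    # D810: 7360 px across a 35.9 mm sensor

print(round(d7200_px / d7200_mm))        # -> 255 px/mm: higher "resolution"
print(round(d810_px / d810_mm))          # -> 205 px/mm
print(d810_px * 4912 > d7200_px * 4000)  # -> True: higher "definition"
```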


Whether the lens can do the job is a third factor, which also is unrelated to pixel density, in that this is not a sensor issue, it's a lens issue. Lens inadequacy is not the sensor's problem.

Actually, lens resolution is intimately connected to pixel density. The closer pixels are spaced, the more resolution is required from the lens.
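One way to quantify that connection: the finest spatial frequency a sensor can sample is the Nyquist limit set by its pixel pitch. The pitches below are approximate published values (assumptions):

```python
# Nyquist limit from pixel pitch: a sensor can sample detail up to
# 1 / (2 * pitch) line pairs per mm, so finer pitch demands more of the lens.
# Pitches are approximate published values (assumptions).

def nyquist_lp_per_mm(pitch_um):
    return 1.0 / (2.0 * pitch_um / 1000.0)

print(round(nyquist_lp_per_mm(4.88), 1))  # D810, ~4.88 um pitch -> 102.5
print(round(nyquist_lp_per_mm(3.92), 1))  # D7200, ~3.92 um pitch -> 127.6
```

The denser D7200 sensor asks the lens for roughly 25% more lp/mm than the D810 does.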


I've been told that the greater the pixel density the greater the ability a sensor has to resolve fine detail

It would be true if you compared apples to apples, i.e. stuck to one sensor size. There were and are manufacturers that put the same sensor base (= pixel density) into APS-C and FF bodies too. There, the bigger sensor should have more resolution, with a different lens used than on the smaller camera. That's basically the same as in your Nikon example, where the difference between DX and FX is bigger than the one between the pixel densities.

I think I remember a Tony Northrup video where he claims small pixels don't inherently produce more noise because the area occupied by one large pixel will gather the same light as two smaller ones in the same space.
As far as I recall, he intended to point out room for technological progress, like why a Sony A7R(#) is able to outperform an EOS 5D (1st version) and why low-light capability doesn't have to mean huge pixels.

We need this issue resolved once, and for all, because the effect of it appears to be that nobody knows nothing.

 

People who don't understand something often seem to have this view; perhaps it's from not being able to recognize the people who do know. On the internet there are so many conflicting voices, how can one know which ones are right and which ones are blowing smoke?

 

Regarding your real topic, if you asked me, in person, this question, the first thing I'd say is what do you mean by "resolution?" Do you mean the finest detail that can be resolved at the sensor, or the total amount of scene detail, etc. Or something else? Two significantly different things that sort of masquerade under the same name - until you start clarifying the definition.

 

Even when you think you have nailed down the definition, say, the finest detail that can be seen at the imager, there can be ambiguity. I was involved in testing the "resolving power" of a lens/film system back when I was a punk kid (perhaps my late twenties?). At the place where I worked we followed the ANSI standards of the day, using the good old 3-bar resolution targets. You photograph them, process the film, and evaluate under a microscope, looking for the smallest set where 3 separate bars can be seen on the film. But as they get tinier and the film image gets really gritty, there are 3 bars, but a blob connects two of them at one end... so are they really 3 bars, or only 2? The ANSI guidelines, as I recall, were that you must use your judgment, and if you think there is greater than a 50% likelihood that 3 bars are resolved, then you consider it so. If several people rate a film, you get different ratings. So it turns out to be a bit of a statistical rating.

 

If you tried this same method on a digital camera, you'd probably find more ambiguity. Instead of finding the point where 3 bars are no longer quite resolved, you may find that 2, or perhaps 4 crisp bars show up. They seem quite clearly resolved. But also quite clearly, the original test chart has 3 bars so we seem to have spurious resolution, something of a little white lie. So different test methods are used for digital cameras. And when the test methods have to be different, clearly you are not testing for exactly the same thing, right?

 

There is a lot more to be said on the general topic, but this is probably enough to show how there can be some ambiguity in the answers you get. I think that most of the dissent you get is a result of people coming into it from different viewpoints - another way of saying different definitions of the issues. I can go on and on, but I'm afraid everyone's eyes will glaze over; this is the reality of the modern photo.net.

 

P.S., a last comment: a couple of years ago I saw a film vs. digital comparison where one of the test subjects was a landscape scene; off in the distance was a field dotted with red flowers, tiny in the image. The film and digital had broadly similar "resolution," meaning ability to see similar scene detail. But in the digital image something like half the red flowers were missing. This is something I've known COULD happen, but thought it would have to be a contrived situation. It's basically a result of the somewhat sparse red sampling of a Bayer-filter-equipped digital camera. Only 1/4 of the sensor pixels can actually "see" red (the rest of the red color is interpolated from the surrounding area). But when the red details are too small for the sensor to see a pattern, and the flowers happen to miss a red-sensitive pixel, well, they disappear. The film didn't have this problem, since it has full color sensing everywhere. Had there not been a film image, the testers might have missed this, probably thinking, "I recall there being more flowers, but I guess not." Just an artifact that can slip under the radar of standard resolution testing.
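A toy sketch of that Bayer sampling geometry; it is a deliberate simplification (a 1-pixel pure-red dot, no demosaicing), purely to illustrate why such detail can vanish:

```python
# Toy illustration of the "missing red flowers" artifact. On a Bayer RGGB
# mosaic only one photosite in four samples red; a red detail one sensor-pixel
# wide that lands on a non-red site leaves no red signal to interpolate from.

def bayer_color(row, col):
    """RGGB pattern: which color does this photosite sample?"""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

def red_recorded(row, col):
    """Does a 1-pixel pure-red dot at (row, col) register in the red channel?"""
    return bayer_color(row, col) == "R"

hits = sum(red_recorded(r, c) for r in range(100) for c in range(100))
print(hits, "of 10000 positions land on a red-sampling site")  # -> 2500
```

Three out of four possible positions for the dot record nothing in red, matching the 1/4 red sampling described above.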


Well, no doubt everyone can see now why it's hard to get a clear understanding of this, and it seems to be because there's really no consensus on it. Some seem to be saying the same things... sort of. What we need is someone who designs camera sensors to explain this in a simple way. Here's a question I have: all things being equal, if you take a DX sensor of 16 MP and an FX sensor of 16 MP (which I think would mean the FX sensor's pixels would have to be larger), which sensor would be able to resolve more fine detail? Now my intuitive answer would be that the smaller pixels would be able to resolve fine detail better, but maybe not.

All things being equal, if you take a DX sensor of 16 MP and an FX sensor of 16 MP (which I think would mean the FX sensor's pixels would have to be larger), which sensor would be able to resolve more fine detail?

 

You're still not being exactly clear on exactly what it means to "resolve more fine detail." I'm guessing that you mean as seen on the subject, but the setup is not clear. If you change lenses to match the framing, and if the lens is not the limiting factor, then both 16 megapixel sensors should resolve roughly the same amount of fine subject detail.

 

If you say that the same lens is used at the same subject distance, and that the lens is not the limiting factor, then the smaller sensor should resolve more detail. (The tradeoff is that you are not covering the full subject anymore, compared to the larger sensor.)

 

This is assuming that AA filters in each camera don't cause further image degradation.


Yes, framing being the same, and lens being the same, which I guess would make it a full frame lens.

 

But if the lens is the same, meaning the same physical lens at same focal length, how do you get the same framing with different size sensors?

 

In some cases you can change the distance, but if everything is not on a flat plane then the playing field is no longer level.

 

IF - your subject is a flat plane (perpendicular to the lens axis) then you can change distance to match the framing, and expect both 16 megapixel sensors to record the same amount of subject detail. Assuming, as before, that neither the lens nor the AA filter is the limiting factor.

 

I mention the lens limitation to rule out the use of a very small physical aperture, in which case the smaller sensor's image would be degraded more.
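The small-aperture point can be put in numbers: the diffraction (Airy) spot has a fixed physical size at the sensor, so it blurs more pixels the finer the pitch. A sketch with approximate pixel pitches (assumptions):

```python
# Why a tiny aperture hurts a denser sensor more: the diffraction (Airy) spot
# diameter ~ 2.44 * wavelength * f-number is fixed in physical size at the
# sensor, so it spans more pixels when the pitch is finer. Values illustrative.

def airy_diameter_um(f_number, wavelength_nm=550):
    return 2.44 * (wavelength_nm / 1000.0) * f_number

d = airy_diameter_um(22)  # ~29.5 um at f/22, green light
print(d / 4.88)  # spans ~6 pixels at a D810-like pitch
print(d / 3.92)  # spans ~7.5 pixels at a D7200-like pitch
```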


The "input" side of this conversation might be (a bit of) a red herring. I know, garbage in, garbage out. Still, it might be better to look at resolution from the point of view of output. I know photographers who use digital Leicas with the best lenses in the world, and their only form of output is JPEGs on Instagram. Seems like overkill, but if you can afford it and don't care, that's cool too. And isn't digital printing at labs limited to 260 dpi, anyway?

 

Back on the input side, we now have lenses that can out-resolve sensors and vice versa. But unless you're making very fine prints, how much does it even matter? Going past that, ISO is going to have some impact on detail for a lot of reasons including noise and dynamic range.

 

I don't mean to sound dismissive of a very interesting subject but isn't photography about making pictures? It's fascinating to think about what's going on under the hood but I suspect most of us are staring at Porsche engines trying to understand why they are better performers than VW beetle engines, which share a similar design.

 

Just my two cents.


But if the lens is the same, meaning the same physical lens at same focal length, how do you get the same framing with different size sensors?

 

In some cases you can change the distance, but if everything is not on a flat plane then the playing field is no longer level.

 

IF - your subject is a flat plane (perpendicular to the lens axis) then you can change distance to match the framing, and expect both 16 megapixel sensors to record the same amount of subject detail. Assuming, as before, that neither the lens nor the AA filter is the limiting factor.

 

I mention the lens limitation to rule out the use of a very small physical aperture, in which case the smaller sensor's image would be degraded more.

 

You move the camera positions until the framing is the same; that answers the first question. You're complicating things beyond what I'm asking. Let's say everything is on the same plane, no AA filter involved; we're talking all things being equal. It's the sensors I'm asking about, not lenses or anything else, but what the sensors themselves are capable of when put in the same situation: resolving fine detail like hair, or the lines that make up what appears to be solid black in the area around George Washington's head on the dollar bill, etc.


You move the camera positions until the framing is the same; that answers the first question. You're complicating things beyond what I'm asking.

 

No, no, no. You keep throwing complications into the mix, like requiring the same focal-length lens, so the angle of view has to change. I'm putting in all the conditions so that you can't say, "Well, I shot a landscape at f/22, then backed up a half mile to keep the framing the same," etc. (In that case, the larger sensor will win.)

 

When you keep the playing field level, same subject framing and image quality not limited by the lens, then the two 16 megapixel sensors should both produce the same image detail.

 

Just think of the scene being broken into 16 million little dots, in one case the dots are shrunk into a smaller space. How can they be different?

  • Like 1
Link to comment
Share on other sites
