Nikon D850 Monochrome

Discussion in 'Nikon' started by kevin_beretta, Oct 16, 2020.

  1. Yes... you explain to us how a Bayer array equipped sensor can resolve detail right up to the limit posed by its photosite spacing, please.
     
    mag_miksch likes this.
  2. I'll just show you; shall I?

    This resolution chart was shot at half the required magnification for its lppmm figures to be correct - i.e. at 1:100 instead of 1:50.
    DSC_1859.jpg
    The camera was a Nikon D7200 with a photosite spacing of 4 microns (24mm/6000), giving a theoretical resolution limit of 125 lppmm.
    And here's a 100% crop from the above:
    100%-crop.jpg
    There was a slight skew on the chart that explains the artefacts, but you can clearly count the lines (5 of them), which is the requirement for a resolution figure.

    Strangely, the resolution seems dependent on the RAW processing, because ACR didn't manage as well as Capture One.
    100%-ACR-processed.jpg
    This makes it look as if 140 lppmm are resolved, but only 4 lines are apparent, while ACR has added more artefacts to the 125 lppmm set.

    In both cases 110 lppmm is cleanly resolved, despite the lack of orthogonality.

    Anyway, with the right processing the Bayer-filtered sensor can capture right up to its Nyquist limit, and how could that possibly be improved on?
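    For anyone who wants to check the arithmetic, it really is this simple (figures exactly as quoted above, nothing camera-specific beyond them):

    ```python
    # Photosite pitch and Nyquist limit from the figures quoted above.
    sensor_width_mm = 24.0
    pixels_across = 6000

    pitch_mm = sensor_width_mm / pixels_across      # 0.004 mm = 4 microns
    nyquist_lppmm = 1.0 / (2.0 * pitch_mm)          # 125 line pairs per mm

    print(f"pitch: {pitch_mm * 1000:.1f} um, Nyquist limit: {nyquist_lppmm:.0f} lp/mm")
    ```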
     
  3. No. Explain.
    And remember the limit imposed by the photosite frequency.
     
    mag_miksch likes this.
  4. Well, you can still buy B&W film. Imagine that.
     
  5. Does film have a level frequency/wavelength tonal response?

    I'd never really thought about it before, just used filters to change colour > tone contrast.
     
  6. No. The spectral response is in the data sheets - usually.
    Oh, just go away and do your own research, q.g.
    It works. What more do you need explained to you?

    The photosite spacing was given in my previous post if you'd taken the trouble to actually read it!
     
    andylynn likes this.
  7. A Bayer filter reduces the sample frequency to less than the photosite spacing allows. Remove the Bayer and there will be a difference. Explain your assertions, instead of telling us to go away.
     
    mag_miksch likes this.
  8. Peace, my friends :)
    It was a mere figment of my imagination that I would pursue this with vigor. That said, I am a terrible pixel peeper, and the moment I used Capture One, I knew photography would never be the same (bye-bye Adobe ACR and Lightroom). Rodeo Joe showed this here as well. Still, the Max Max conversion would provide more resolution versus stock, hence the request from an Australian observatory group to have a D850 converted; that is what got Max Max to create the first D850M.

    I'll do a bit more reading, but sadly the authority who has done the most work on this, DigiLloyd, keeps his site paywalled.
    The curiosity continues. The next step is to see what an actual RAW (NEF) file really looks like, as it seems we all get fooled, even by IrfanView, though maybe not by what RawTherapee shows. More pixels and resolution, please. I may pursue the Max Max thing .. I'll keep you posted.
     
    mike_halliwell likes this.
  9. And in the face of hard evidence to the contrary, the idiocy continues!

    A colour pixel, despite popular and uninformed belief, is not created simply by smudging together four adjacent photosites.

    Each pixel is the result of complex and sophisticated processing algorithms.

    So, yes, it's entirely possible to get resolution right up to (half) the sampling frequency set by the pixel spacing, as I thought was clearly illustrated above.
    But that's a bit different from walking around taking a few B&W snaps!
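    To be clear about "complex and sophisticated": even the simplest ('bilinear') de-mosaic interpolates each output pixel from a neighbourhood of photosites rather than lumping a 2x2 block together. A toy sketch, purely illustrative and not what any real converter ships:

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def bilinear_demosaic(raw):
        """Toy bilinear de-mosaic of an RGGB mosaic (2-D float array).
        The two missing colours at each photosite are interpolated from
        neighbouring photosites of that colour - not from a 2x2 block."""
        h, w = raw.shape
        # Which photosites sit under which colour filter (RGGB tiling).
        r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
        b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
        g_mask = 1.0 - r_mask - b_mask

        # Averaging kernels for the missing samples.
        k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
        k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

        rgb = np.empty((h, w, 3))
        rgb[..., 0] = convolve(raw * r_mask, k_rb)
        rgb[..., 1] = convolve(raw * g_mask, k_g)
        rgb[..., 2] = convolve(raw * b_mask, k_rb)
        return rgb

    if __name__ == "__main__":
        demo = bilinear_demosaic(np.random.default_rng(0).random((8, 8)))
        print(demo.shape)   # (8, 8, 3): a full colour triplet per photosite location
    ```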
     
    Last edited: Oct 19, 2020
    mike_halliwell likes this.
  10. Yes indeedy!

    The mechanism they use to remove those pesky layers so precisely is, I guess, Ion Milling.
     
  11. The requirement to have the CFA removed for astronomy is more about having a flatter spectral response than greater resolution, I suspect.

    If, for example, you're fitting a 0.15nm bandwidth Hydrogen-alpha filter, you don't want a bunch of additional random RGB filters messing up your carefully sculpted spectral response.

    FWIW, I'm not sure how evenly an Ion-beam miller would work on organic material like filter dyes and micro-lenses.
     
  12. A belated thought.
    How are the files from modified monochrome cameras processed?

    I presume standard raw processing software doesn't work? So how do you make the most of 14 bit depth and dynamic range?

    Like I said, I prefer to keep the full built-in palette of colour filter options available during monochrome conversion. So I would never consider having the CFA removed. Just curious about post-processing a filterless file.
     
  13. Differently!

    People have written RAW converters that skip the Bayer de-mosaic processing entirely, essentially assigning a luminance value to each pixel.

    I'll go find the details!

    What I don't know is whether true mono sensors have micro-lenses.

    All these mods remove everything ...:)
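    The gist, as far as I understand it, is just "skip the de-mosaic step": subtract the black level, scale, and treat every photosite value as a luminance pixel. A rough sketch with synthetic, made-up numbers (people seem to dump the undemosaiced data with something like dcraw's document mode, dcraw -D -4 -T, but don't quote me on that):

    ```python
    import numpy as np

    # Pretend 'raw' is the undemosaiced photosite data; here it's synthetic
    # 14-bit-ish values so the sketch runs on its own.
    rng = np.random.default_rng(0)
    raw = rng.integers(600, 16384, size=(8, 12)).astype(np.float64)

    # Made-up black/white levels - the real ones live in the raw file's metadata.
    black, white = 600.0, 16383.0
    lin = np.clip((raw - black) / (white - black), 0.0, 1.0)

    # No de-mosaic: each photosite becomes a luminance pixel directly.
    # Apply a display gamma and scale to 16 bits for output.
    out = (lin ** (1.0 / 2.2) * 65535.0).astype(np.uint16)
    print(out.shape, out.min(), out.max())
    ```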
     
    rodeo_joe|1 likes this.
  14. Here you go. I suspect some of it means more to you than me...:D

    Processing

    More Tech

    Debayer Study
     
    Last edited: Oct 24, 2020
  15. From MaxMax site:
    "Debayering is a process where the software looks at each pixel and the surrounding pixels and makes a decision about what each pixel's RGB value should be."

    Well, that's technically incorrect for a start.
    Pixels don't exist until they're created from photosite data. Especially not in full colour.

    With a de-filtered monochrome sensor the translation from photosite to pixel is pretty straightforward, but in colour a pixel is definitely not interchangeable with a photosite. Not in the slightest.

    Semantics? Maybe so, but if a specialist company can't even get their terminology right, what else are they lax about?

    As for their (ancient) Canon 30D 'demonstration' - I call complete BS on that. Of course a camera with a low-pass AA filter is going to look sharper if you remove that filter.
     
    Last edited: Oct 24, 2020
  16. In addition, removing the microlens array will affect the 'fill factor' and lower the photon efficiency of the sensor. Therefore any gain in sensitivity from removing the filter array may, at least partially, be counteracted by the reduced efficiency.

    It's not a clear win-win situation at all.
     
  17. How does a de-bayered 'photosite' determine luminosity?

    Energy-wise, two blue photons carry about as much energy as three red ones...
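    Quick back-of-the-envelope (E = hc/lambda; wavelengths picked purely for illustration, 450 nm blue vs 650 nm red):

    ```python
    # Photon energies for illustrative blue and red wavelengths.
    h, c = 6.626e-34, 2.998e8       # Planck constant (J*s), speed of light (m/s)
    ev = 1.602e-19                  # joules per electron-volt

    def photon_ev(wavelength_nm):
        return h * c / (wavelength_nm * 1e-9) / ev

    blue, red = photon_ev(450), photon_ev(650)
    print(f"blue: {blue:.2f} eV, red: {red:.2f} eV")            # ~2.76 vs ~1.91
    print(f"2 blue = {2*blue:.2f} eV, 3 red = {3*red:.2f} eV")  # ~5.5 vs ~5.7
    ```

    So the two bundles land within a few percent of each other, which is exactly why I'm asking whether the photosite responds to energy or just counts photons.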
     
  18. I always thought that the IR filter was separate from the color filter, but I don't really know the details of how they build them.
    I am not sure where you'd find one, but an external IR-blocking filter would be convenient.

    There are also companies that will take the IR filter off, to allow for IR photography like with IR film.
     
  19. Absorbed photons create an electron-hole pair. Higher energy photons create more energetic electrons and holes, but the carriers lose that excess energy pretty fast.
    The current, then, is proportional to the number of photons absorbed.

    This is a limitation on the efficiency of solar cells: all those blue photons only deliver about half their energy as photocurrent.
    There is a fix in the case of solar cells, which is to put a higher band-gap cell in front to absorb, and extract energy from, the higher energy photons, then let the rest through to a silicon cell at the back. A lot more expensive, though, so it isn't usually done. It might even be done with three layers.


    The absorption mostly depends on the band structure of silicon.
    A web search gives plenty of pages with graphs on them.
    A slight complication is that silicon has an indirect band gap, but visible photons have enough energy to be absorbed in any case.

    As far as I know, the overall (photon, not spectral) sensitivity differs from pixel to pixel, so there needs to be a correction for that. There is, somewhere, a look-up table to correct for it.
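    That correction is essentially a flat-field: measure each photosite's relative sensitivity once, then divide it out of every frame. A minimal sketch with made-up numbers:

    ```python
    import numpy as np

    # Per-photosite sensitivity correction (flat-field style), toy numbers only.
    rng = np.random.default_rng(0)
    true_scene = rng.uniform(0.2, 0.8, size=(4, 6))
    gain_map = rng.normal(1.0, 0.03, size=(4, 6))   # each photosite's relative sensitivity

    raw = true_scene * gain_map          # what the sensor actually reports
    corrected = raw / gain_map           # divide out the stored per-pixel gain

    print(np.max(np.abs(corrected - true_scene)))   # ~0: variation removed
    ```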
     
  20. Some time ago, there was a question about which camera you would choose if you could have any one for free.

    I chose the Monochrom, as I knew that there is no way that I would ever decide that I could afford one, for the use I would get from it.

    The D850 does have plenty of resolution, even with any loss due to the filter array.
     
