
Nikon D850 Monochrome


kevin_beretta


Does film have a level frequency/wavelength tonal response?

No. The spectral response is in the data sheets - usually.

No. Explain.

And remember the limit imposed by the photosite sampling frequency.

Oh just go away and do your own research q.g.

It works. What more do you need explained to you?

 

The photosite spacing was given in my previous post if you'd taken the trouble to actually read it!


A Bayer filter reduces the sample frequency to less than the photosite spacing allows. Remove the Bayer and there will be a difference. Explain your assertions, instead of telling us to go away.
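To put rough numbers on the sampling argument, here is a minimal Python sketch. The 4.35 µm photosite pitch is an assumed D850-like figure, not a quoted spec:

```python
# Per-channel sample pitch on a Bayer mosaic vs. a mono sensor.
# The 4.35 um photosite pitch is an assumed D850-like value.
pitch_um = 4.35

# On a Bayer mosaic, red and blue are sampled at every 2nd photosite
# in each direction; green sites form a diagonal grid at sqrt(2) x pitch.
red_blue_pitch = 2.0 * pitch_um
green_pitch = 2.0 ** 0.5 * pitch_um
mono_pitch = pitch_um  # CFA removed: every photosite samples luminance

def nyquist_lppmm(pitch_um):
    """Nyquist limit in line pairs per mm for a given sample pitch."""
    return 1000.0 / (2.0 * pitch_um)

for name, p in [("mono", mono_pitch), ("green", green_pitch),
                ("red/blue", red_blue_pitch)]:
    print(f"{name:9s} {nyquist_lppmm(p):6.1f} lp/mm")
```

On these assumed numbers the mono sensor's Nyquist limit is exactly twice that of the red and blue channels, which is the gap the Bayer filter opens up before the demosaic algorithm tries to close it.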


Peace, my friends :)

It was a mere figment of my imagination that I would pursue this with vigour. Although I am a terrible pixel peeper, and the moment I used Capture One I knew photography would never be the same (bye bye Adobe ACR and Lightroom). Rodeo Joe showed this here as well. Still, the Max Max conversion should provide more resolution than stock; it was a request from an Australian observatory group to have a D850 converted that got Max Max to create the first D850M.

 

I'll do a bit more reading, but sadly the authority who has done the most work on this is DigiLloyd, and that site is paywalled.

The curiosity continues. The next step is to see what an actual raw (NEF) file really looks like, as it seems we all get fooled, even by IrfanView, though maybe not by what RawTherapee shows. More pixels and resolution, please. I may pursue the Max Max thing .. I'll keep you posted.


A Bayer filter reduces the sample frequency to less than the photosite spacing allows.

And in the face of hard evidence to the contrary, the idiocy continues!

 

A colour pixel, despite popular and uninformed belief, is not created simply by smudging together four adjacent photosites.

 

Each pixel is the result of complex and sophisticated processing algorithms.

 

So, yes, it's entirely possible to get resolution right up to the Nyquist limit of half the pixel spacing. As I thought was clearly illustrated above.

hence the request from an Australian observatory group to have a D850 converted, that is what got Max Max to create the first D850M.

But that's a bit different from walking around taking a few B&W snaps!

Edited by rodeo_joe|1

The requirement to have the CFA removed for astronomy is more about having a flatter spectral response than about greater resolution, I suspect.

 

If, for example, you're fitting a 0.15nm bandwidth Hydrogen-alpha filter, you don't want a bunch of additional random RGB filters messing up your carefully sculpted spectral response.

 

FWIW, I'm not sure how evenly an ion-beam miller would work on organic material like filter dyes and micro-lenses.


A belated thought.

How are the files from modified monochrome cameras processed?

 

I presume standard raw processing software doesn't work? So how do you make the most of the 14-bit depth and dynamic range?

 

Like I said, I prefer to leave the built-in full palette of colour filter options available during monochrome conversion. So I would never consider having the CFA removed. Just curious about post-processing a filterless file.


How are the files from modified monochrome cameras processed

Differently!

 

People have written raw converters that skip the Bayer mosaic and de-mosaic processing, essentially assigning a luminance value to each pixel directly.

 

I'll go find the details!

 

What I don't know is whether true mono sensors have micro-lenses.

 

All these mods remove everything ...:)
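A minimal sketch of the kind of converter described above, which skips de-mosaicing and treats each photosite as a luminance sample. Numpy only; the 14-bit black and white levels are illustrative assumptions, not any real camera's calibration:

```python
import numpy as np

def mono_raw_to_image(raw, black_level=600, white_level=16383):
    """Grayscale conversion for a filterless (mono) sensor.

    No Bayer de-mosaic step: each photosite already holds a luminance
    sample, so we only subtract the black level and normalise.
    The 14-bit levels here are illustrative, not a real camera's.
    """
    x = raw.astype(np.float64) - black_level
    x /= float(white_level - black_level)
    return np.clip(x, 0.0, 1.0)

# A synthetic 14-bit frame standing in for real NEF sensel data.
rng = np.random.default_rng(0)
raw = rng.integers(600, 16384, size=(4, 6), dtype=np.uint16)
img = mono_raw_to_image(raw)
```

The point of the sketch is what is absent: there is no neighbourhood interpolation at all, so every output value comes from exactly one photosite.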


From MaxMax site:

"Debayering is a process where the software looks at each pixel and the surrounding pixels and makes a decision about what each pixel's RGB value should be."

 

Well, that's technically incorrect for a start.

Pixels don't exist until they're created from photosite data. Especially not in full colour.

 

With a de-filtered monochrome sensor the translation from photosite to pixel is pretty straightforward, but in colour a pixel is definitely not interchangeable with a photosite. Not in the slightest.

 

Semantics? Maybe so, but if a specialist company can't even get their terminology right, what else are they lax about?

 

As for their (ancient) Canon 30D 'demonstration' - I call complete BS on that. Of course a camera with a low-pass AA filter is going to look sharper if you remove that filter.

Edited by rodeo_joe|1

In addition, removing the microlens array will affect the 'fill factor' and lower the photon efficiency of the sensor. Therefore any gain in sensitivity got by removing the filter array may, at least partially, be counteracted by the reduced efficiency.

 

It's not a clear win-win situation at all.


Silicon photodiodes are quite sensitive to red and near-infrared, so a sensor stripped of the CFA would likely have a much stronger sensitivity to red and NIR light than the human eye. (Sensitivity to UV is also higher, but typical lenses don't transmit much UV, so maybe that is not as big a problem.) This should then be corrected with an appropriate filter to come up with a pleasing B&W rendering. I would assume that in the Leica Monochrom, Leica put an appropriate filter in place of the CFA?

 

I always thought that the IR filter was separate from the color filter, but I don't know that I really know the details of how they build them.

I am not sure where you find one, but an external IR block filter would be convenient.

 

There are also companies that will take the IR filter off, to allow for IR photography like from IR film.

-- glen


In a mono sensor, is a pixel's luminance derived from the number of photons or from the 'energy' of those photons?

 

i.e. a blue photon has more inherent energy than a red one... so is it 'brighter'?

 

What is never divulged is the actual sensor response either before or after you rip off the CFA etc.

 

So, just like in the first days of panchromatic B/W film, who knows what tone which colour comes out at?!!

 

Absorbed photons create an electron-hole pair. Higher energy photons will create higher electron and hole energies, but they lose that pretty fast.

The current, then, is proportional to the number of photons absorbed.

 

This is a limitation on the efficiency of solar cells. All those blue photons only supply about half their energy in photocurrent.

There is a fix in the case of solar cells, which is to put a higher band gap cell in front, which will absorb, and extract energy from, the higher energy photons. Then let the rest through to a silicon cell in the back. A lot more expensive, though, so it isn't usually done. It might even be done with three layers.
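The photon-counting point can be checked with the textbook relation E = hc/λ; this sketch just evaluates it for nominal blue and red wavelengths:

```python
# E = h*c / wavelength: a blue photon carries more energy than a red
# one, but each absorbed photon yields one electron-hole pair, so the
# photocurrent counts photons rather than summing their energy.
PLANCK = 6.626e-34      # J*s
LIGHT_SPEED = 2.998e8   # m/s
EV = 1.602e-19          # J per electron-volt

def photon_energy_ev(wavelength_nm):
    return PLANCK * LIGHT_SPEED / (wavelength_nm * 1e-9) / EV

blue = photon_energy_ev(450.0)  # ~2.76 eV
red = photon_energy_ev(650.0)   # ~1.91 eV
print(f"blue: {blue:.2f} eV, red: {red:.2f} eV, ratio: {blue / red:.2f}")
```

The blue photon carries about 1.44 times the energy of the red one, yet contributes the same single electron to the signal, which is exactly the solar-cell loss described above.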

 

 

The absorption mostly depends on the band structure of silicon.

A web search gives plenty of pages with graphs on them.

A slight complication is that silicon has an indirect band gap, but visible photons have enough energy to bridge it in any case.

 

As far as I know, the overall (photon, not spectral) sensitivity is different for different pixels, so there needs to be a correction for that. There is, somewhere, a look-up table to correct for it.

-- glen
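The per-pixel sensitivity correction described above is essentially flat-field calibration: divide each frame by a normalised calibration frame shot against a uniformly lit target. A synthetic numpy sketch (the 2% per-pixel gain spread is an invented figure):

```python
import numpy as np

# Flat-field correction: divide each frame by a normalised calibration
# frame shot against a uniformly lit target. The 2% per-pixel gain
# spread here is invented purely for the demonstration.
rng = np.random.default_rng(1)
gain = rng.normal(1.0, 0.02, size=(4, 4))   # per-pixel sensitivity
scene = np.full((4, 4), 1000.0)             # uniform "true" scene

raw = scene * gain              # what the sensor actually records
flat = gain / gain.mean()       # normalised flat-field calibration
corrected = raw / flat          # per-pixel variation cancelled out
```

After correction every pixel of the uniform scene reads the same value, which is the job the look-up table performs in-camera.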


I bet that generates some amazing results under the right circumstances, but $5700 for a monochrome DSLR that needs software tricks to use the files seems like a big, big commitment. Are the advantages over doing a bw conversion of regular raw files enough to be worth it? A regular D850 generates a heck of a lot of image data to work with, and so does a Z7. Adorama is selling refurb Z7s for $2050.

 

Some time ago, there was a question about which camera you would choose if you could have any one for free.

 

I chose the Monochrom, as I knew that there is no way that I would ever decide that I could afford one, for the use I would get from it.

 

The D850 does have plenty of resolution, even with any loss due to the filter array.

-- glen


I always thought that the IR filter was separate from the color filter

Yup, you are correct. The easily removable filter stack usually contains the UV/IR blocker, an AA filter (if present) and the ultrasonic vibration sensor-cleaning mechanism.

 

They are usually spaced away from the sensor surface by a very thin O-ring... and can be lifted away after removing a screwed-on clamp.

 

What you are then left with is a glass cover epoxy-bonded over the microlens and CFA layer. Most people stop there...:D

 

Then, if you are very brave (?)... you do this..:eek:

 

Scratching the Color Filter Array Layer Off a DSLR Sensor for Sharper B&W Photos


Differently!

 

People have written raw converters that skip the Bayer mosaic and de-mosaic processing, essentially assigning a luminance value to each pixel directly.

 

I'll go find the details!.

 

What I don't know is whether true mono sensors have micro-lenses.

 

All these mods remove everything ...:)

The M Monochrom has an offset microlens array, which roughly doubles the efficiency of collecting light. My original monochrome camera, a Kodak DCS200ir, came out before the use of microlens arrays.

 

Converted cameras lose their microlens array, so you give up about as much efficiency as you gain by removing the colour filter array.

 

Nikon uses the same 16 MP full-frame CMOS sensor in their microscope camera as in the Df and D4. Look for one on the used market and get someone to swap sensors for you. Nikon made a prototype DF-M but did not bring it to market.

 

I apply a minimum of processing to individual files from the M Monochrom. I wrote my own software to rewrite new DNG files with an added gamma curve and to convert the pixel values to 16 bits.
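A sketch of that sort of step; the 2.2 gamma value and the plain 14-to-16-bit rescale are my assumptions, not the poster's actual pipeline:

```python
import numpy as np

def encode_16bit(raw14, gamma=2.2):
    """Gamma-encode 14-bit mono data and rescale it to 16 bits.

    A sketch of the kind of step described above; the 2.2 gamma and
    the plain 14-to-16-bit rescale are assumptions, not the poster's
    actual pipeline.
    """
    x = raw14.astype(np.float64) / 16383.0        # 14-bit full scale
    x = np.clip(x, 0.0, 1.0) ** (1.0 / gamma)     # gamma curve
    return np.round(x * 65535.0).astype(np.uint16)

out = encode_16bit(np.array([0, 8191, 16383], dtype=np.uint16))
```

The gamma curve spends the extra two bits where the eye needs them, lifting shadow values before quantising back to integers.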


I always thought that the IR filter was separate from the color filter, but I don't know that I really know the details on how they build them.

I am not sure where you find one, but an external IR block filter would be convenient.

 

There are also companies that will take the IR filter off, to allow for IR photography like from IR film.

The BG55 cover glass absorbs 99.9% of the IR.

 

IR cut filters, i.e. "hot mirror" filters, are reflective. IR-absorbing filters are available, but tend to alter colour in the visible spectrum as light makes its way through; a colour camera would pick up a green tint. The cover glass over the sensor and the dyes in the colour filter array are essentially matched to produce the colours that are captured.


The BG55 cover glass absorbs 99.9% of the IR.

 

IR cut filters, i.e. "hot mirror" filters, are reflective. IR-absorbing filters are available, but tend to alter colour in the visible spectrum as light makes its way through; a colour camera would pick up a green tint. The cover glass over the sensor and the dyes in the colour filter array are essentially matched to produce the colours that are captured.

 

I think I intentionally didn't say "absorb" when I mentioned the IR filter, but I might not remember if I actually thought about it at the time.


-- glen


The D850 uses a BSI (back-side <snigger> illuminated) sensor, I believe.

 

I wonder if the geometry of this allows easier 'scraping' or makes it harder?

 

And I would have thought that removing the CFA with an organic solvent would be easier and safer.

 

Also, if the subject sits still long enough, you can get the full photosite resolution, and in full colour, from Sony's multi-exposure pixel-shift technology.

 

My a7riv can resolve a theoretical 260 lppmm. Now if only I could find where I put that aberration-free f/2 lens...

Edited by rodeo_joe|1
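The pixel-shift claim can be illustrated with a toy model: four exposures, each with the sensor shifted by one photosite, let every site sample all four RGGB phases, so full colour is recovered with no demosaic interpolation at all. The static, noise-free scene and all names here are illustrative, not Sony's actual algorithm:

```python
import numpy as np

H, W = 4, 4
rng = np.random.default_rng(2)
scene = rng.random((H, W, 3))        # "true" RGB at every photosite

# RGGB Bayer pattern: which channel a photosite sees (0=R, 1=G, 2=B).
bayer = np.array([[0, 1], [1, 2]])

def exposure(dy, dx):
    """One shot with the sensor shifted by (dy, dx) photosites.

    Modelled simply as rolling the CFA phase under a static scene.
    """
    ch = np.tile(bayer, (H // 2, W // 2))
    ch = np.roll(ch, (dy, dx), axis=(0, 1))
    y, x = np.indices((H, W))
    return scene[y, x, ch], ch

# Four shots cover all four 2x2 CFA phases at every photosite.
shots = [exposure(dy, dx) for dy, dx in [(0, 0), (0, 1), (1, 0), (1, 1)]]

recovered = np.zeros_like(scene)
counts = np.zeros((H, W, 3))
for values, ch in shots:
    y, x = np.indices((H, W))
    recovered[y, x, ch] += values
    counts[y, x, ch] += 1.0
recovered /= counts   # green is sampled twice per site, R and B once
```

In this idealised model `recovered` matches the scene exactly, which is why the subject (and camera) must hold perfectly still between exposures: any movement breaks the phase alignment the reconstruction depends on.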

Although whether 'stacking 20 shots (after alignment) with a tap on the tripod between shots' has the same effect??

'Twould have to be a very precise tap!

 

Seriously though, that really wouldn't work. Moving the whole lens+sensor assembly would result in different parallax between near and far parts of the scene.

 

Quite honestly, I was totally skeptical about pixel shifting until I owned the a7riv. I was converted to a believer, amazed at the resolution of a humble 55mm Micro-Nikkor when I tried it.

 

Here's a rather tired old res chart at twice its standard distance (1:100 magnification instead of 1:50) so the lines marked as 100 line-pairs are actually 200 lppmm!

[Attached image: crop of the resolution chart, bars at 200 lppmm]

The full-frame is inset.

Don't know about you, but I couldn't even resolve the largest bars with my naked eye at that distance.

