Removing Bayer Mask

Discussion in 'Mirrorless Digital Cameras' started by rayfraser, Mar 13, 2005.

  1. Has anyone ever removed a Bayer mask? Please correct me if I am wrong, but it should be possible to increase resolution by at least a factor of 2. My probably errant reasoning is that there would then be twice as many B&W pixels as there are green pixels now. Are these masks usually directly attached, requiring chemical removal, or is it possible to just peel them off? Of course, for CMOS imagers with a layer of microlenses, I assume those would be removed along with the mask.
     
  2. I don't understand the reason for Bob's curt (to the point of rudeness) answer. I am currently trying to get hold of old/discarded/obsolete digicams so I can try to remove the Bayer filters. I figure the first few tries will probably result in destroyed cameras. One problem with this is that "disposable" point-and-shoots rarely offer RAW output. So really interesting experiments won't be possible. But if removing the filter turns out to be straightforward, maybe I'll try it later on a higher-end camera. Ray, if you attempt any experiments of your own, please post about them.
     
  3. You may not like Bob's answer but he's right. First, you won't gain any increase in resolution. The Bayer color matrix does not degrade the luminance channel, it only affects the chrominance channels. Since the final image is formed using the brightness values in every pixel, the full spatial resolution of the sensor is maintained, only the color resolution is interpolated. The Bayer color matrix is sometimes formed on a glass substrate that's epoxy-glued onto the sensor, or it can be deposited directly on the surface of the sensor. In either case, it is almost impossible to remove it without damaging the chip, whether mechanically (scratching the active surface, or cracking the fragile silicon die), or electrically (the sensors are very susceptible to ESD damage, and the tiny wirebonds that connect the die to the package can be easily broken). So, in the end you may have to thank Bob for his direct answer...
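     A minimal sketch (Python, assuming numpy/scipy and an RGGB tiling; real cameras use far fancier algorithms) of the kind of interpolation described above, which fills in the two missing color values at each photosite while every pixel's own brightness sample is kept:

        import numpy as np
        from scipy.ndimage import convolve

        def demosaic_bilinear(mosaic):
            """mosaic: 2-D array of raw sensor values, RGGB tiling assumed."""
            h, w = mosaic.shape
            r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
            b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
            g_mask = 1 - r_mask - b_mask

            # Averaging kernels: green has twice the samples of red/blue.
            k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
            k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

            rgb = np.empty((h, w, 3))
            rgb[..., 0] = convolve(mosaic * r_mask, k_rb)
            rgb[..., 1] = convolve(mosaic * g_mask, k_g)
            rgb[..., 2] = convolve(mosaic * b_mask, k_rb)
            return rgb  # each pixel keeps its measured brightness; only the
                        # two missing color values per site are interpolated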
     
  4. I agree with Bob. Godfrey
     
  5. Berg Na wrote: "You may not like Bob's answer but he's right". Well, maybe so, but I learned something from your answer. Thanks for elaborating, Mike Spencer
     
  6. I don't think it's possible either. But IF you could, you could then increase resolution somewhat by yanking the antialiasing filter too--monochrome aliasing doesn't look nearly as bad as color aliasing. This probably wouldn't increase maximum resolving power much (if at all), but it would give you better MTF at frequencies approaching the resolution limit. Then there's the issue of speed--I'm not sure exactly how much speed the Bayer filter costs, but I've got to believe it's at least a stop. Removal would help even more if you're shooting underexposed in very red lighting (for me, shooting a band on stage), where the blue channel generates very little image and a lot of noise, because it's getting very little light. WB is an after-the-fact adjustment, and can't really fix color-balance-induced underexposed channels.
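     A quick numerical sketch of that last point (Python, made-up illustrative numbers): applying a white-balance gain after capture multiplies the noise right along with the signal, so the SNR of the starved channel never improves.

        import numpy as np

        rng = np.random.default_rng(0)
        samples = rng.poisson(25, 100_000).astype(float)  # few photons hit blue pixels

        snr_before = samples.mean() / samples.std()
        balanced = samples * 8.0                  # white-balance gain, applied later
        snr_after = balanced.mean() / balanced.std()
        print(snr_before, snr_after)              # both ~5: WB fixed color, not noise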
     
  7. I'm really not trying to discourage you here... but even if you manage to strip off the color filter array without killing the chip, the next step will be even more complex: completely rewriting the image reconstruction algorithm in the camera firmware. Without this, the image will be full of odd artifacts since the values of the luminance channel are determined from an expected response from the -- now missing -- Bayer CFA (and there are different versions too).
     
  8. "Just say no" - Nancy Reagan
    "Google is your friend" - Internet Sage
     
  9. Berg Na: That's why I think that ultimately, the best camera for this type of experiment is one that can produce RAW output. If I got that far, I was planning to try to enlist some help from the fellow who develops dcraw. From what people are saying, it sounds as if I'll be quite lucky to get even close to that stage, especially given my rather limited knowledge of electronics.
     
  10. Ray, removing Bayer masks is not really possible; it's just the way the sensor is made. HOWEVER, as an unrelated point, if you remove the IR filter from a consumer digicam with RAW (such as the G2) and use a program such as DCRAW with Bayer interpolation disabled (I will post when I remember the DCRAW option), you do get an IR image with more resolution than the traditional color one.
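     For anyone following along, pulling the undemosaiced sensor data looks something like this sketch (using the rawpy wrapper around libraw; DCRAW's document-mode switch is the command-line equivalent, and the filename here is hypothetical):

        import rawpy

        with rawpy.imread("CRW_0001.CRW") as raw:
            mosaic = raw.raw_image.copy()    # one value per photosite,
                                             # no Bayer interpolation applied
        print(mosaic.shape, mosaic.dtype)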
     
  11. That's a cool link, Bob! Only now I'm gonna lie awake at night wondering when we'll see CMY sensors replace the RGB sensors we use now.
     
  12. I meant in the post from 7 pm. I knew about the 760M from a Mike Johnston essay. I just reread Pete Myers's review of that camera. Sigh. It's going to be a joyous day when the next B&W digital SLR arrives. (Not that I won't have a ton of fun shooting with what I have until then!)
     
  13. Given the huge success of the DCS 760m, I doubt we'll see another mass-production monochrome DSLR anytime soon.
     
  14. Thanks for all the encouragement! Actually, it looks like Berg Na may have provided the necessary go-ahead: "glass substrate that's epoxy-glued onto the sensor". I had previously tried unsuccessfully to remove what appears to be a glass cover with a small screwdriver; I might be able to cut it off with an X-Acto knife. Bob probably knew I'd be back with my $10 hacked extended-dynamic SMaL cameras. Google is my friend also, and I had already been to the site given by Bob - the article does support higher resolution when the light source is blue, but did not explain Berg's glass substrate and epoxy. I have a couple of fried ones that won't cost anything to experiment on (sometimes the firmware mod fails to properly update flash memory). I am thinking some B&W raws from $9.99 CVS drugstore digitals, using a custom decompressor, may best demonstrate the 120 dB (20-bit range) claim of SMaL. While a non-embedded decompressor (SMaL uses a patent-pending lossless compression routine) has yet to be completed, one of the hackers is close to a solution.
     
  15. I don't know why some of you folks don't just go back to B&W film. All this time and effort expended to limit the advantages of the digital camera. I don't get it. A strictly monochrome digital, like the Kodak. Wonderful, just what we need. It's bad enough the manufacturers squander their development dollars on anachronistic DSLR designs instead of taking the digicam concept to a professional image quality level. Let's continue to take this incredible technology and use it to recreate the first half of the last century. Finally the photographer can have all this control over color... and they clamor for a B&W monochrome digital. You all probably really get off on the Epson RD-1's film advance lever. Maybe that'd be a good camera to experiment with removing the Bayer mask.
     
  16. Don't knock it. There are guys in their basement working on perpetual motion machines, everlasting batteries, gasoline engines that get 1000mpg and contacting aliens. Who knows, maybe one day one of them will come up with something. It's their time to waste. Everyone needs a quest. The fact that it's 99.9% (or in some cases 100%) likely to be unobtainable just makes the journey a little longer. Some people like to do things the hard way. Question: Why shoot film when you can buy a camera for $9.99 from the local drugstore, hack into the electronics, take a belt sander to the sensor and (if you're very, very lucky) end up with images which are almost as good as those shot with a Kodak disk camera? Answer: because it's "fun" I guess. An interesting project might be to try to build a monochrome digital back for a film camera. You can buy monochrome imaging sensors. Kodak will sell you a KAC-9638 1288 x 1032 resolution high dynamic range monochrome imaging chip with peripheral circuitry on a PC board for $27. Or isn't that enough of a challenge?
     
  17. You won't get lovely dynamic range out of the 10-bit A/D converter on that beastie, Bob.
     
  18. Dean: I can't go "back" to film because I never got into film in the first place. Digital suits my sensibilities much better.
     
  19. The dynamic range is 57dB. That's 10^5.7 or 1:500,000. It has on-chip image processing which presumably applies non-linear conversion (gamma maps). The output is 10-bit, but that's after processing. Your JPEGs are only 8-bit, and everything you've ever printed is 8-bit. Seems like enough for most applications to me, but what do I know. I'm just a dumb ol' Ph.D. engineer, not a photographer. If I were a photographer and I wanted high-resolution, high-dynamic-range monochrome images, I'd read about the Zone System and use film.
     
  20. Did anyone else here ever see the Simpsons episode where Homer bumbles into a job as head of the design department of a car manufacturer, and designs his dream car?
     
  21. Ben, the two mask removal techniques I've played with are UV fading and dissolving the mask with solvents. Chris Johnson is a bit braver, and has unsealed chips and removed the applique-style filters that Berg mentioned. As Ryan points out, DCRAW is an easy way to get at the basic sensor data. Reconstruction is relatively easy after that. Most raw formats still scale the red, green, and blue channels to fit within the dynamic range of the format, so you typically need to apply scaling factors (see the sketch below).
     Ryan's way, removing the IR filter, is also a good way to get your feet wet. This works because the red, green, and blue filters are all transparent to IR from about 800nm to 1100nm. You'll need to remove the camera's IR cut filter anyway, as you tear down the camera to gain access to the sensor. Use a relatively long wavelength IR filter (87, 87A, 87B, 87C) instead of an 89B. An 89B (Hoya R72, Cokin 007, Heliopan RG715) will pass too much visible red, and the red pixels will have a substantially different signal from the green and blue (not just scaled differently, but different data entirely).
     Monochrome conversion, whether through actual removal of the filters or by using IR to "bypass" them, is a resolution-increasing technique. Berg's statement about resolution and luminance is very far off: "The Bayer color matrix does not degrade the luminance channel, it only affects the chrominance channels. Since the final image is formed using the brightness values in every pixel, the full spatial resolution of the sensor is maintained, only the color resolution is interpolated". Bryce Bayer's work is based on using the green channel (1/2 the sensor's pixels) for the luminance of a "typical" scene. Less "typical" scenes may only produce useful information in the red or blue channel, and this limits resolution to 1/4 the pixels. Furthermore, in front of the sensor is a spatial low-pass (anti-aliasing) filter that further reduces resolution, because chrominance aliasing (color moire) is visually annoying in the extreme. Technically, while a monochrome camera should still include an AA filter, the effects of luminance aliasing are not as visually annoying as chrominance aliasing, and can often be ignored.
     And there is a "mass production monochrome DSLR" currently on the market, the Sigma SD10, although it's only 3.4 MP. Its stacked photodiodes let you add up the color channels in the raw file to get the total output of the photodiode stack. And, at the same time, it's quite useful as a color camera. Its IR capabilities are also prodigious. Bob and Berg, you're way off base this time. Sorry.
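     A sketch of that per-channel scaling step (Python; the gains are made-up placeholders - in practice you derive them from a gray target or the camera's white-balance metadata - and an RGGB tiling is assumed):

        import numpy as np

        def equalize_channels(mosaic, r_gain=2.0, b_gain=1.6):
            """Scale R and B photosites to match green before treating as mono."""
            out = mosaic.astype(float)
            out[0::2, 0::2] *= r_gain    # R sites
            out[1::2, 1::2] *= b_gain    # B sites
            return out                   # G sites are the reference (gain 1.0)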
     
  22. >> First, you won't gain any increase in resolution. The Bayer color matrix does not degrade the luminance channel, it only affects the chrominance channels. Since the final image is formed using the brightness values in every pixel, the full spatial resolution of the sensor is maintained, only the color resolution is interpolated. << The color interpolation process itself degrades resolution somewhat. Not a lot, but measurably so. I have to laugh when I read people complaining about the anachronistic nature of monochrome images and the luddite attitude of people who'd like a digital camera dedicated to creating them. Isn't photography about individual expression, creativity, vision? Shouldn't people who "see" best in monochrome desire the best possible tools? Well, maybe not. Maybe photography is really about slavish adherence to whatever technology the powers that be present us with. Maybe it's really about conformity rather than expression. -Dave-
     
  23. Joseph - There are many different types of Bayer architectures that are very different from the one proposed by Bryce E. Bayer in 1976, when he worked at Eastman Kodak. There's an article about the different types here. In any case, every pixel in the sensor is photosensitive and does contribute to its spatial resolution. There's absolutely no situation where the resolution of the sensor is limited to only a quarter of its pixels.
     
  24. Did anyone read the article Bob found using Google? Did Bob read the article? Here is a quote that clearly indicates decreased spatial resolution for lower-wavelength blue light (not sure whether that could include some sky and water):
    "Thus, three of the four Bayer filters in each quartet pass an equal amount of yellow light, while the fourth (blue) filter also transmits some of this light. In contrast, lower wavelength blue light (435 nanometers; see Figure 4) passes only through the blue filters to any significant degree, reducing both the sensitivity and spatial resolution of images composed mainly of light in this region of the visible spectrum."
     So perhaps such an experiment is not a waste of time if it proves so many "experts" wrong (see the worked numbers below). Are there any shots on PN that have significant amounts of the lower blue?
     In answer to Bob's quip about belt sanders, with my lifelong disdain for authority clearly stated in my bio, I would not do it just for fun and song, I would do it just to prove the Senior Editor wrong. And to all the other naysayers - you've encouraged stripping away Bayers.
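     The worked numbers for that quote are simple enough (Python, hypothetical sensor dimensions): in an RGGB quartet only one photosite in four responds strongly to deep blue, so the effective sampling grid for a ~435nm scene shrinks by two in each direction.

        sensor_w, sensor_h = 2000, 1500          # hypothetical pixel counts
        blue_w, blue_h = sensor_w // 2, sensor_h // 2
        print(blue_w, blue_h)                    # 1000 x 750: 1/4 of the pixels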
     
  25. Love the poetry. Joseph, Ray, if you've had any success, post pictures and text with details! It'd help me a lot, I'm not really skilled in hardware hacking. If you put together something good, I bet Slashdot would link to it. I can provide hosting if you want to give it a try.
     
  26. Hey, they laughed at the guy who jumped off the Eiffel tower wearing a pair of wooden wings strapped to his arms....Oh, wait, bad example...
    Well there's always Columbus. He didn't fall off the edge of the earth as many predicted, so "conventional wisdom" can indeed be wrong.
    The worst that could happen is a featured spot on crank.net, alongside the guy who's building a nuclear reactor in his basement, or the guy who actually did build one in his backyard.
    "When David's Geiger counter began picking up radiation five doors from his mom's house, he decided that he had "too much radioactive stuff in one place" and began to disassemble the reactor. He hid some of the material in his mother's house, left some in the shed, and packed most of the rest into the trunk of his Pontiac"
    I say go for it (the camera, not the nuclear reactor)!
     
  27. While many are content to limit themselves to current 8-bit displays, those wishing for photo longevity may be well advised to save raws with greater dynamic range. How much will future projection devices be able to handle?
     My long-term goal is to capture migrating Monarch butterflies from an RC plane, including UV, in an attempt to display them against landscapes like never seen before (especially using SMaL's extended dynamic range, hopefully on a future display device). Does anyone have a better suggestion than using a cheap drugstore digital with the Bayer removed? Note that UV falls below the 435 nm wavelength mentioned above. And if successful, I doubt that it will be considered a crank!
     
  28. Bob seems to assume I don't know that 8 bits is enough for printing, web viewing, etc. What I've found shooting 8-bit is that images "fall apart" when I manipulate contrast, levels, curves, etc. in post-processing. I care about an extra few bits of dynamic range not for output, but for flexibility in the "digital darkroom".
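     It's easy to demonstrate (a Python sketch with an arbitrary curve): push the shadows of an 8-bit ramp hard and the highlights collapse together while the lifted shadows leave gaps, which is exactly the banding a few extra raw bits would prevent.

        import numpy as np

        ramp8 = np.arange(256) / 255.0            # every level of an 8-bit gray ramp
        pushed = np.round((ramp8 ** 0.4) * 255)   # aggressive shadow lift
        print(len(np.unique(pushed)))             # roughly 170-180 of 256 levels survive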
     
  29. Bob, what a great example. "Well there's always Columbus. He didn't fall off the edge of the earth as many predicted, so 'conventional wisdom' can indeed be wrong." The only problem was that the conventional wisdom was exactly, totally, 100% correct. In 1492, it was well known that the Earth was round, and about 8000 miles in diameter. This was known to anyone who navigated a ship, and anyone who financed such voyages. (But it wasn't a good idea to say this too loudly around powerful church members.)
     Christoforo (or Christobol) Columbo was an amateur linguist, Bible scholar, and scientist. Based on his translations of biblical dimensions and the works of Ptolemy, and his experiments with how fast the masts of ships disappeared over the horizon, he became convinced that the commonly accepted diameter of the earth was wrong. It was quite brilliant; he may have been the first person to attempt to calculate the diameter of the earth using line of sight. Unfortunately, the calculations didn't take into consideration refraction due to the miles of air, stacked in layers of different temperature and density, so Columbus's earth ended up around 5000 miles in diameter, instead of 8000. But, for the late fourteen hundreds, it was still an amazing bit of work.
     Anyway, aside from his calculations irritating the people who actually knew the diameter of the earth, some other "activities" of his involving members of the royal court of Spain made it clear to the queen that it was time for Chris to be gone. So, she gave him three ships full of political prisoners, malcontents, and other undesirables. And she gave him just enough provisions for the journey west from Europe to India on the 5000-mile Earth. He was supposed to starve to death in the mid-Atlantic, along with the aforementioned ships full of malcontents. Unfortunately for the plans of the royal court of Spain, somebody left an extra continent lying around, just waiting for a crazed dreamer like Columbo.
     Anyway, if you don't like the idea of a B&W camera, then stay out of the way of those of us who are working on them. There's room enough in the sandbox for all of us to play.
     
  30. Ben, CMY sensors won't replace RGB sensors. In fact, quite the opposite: CMY or CMYG sensors were popular 5 years ago, but are virtually extinct today. RGB sensors have more accurate colors (lower observer metamerism), and the higher signal levels of a CMYG pattern are offset by the higher mathematical gains in the color conversion matrices, leading to increased noise. That article is highly inaccurate in a number of ways; don't take it too seriously. No sensor has had RGB responses as bad as the ones they show in the "Bayer Filter Transmission Spectral Profiles" graph in many years.
     
  31. Berg, that Fill Factory link was very amusing. Yes, I'm quite familiar with pseudorandom patterns, equidistribution patterns, various 4-, 6-, 8-, and 9-color patterns, and even 3- and 6-color equidistribution hexagonal patterns. Are you familiar enough with modern filter spectral responses to know that the ones illustrated are not indicative of the state of the art? Even if we use that particular sensor, you will note that the green and blue responses both bottom out at less than 10% of the red response around 630nm. In other words, for red objects, you're not getting enough detail in the green and blue channels to contribute anything other than noise to the "luminance". I'd expect a more current sensor to have responses in the reds or blues that dropped the opponent color to 3% or less of the target color.
     
  32. These days, just about everything is done in software where possible. Why does this apparently not apply to chrominance anti-aliasing? Putting an (inevitably fairly crude) hardware filter in front of the sensor to do it by an ill-controlled form of local averaging, and then spending much effort trying to overcome the mess produced by this with sharpening software, does not seem on the face of it like a very smart way to deal with the problem. What am I missing?
     
  33. >> What am I missing? <<
     Aliasing occurs in, and is a consequence of, any sampled-data system. Potential aliases need to be reduced to an acceptable level before the sampling process.
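     A minimal demonstration of why it has to happen before sampling (Python, numpy assumed): at 8 samples per second, a 1 Hz sine and a 9 Hz sine produce identical samples, so nothing downstream can tell them apart.

        import numpy as np

        fs = 8.0                               # sampling rate, samples/second
        t = np.arange(16) / fs
        low = np.sin(2 * np.pi * 1 * t)
        high = np.sin(2 * np.pi * 9 * t)       # 9 Hz aliases onto 1 Hz
        print(np.allclose(low, high))          # True: a many-to-one mapping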
     
  34. One thing to remember is that by removing this filter you are effectively changing the position of the focal plane, so your autofocus will be off. This is because removing the filter glass removes the extra optical path length contributed by its refractive index.
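     The usual back-of-envelope for this (hypothetical numbers; the standard plane-parallel-plate result): removing glass of thickness t and index n shifts focus by roughly t*(1 - 1/n).

        t_mm, n = 1.0, 1.5                     # say, 1 mm of n = 1.5 filter glass
        shift_mm = t_mm * (1 - 1 / n)
        print(shift_mm)                        # ~0.33 mm: plenty to upset autofocus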
     
  35. Brad, I was first appointed as a (full) Professor of Statistics thirty years ago; I do know about some of the concepts involved, although I am not an expert on sensor arrays. You must forgive me for saying that I find your 'explanation' needs a bit more unpacking for me. Are you saying that the physical anti-aliasing filter is acting as a spatial low-pass filter in the frequency domain, thereby doing things that it would be too late to do by processing the pixels? Maybe this could be of value to reduce luminance anti-aliasing, although I am not clear how commonly this would really be a problem. I would need a bit more persuading that it was the right way to tackle chrominance moire effects. And I am certainly suspicious that there is a lot of overkill involved, because of the amount of time people spend sharpening their digital images. The reason for trying to minimise the need for this is because local-average operators are not invertible, and the harder you try to produce an approximate inverse, the more unstable is the operator that you get. That is why sharpening can go so badly wrong if it is overdone.
     
  36. Typo, should be 'luminance aliasing'
     
  37. >> These days, just about everything is done in software where possible. Why does this apparently not apply to chrominance anti-aliasing? <<
     >> Are you saying that the physical anti-aliasing filter is acting as a spatial low-pass filter in the frequency domain, thereby doing things that it would be too late to do by processing the pixels? <<
     I believe this is what I am saying with my example.
     
  38. Robin, you summarized it perfectly: "the physical anti-aliasing filter is acting as a spatial low-pass filter in the frequency domain, thereby doing things that it would be too late to do by processing the pixels". There is no way to disambiguate aliasing. The chrominance or luminance aliasing could just as easily be a picture of something that really does look like moire. Aliasing is a many-to-one mapping; it's not reversible.
     
     >> Anyway, if you don't like the idea of a B&W camera, then stay out of the way of those of us who are working on them <<
    I'm in your way? How so?
    I'm just giving my opinion that it's most likely a waste of time. However as I think I said, it's your time to waste.
    While there was a time when garage tinkerers could do stuff the big guys couldn't, I think that era mostly went out with the Wright Brothers and the two musicians who invented Kodachrome.
    I knew a guy who built his own digital cassette recorder. He spent years on it. By the time he'd finished it, he could have bought one that worked better at lower cost from a commercial manufacturer.
    Some people just like to tinker. Good luck to them. However they're usually more interested in tinkering than in the final product. Everyone needs a hobby.
     My guess is that even if you could grind the face off the chip and remove the Bayer filtering mask, then rewrite the RAW conversion algorithms to account for the fact that all the sensors now had the same sensitivity, and even if the rest of the camera hardware (chip amplifiers etc.) didn't have problems, the final result wouldn't be significantly better than you can get from doing an optimized color -> B&W conversion in software.
    I'm very willing to be proved wrong however. I look forward to seeing the results of these experiments.
     
  40. Bob, while you may believe all you were doing was "giving your opinion", I saw your conduct as rude, mocking, and belligerent. You didn't just state that you thought people working on this were wasting their time. You lumped them into the category of people building perpetual motion machines... And you may be right, the era of garage tinkerers Woz probably finished over 100 years ago with the Write Brothers. Those guys should all just go find Jobs. The unemployment lines are Packered with starry-eyed dreamers, and it's up to people like you Who Let them know that their dreams are futile. Ciao! Joe
     
  41. >> ...I saw your conduct as rude, mocking, and belligerent. <<
     Oh please... It was just insightful observation, and well-grounded as well.
     
  42. Yup, if you can't attack someone's idea, attack them personally. That always works. I think it's a dumb idea, but lots of people have dumb ideas and a very few people have dumb ideas that turn out not to have been so dumb. Show me the monochrome results from a modified color sensor that are better than a converted color -> B&W image and I'll admit I was wrong. Until then, I'll hang onto my opinion.
     
  43. Bob - In an earlier post, you said that a dynamic range of 57dB was equivalent to 1:500,000, but the ratio is actually only about 1:700, since dynamic range is defined by 20 log(max signal/noise). 1:500,000 would be over 110dB, a figure that you won't get with a $27 digital imaging kit!
     
  44. "Yup, if you can't attack someone's idea, attack them personally" Bob, that is exactly what you did, over and over again. Look over what you wrote in this thread. I await your apology. Joe
     
  45. Berg - you are correct. Dynamic range is as you stated, so it's the square root of what I posted (I'd assumed a simple ratio), or just over 700:1 for a 57dB dynamic range. That's about 9.5 stops in photographic terms.
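     For anyone who wants the arithmetic spelled out (Python), the whole exchange reduces to two one-liners:

        import math

        ratio = 10 ** (57 / 20)                # 57 dB -> about 708:1
        stops = math.log2(ratio)               # about 9.5 photographic stops
        print(round(ratio), round(stops, 1))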
     
  46. Berg & other skeptics, I am skeptical about SMaL's claim of 120 dB too, but do not know how to measure it without any specialized equipment. I have tried several different shots that no one but myself sees as extended dynamic range. One of my reasons for trying to remove the Bayer is to obtain "measurable" samples of dynamic range. I would welcome any suggestions. Here is one of my Sun shots (example for dynamic range discussion only; please do not critique overall low image quality):
     [attached image: 00BXRd-22408184.jpg]
     
  47. Bill C., Joseph - if you are still there! I think we are saying the same thing about the basics. The subtlety that I am trying to convey is as follows. Because most objects are not pure R, G, or B, you are usually sampling the image of an object at every pixel of the Bayer pattern; in other words, that is how you should think of the luminance being sampled. The effective sampling frequency for the chrominance is lower, because you have to start comparing pixels with one another. So for the luminance, you either have to have an anti-aliasing filter or accept (as I believe the EOS-1D did) the risk of moire, for exactly the reason set out by Bill C. What I am suggesting is that there might be a clever algorithm for removing residual chrominance moire that can exploit the higher luminance resolution after sampling, avoiding the need for an anti-aliasing filter aggressive enough to fix the chrominance moire prior to sampling. It is all to do with the fact that you are sampling three channels, not one, but they usually have a high level of spatial correlation. Sorry about the jargon, but I can't really avoid it to make my point precisely.
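     One sketch of the kind of algorithm I have in mind (Python with scipy; a real pipeline would be subtler, this just shows the separation): demosaic as usual, then median-filter only the color-difference planes, leaving the full-resolution luminance detail alone.

        import numpy as np
        from scipy.ndimage import median_filter

        def suppress_color_moire(rgb, size=5):
            g = rgb[..., 1]                             # green as luminance proxy
            r_g = median_filter(rgb[..., 0] - g, size)  # smooth chroma only
            b_g = median_filter(rgb[..., 2] - g, size)
            return np.stack([g + r_g, g, g + b_g], axis=-1)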
     
  48. Bob is right. Learn the Zone System. Robert
     
  49. This is an old thread, but people might be interested in this: http://www.nikoncafe.com/vforums/showthread.php?t=82434 - where Iliah Borg of RawMagick has posted a picture taken with a modified Nikon D2x with the Bayer array stripped off using solvents and processed in RawMagick.
     
