Backlight stripes in MINOLTA DiMAGE Scan 5400 II or what?



Only tangentially relevant to the theme of the post, but I have a PDF of the service manual for this scanner and absolutely no idea how to post it here.

Maybe you can put it up somewhere and post a link to it here? Please do, if you can. Would be grateful.


Nearly all scanners do a pre-scan 'calibration' of their light source.

1) To check that the light source is sufficiently 'warmed up' - probably not relevant for an LED source.

2) To map variations in the source. The 'map' is then used by the software to compensate for any width-wise variation during image processing.

 

This light-source defect map is usually taken from a clear area of the film holder or film-positioning mechanism. (If a glass platen is involved, any dust or smears on the calibration area of the glass can give a false defect map and lead to lines or streaks in the final scan.)
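For illustration, here's a minimal sketch of how such a defect map is commonly applied - hypothetical NumPy code, not Minolta's or Vuescan's actual implementation: each scan row is divided by a normalised profile of the bare light source, so a column that read low during calibration is boosted by the same factor in the image.

```python
import numpy as np

def flat_field_correct(raw_scan, calibration_line, eps=1e-6):
    """Compensate width-wise light-source variation.

    raw_scan         : 2D array (rows x columns) straight from the sensor
    calibration_line : 1D array, one reading per column, taken from a clear
                       area of the holder with no film in the light path
    """
    # Normalise the calibration profile so its brightest column has gain 1.0
    profile = calibration_line / (calibration_line.max() + eps)
    # Columns that received less light get scaled up proportionally
    return raw_scan / (profile[np.newaxis, :] + eps)

# Example: a column reading 80% of full brightness in the calibration pass
# is multiplied by 1/0.8 = 1.25 in every row of the final scan.
```

If the calibration area itself is dirty, that same division stamps the dirt into every row - which is exactly how lines or streaks appear in the final scan.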

 

So I think what's happening with the Minolta 5400 is that the defect map isn't being fully applied by Vuescan. Or that one or more of the LEDs has gone out of specification with time and use, giving areas of low output or colour imbalance that can't be compensated for.

 

This would explain why Minolta's original software works, but Vuescan doesn't. It's an unfortunate combination of Minolta building insufficient diffusion into the scanner's light source, and Vuescan - being broad-based, general-purpose scanner software - meeting a situation that's outside its ability to compensate for. The Minolta software was likely tailored to expect significant variation in light across the scan width, and to be able to apply the fairly large corrections needed.


I eventually brought out the six 5400 II units I've got and spent almost the whole day today checking them out in various settings and combinations. The first thing I must say is that I was at least partially wrong in my original post: with some units (mostly two of them) a similar result (stripes) can also happen with the original software. So, while I thought I might have finally found a solution, that is unfortunately not true.

With both Vuescan and the Minolta software it is hit-and-miss, but those two units produce the stripes virtually every time in both. The remaining ones tend to do it much more often when driven by Vuescan (practically every time), but it can also happen from time to time with the Minolta software. Obviously I'd need to spend weeks full-time to gather a proper, statistically valid sample, but for one day this is the result.

So when I checked the original software last night, after reading a comment in this thread, I was lucky, it seems. I also checked multi-sampling as suggested here - no change. Once the stripes are there, multi-sampling doesn't help. Back to Nikon…

So when I checked the original software last night, after reading a comment in this thread, I was lucky, it seems. I also checked multi-sampling as suggested here - no change. Once the stripes are there, multi-sampling doesn't help. Back to Nikon…

 

It sounds like the age of the scanners is catching up with them. Hardware rather than software problems.


It sounds like the age of the scanners is catching up with them. Hardware rather than software problems.

One of the two main selling points of the 5400 II was that it's LED-lit and the light source doesn't age, etc. The original, metal-cased 5400s I have use a CCFL lamp, which is prone to ageing and needs a warm-up, but they don't need that bubble-shaped lens and don't produce this kind of side effect. It would be ironic if the older ones actually lasted longer. My old LED-lit Nikons don't have any of these problems either, so they are the obvious go-to devices. But the Minoltas have superior optics, better film handling/holders (than the 5000 and its predecessors - the 9000 holders are OK), and the 5400 II is on a par with the Nikons in terms of scanning speed (the original 5400 is dog-slow). It would be a big pity not to be able to use their potential because of a faulty design of the light path…


White. That's what the OP began with. The problem with this is that there's that bubble-shaped plastic lens in the way. We'd probably need to get rid of that and build a quite different board with many LEDs covered by a very good diffuser. I just ordered the highest-CRI LEDs from the manufacturer the OP linked to (the newer ones) and shall see what I can do with them. There are many questions to be answered before declaring that it can be safely done with many LEDs; power consumption and thermal considerations are the first that come to mind. Shall see if I can eventually get somewhere with those.

There are many questions to be answered before declaring that it can be safely done with many LEDs; power consumption and thermal considerations are the first that come to mind. Shall see if I can eventually get somewhere with those.

White LED power and efficiency have increased tremendously in the last few years. I doubt there's any risk of an overload or heat issue when matching the light output of the LEDs fitted to a 15- or 16-year-old design.

 

These things were put together, so they can be taken apart and repaired. And often the design wasn't that great in the first place. Built down to a price, rather than up to a high spec. That's what happens when the Production Engineering team get their hands on something.

 

Edit: Fully diffusing the light from lens-ended LEDs is quite difficult. I found I had to grind the ends of some LEDs flat to get rid of any collimation from the moulded domed package.


White LED power and efficiency have increased tremendously in the last few years. I doubt there's any risk of an overload or heat issue when matching the light output of the LEDs fitted to a 15- or 16-year-old design.

 

That's what I hope for… deep inside ;-) The thing is that even if they're twice as efficient per watt now, it may not be enough if I replace three with twelve or fifteen - which may be needed in order to get rid of that extra lens. The other thing is that getting rid of the lens calls for an extremely good diffuser pretty close to the LEDs themselves, which means less air circulation, more heat accumulation, etc. But I'll get answers to those questions once I try and measure things. As I have more than one of these scanners, I'm not too scared to experiment.

 

Edit: Fully diffusing the light from lens-ended LEDs is quite difficult. I found I had to grind the ends of some LEDs flat to get rid of any collimation from the moulded domed package.

 

Those are flat SMDs, but they'll surely need to be diffused heavily to eliminate hotspots, even if I use twelve of them.


Well, the more LEDs you use, the less light you need from each device, so the current and dissipation per LED are reduced.

 

Modern COB LEDs are incredibly bright, and barely get warm. I'm pretty sure that a couple of integrated bar-type units, such as can be found in cheap keyring lights or similar, will put out more light than a dozen 10 or 15 year old separate LEDs, and with a better spectrum as well.

 

I also suspect that the prismatic 'D' profile lens was originally intended for use with a miniature fluorescent tube, and that the LED board was a hurried and fudged replacement design. Because it doesn't sound like it ever worked too well.

 

Don't overthink this. It's just a light source - and one that doesn't work well as it stands. How could you make it worse?

 

This little keyring light works off a few button cells, is blindingly bright and develops no perceptible heat.

[Attached photo: the keyring LED light mentioned above]


Well, the more LEDs you use, the less light you need from each device, so the current and dissipation per LED are reduced.

The thing is that you can't freely regulate the light intensity. Reducing the current below what the LEDs are designed to run at will heavily affect chromaticity/CRI. So they need to work within the spec limits, and the final intensity needs to be reduced with an ND filter.
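As a rough illustration of sizing such an ND filter (my own back-of-the-envelope sketch, with invented numbers): the required optical density is just the base-10 log of the attenuation factor you need.

```python
import math

def nd_density(attenuation_factor):
    """Optical density needed to cut the light by the given factor.
    Transmission T = 1/k, so density D = -log10(T) = log10(k)."""
    return math.log10(attenuation_factor)

# e.g. if the new LEDs at rated current give ~4x the light the sensor expects:
print(round(nd_density(4), 2))  # 0.6 -> a standard ND 0.6 (2-stop) filter
```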

I also suspect that the prismatic 'D' profile lens was originally intended for use with a miniature fluorescent tube, and that the LED board was a hurried and fudged replacement design. Because it doesn't sound like it ever worked too well

Of course I can't tell for sure what they intended, but to me it looks like a beam-shaping design. They used only three LEDs there (presumably very expensive back then) and tried to shape the beams to get something close to uniform coverage of the target area - note that the three "D"s on the lens are aligned with the three LEDs on the adjacent board. A kludgey substitute for a wide, well-diffused light source. Since this approach can probably never give 100% accurate results, they introduced an additional software calibration process to make up for the hardware deficiencies. This "mostly works", but… Anyway, the LEDs have arrived. I plan on taking one of the scanners apart in a fortnight.


Modern COB LEDs are incredibly bright, and barely get warm. I'm pretty sure that a couple of integrated bar-type units, such as can be found in cheap keyring lights or similar, will put out more light than a dozen 10 or 15 year old separate LEDs, and with a better spectrum as well.

COB increases efficiency/lumen density but does not in itself improve CRI. We're talking about a higher-end film scanner here, so I'm rather convinced that none of the typical bar-type units found in cheap keyring lights will have a CRI even close to acceptable for the accurate colour reproduction we're striving for.


As long as there's approximately equal light output at the maximum cyan, yellow and magenta density of the film dyes, the CRI is relatively unimportant.

 

You're not trying to match critical colours, just push some light through a bit of film. The colour balance is adjusted in software. That's part of the calibration process. Think about it: if CRI was important, you'd never be able to scan negative film.

We're talking about a higher-end film scanner here.

Oh really?

That would be why the original light source was so botched then.

Honestly, does it look as if the designers had 'high end' in mind when they created that mess of lenses to try and even out the light from 3 inadequate LEDs?


As long as there's approximately equal light output at the maximum cyan, yellow and magenta density of the film dyes

Which are six variables to match, varying from film to film and LED to LED. And then you sample CMY into RGB.

the CRI is relatively unimportant. You're not trying to match critical colours, just push some light through a bit of film.
Well, I disagree.

We're talking about a higher-end film scanner here
Oh really?

That would be why the original light source was so botched then.

Honestly, does it look as if the designers had 'high end' in mind when they created that mess of lenses to try and even out the light from 3 inadequate LEDs?

They clearly screwed that up. They apparently went the way Bose pioneered in the audio domain: "we don't care much how the hardware performs, we'll make up for everything in software". This "mostly" works - until it stops. But none of that changes the fact that this scanner is one of five worth spending one's time on, short of some drum scanners and Imacon's X range. If that doesn't make it "higher-end" then I don't know what does. It has better film handling than the Nikon 5000, it has better glass than the Nikon 5000, and subjectively it has better Dmax than the Nikon 5000/9000. Apart from the intrinsic differences in edge-to-edge performance that come with drum scanning, either virtual (Imacon) or real (Heidelberg), it outperforms both (I only have an older Imacon) on 135 in terms of output quality, and I'm not even getting into total mount-and-scan time per frame. So yes, "really" :-)


It's obvious from the condenser system that this scanner was originally designed to use a small fluorescent tube, and then modified (badly) to use LEDs. Possibly with thoughts of rapidly switching between a 'white' light source and IR LEDs to perform IR defect reduction.

 

Whatever. The botched white LED source needs re-designing, and with the range of LED types and sizes available now, that should be an easy exercise. There should be absolutely no need for additional external power sources or any other modification apart from the LED board, and removal of the stupid additional 3 lenses. If the internal supply from years ago could drive less efficient LEDs without overheating, it's just common sense that it can drive more efficient ones to the same light output also without overheating.

And then you sample CMY into RGB.

Exactly!

And those RGB channels are all that matter. They can be amplified or attenuated in software to compensate for almost any light source, within reason.

 

The light will have gone through at least two stages of filtering before hitting a sensor and being digitised. Once through the film dye layers, which vary from film type to film type, and again through the RGB filters over the tri-linear sensor array. The interaction between film dyes and RGB sensor filters is a complete unknown, and therefore must be compensated for in post-sample processing. This renders the RGB levels of light emitted by the source relatively unimportant, since they can, and must, be adjusted for each type of film.
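For what it's worth, that kind of channel balancing boils down to a per-channel gain chosen so that a known neutral reference (a clear calibration area, film base, etc.) reads equal in R, G and B - a hypothetical sketch, not any particular scanner driver's code:

```python
import numpy as np

def balance_channels(image, neutral_rgb):
    """Scale R, G and B so a known-neutral reference comes out grey.

    image       : H x W x 3 float array of raw sensor values
    neutral_rgb : raw (R, G, B) readings of the neutral reference
    """
    neutral = np.asarray(neutral_rgb, dtype=float)
    gains = neutral.mean() / neutral       # weaker channels get gain > 1
    return image * gains                   # broadcast per-channel gain

# e.g. a source weak in blue: neutral reads (0.9, 1.0, 0.6),
# so the blue channel is boosted by roughly 1.39 before any film profiling.
```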

 

This is the whole principle of tri-stimulus theory, which experiment and practice have shown to be imperfect at recreating all colours exactly. So having a good CRI is not the same as providing approximately equal light output at the RGB sampling bands. That could theoretically be achieved with three monochromatic sources, which would definitely not have a good CRI with reflective dyes and pigments. But we're not dealing with reflective surfaces in a huge colour space; we're dealing with transmissive dyes in a limited one.

 

At no point did I suggest that a randomly picked keychain LED would be ideally suitable for use in a scanner. I was simply using it as an example of the efficiency of modern LEDs, and the number of form-factors that LEDs can be readily obtained in these days.


If the internal supply from years ago could drive less efficient LEDs without overheating, it's just common sense that it can drive more efficient ones to the same light output also without overheating.
It will have to drive many more LEDs to a _much higher_ light output, and it's not so much about overheating (overloading) the power supply (although this has to be double-checked too, and possibly bypassed with a new supply path if needed - the external PSU should be more than enough to handle it) but about overheating the plastic elements of the light duct (and its surroundings).

As for the CRI: I believe you are mistaken (or have misunderstood my intentions). Good results can indeed be achieved with three RGB light sources, or with one having good output across the RGB range (not CMY, as you originally suggested), tuned properly to the sensor's characteristics - but that means quite a bit of fine-tuning. I believe that's what Nikon did, and they achieved the excellent results others are now measured against. Since I don't have the means Nikon's engineers had, and you can't balance in post-processing what you didn't sample properly in the first place, I opt for high CRI.

Using a high-CRI light source more or less guarantees that there will be no "holes" in the frequencies where there must not be any. This is not about having the film lit so that it looks "correct" to the human eye; it is about emitting something closer to a continuous spectrum, so that what passes through the film can be sampled with enough dynamic range and S/N ratio at the frequencies the scanning sensor is sensitive to, regardless of what those actually are.
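To put the "holes" argument in numbers: each sensor channel effectively integrates source spectrum x film transmission x channel sensitivity over wavelength, so if the source dips exactly where the film and the sensor both respond, there is little signal left to amplify later. A toy calculation with entirely invented spectra (not measured data for any real scanner or film):

```python
import numpy as np

wl = np.arange(400.0, 701.0)                       # wavelength grid, nm

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

sensor_red  = gauss(620, 30)                       # toy red-channel sensitivity
film_trans  = 0.2 + 0.6 * gauss(640, 50)           # toy dye transmission

source_even = np.ones_like(wl)                     # continuous, no holes
source_hole = 1.0 - 0.9 * gauss(630, 25)           # deep dip right where it hurts

def red_signal(source):
    # Effective red-channel signal: integrate the product over wavelength
    return np.trapz(source * film_trans * sensor_red, wl)

print(red_signal(source_even), red_signal(source_hole))
# The source with the hole yields only a fraction of the signal, so the same
# software gain mostly amplifies noise instead of recovering information.
```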

OK. It's your time and money to waste.

 

And if you can get a set of LEDs to melt that - redundant - lens array, then you're pumping way, way too much current into them.

 

Read the previous posts in this thread. The mods have already been tried and the replacement LEDs gave out too much light. Which means the current needs cutting down; thereby reducing the heat dissipation.

 

If you spread the light source by using a larger number of LEDs, you can throw away that cobbled 3-lens array. Outputting the original amount of light from more LEDs will not increase the heat generated, since light output and heat generated are both roughly proportional to the current through an LED, and each LED needs less current in inverse proportion to the number of LEDs: e.g. 3 LEDs running at 50 mA each = 6 LEDs @ 25 mA each = 12 LEDs @ 12.5 mA, etc. What's more, heat reduces the efficiency of an LED, so you'll actually get more light output for the same power dissipation by using a higher number of LEDs.
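A quick sanity check of that arithmetic, assuming a nominal forward voltage of about 3 V per white LED (an illustrative figure, not a measurement from this board): spreading the same total current across more LEDs keeps the total power roughly constant while each LED runs cooler.

```python
V_F = 3.0  # assumed forward voltage per white LED, volts (illustrative)

for n_leds, i_per_led_ma in [(3, 50.0), (6, 25.0), (12, 12.5)]:
    total_ma = n_leds * i_per_led_ma
    total_w = V_F * total_ma / 1000.0
    print(f"{n_leds:2d} LEDs @ {i_per_led_ma:4.1f} mA -> "
          f"{total_ma:.0f} mA, ~{total_w:.2f} W in total")

# Every case comes out at 150 mA and ~0.45 W overall; only the per-LED
# share of that dissipation (and therefore its temperature) goes down.
```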

 

Download a couple of datasheets for the linear CCD sensors used in those old film scanners. You'll see that the RGB sensitivity is far from equal across the three channels. So it's really a waste of time ensuring that your LEDs have equal output at the RGB filter peaks. Software balancing is definitely needed, and will certainly have been incorporated in Minolta's original software.

 

Also, the film's CMY transmission isn't irrelevant. The magenta dye used is far from perfect, and blocks a lot of blue light as well as green.


  • 7 months later...
Hi everyone, I think I've fixed it. I have the Minolta DiMAGE Scan Elite 5400 II with severe banding when using Vuescan. The banding was consistent with the location of the grooves in the three "half circle" lenses. I tried the original Minolta software and there was no banding. I opened up the scanner, got to the mirrors and lights and everything in there (open-heart surgery) and cleaned everything, but the banding was still there. So it seems the Minolta software subtracts out uneven lighting during calibration.

What you have to do is explicitly run the Scanner --> Calibrate step in Vuescan after opening the software. I thought that the little dance the scanner does when the software fires up was the calibration, but it's not. You still have to go to the menu and calibrate. I'm very relieved and hope this trivial little step will help you.
