
Minolta Elite 5400 - Is it a "2700 dpi" scanner?



After the 'Exposure' and 'Focus, GD, IR cleaning' threads, I wanted to conclude with a more general thread (speed, memory usage, interface...) on my Elite 5400, but I got results that bother me.

I made several scans of the same color negative image with Minolta's software, changing one parameter at a time: GD, ICE, scanned as 'Negative' or 'Positive with max exposure', and resolution. That gave me 19 realistic variations, with timings, CPU usage and memory usage for each.

What bothers me is that the scanner seems to deliver the same amount of data at 5400dpi and at 2700dpi...

 

Scanned as 'Color Positive', no GD or ICE, 16 bits (same exposure, no image correction):

 

- at 1350dpi = 1min 19s (CPU usage while scanning = 22%)

- at 2700dpi = 2min 37s (CPU usage while scanning = 44%)

- at 5400dpi = 2min 37s (CPU usage while scanning = 44%)

 

Whereas the step from 1350dpi to 2700dpi follows some logic (time x2 and CPU x2 => data x4: OK), the step from 2700dpi to 5400dpi does not (same time, same CPU usage!!!). I get the same kind of results when scanning as 'Negative', and with GD or with ICE (although I cannot measure CPU usage with ICE).
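
To make the puzzle concrete, here is a rough sanity check in Python. It only restates the measurements above; using 'time x CPU%' as a proxy for the amount of data actually processed is my own assumption:

    # Measurements from above: scan time in seconds, CPU % while scanning.
    measured = {
        1350: (79, 22),    # 1 min 19 s at 22%
        2700: (157, 44),   # 2 min 37 s at 44%
        5400: (157, 44),   # 2 min 37 s at 44% -- the surprising one
    }

    base_time, base_cpu = measured[1350]
    for dpi, (t, cpu) in measured.items():
        pixels = (dpi / 1350) ** 2                  # relative pixel count
        work = (t * cpu) / (base_time * base_cpu)   # relative time x CPU "work"
        print(f"{dpi} dpi: {pixels:4.0f}x the pixels, {work:3.1f}x the apparent work")

At 5400dpi the pixel count is 16x, yet the apparent work is still only 4x: that is exactly what bothers me.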

 

I am starting to believe that my Elite 5400 is indeed a 2700dpi scanner???

Any ideas?

 

Olivier


Don't try to run algorithm analysis techniques on a sample set this small. It doesn't work. There are too many factors here that you don't, and probably can't, record. Besides, you're at the limits of the recordable data; of COURSE there are going to be nifty glitches and steppings. Welcome to the wonderful world of engineering, where you may not assume that the horses are spheres.

- 5400dpi files are 4X bigger than 2700dpi files. Well... hopefully :) But you can easily just 'resize' a 2700dpi file to get the same file size. The scanner may also do this... (I didn't say that it actually did).

 

- I resized the 2700dpi scan to the same size (bicubic) and increased sharpening until I got the same visual impression of sharpness. From my point of view, there is more detail in the original 5400dpi scan. I am actually looking at the detail of the grain itself to check this, because my lens + technique are rather on the soft side (Sigma 18-35, handheld...). Maybe a better resizing/sharpening technique would improve the 2700dpi result, but probably not up to the level of the 5400dpi one.

 

BOTTOM LINE: I believe that I am looking at REAL '5400dpi' SCANS (and that Minolta wouldn't play that kind of game).

 

But, Steven, I also want to understand what's going on inside this scanner, and I believe that one sample is enough... at least to try to build a theory. More experiments are then welcome to support it.

 

My idea is that the Elite 5400 manages what the CCD captures in a more complex way than we would expect... I'll try to explain in my next post.


So here is a possible explanation:

 

The Elite 5400 has 3 CCD lines (R, G and B) of 5300 cells each. It can move the film in 7800 steps, giving a maximum image of 5232 x 7800 pixels (5400dpi).

 

The exposure target for each individual cell of the CCD should remain the same => i.e., for the same image, exposure time is proportional to the number of steps in the scan.

 

Under those conditions, CPU usage is tied to the quantity of data transferred by the scanner per second => i.e., CPU % is proportional to the number of cells used in the CCD lines for the scan (at low dpi this rule is less accurate because of other data/processes).

 

So when I look at the timings and CPU Usage, here is what I get:

 

At 5400dpi, the scanner uses all of its 5300 cells (x3 RGB) and makes its 7800 steps. CPU = C0 (44%) / Time = T0 (2min 37s).

At 2700dpi, the scanner still uses all 5300 cells (x3 RGB) and still makes 7800 steps. CPU = C0 / Time = T0.

At 1350dpi, the scanner uses 2650 cells (x3 RGB) and makes 3900 steps. CPU = C0/2 / Time = T0/2.

At 675dpi, the scanner uses 1325 cells (x3 RGB) and makes 1950 steps. CPU = C0/4 (7% in the test, in 8 bits => equivalent to 14% in 16 bits) / Time = T0/4 (43s in the test).
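
If this model is right, CPU and time should scale directly with the cells and steps used. A minimal sketch of that prediction (the cell/step counts are the ones I assumed above; this is only my model, not anything from Minolta):

    # Proposed model: CPU % ~ cells read per step, time ~ number of steps.
    # Reference point: the full 5400dpi pass (5300 cells, 7800 steps, 44%, 157s).
    C0, T0 = 44, 157

    def predict(cells, steps):
        """Predicted CPU % and scan time (s), scaled from the full-resolution pass."""
        return C0 * cells / 5300, T0 * steps / 7800

    for label, cells, steps in [("5400dpi", 5300, 7800),
                                ("2700dpi", 5300, 7800),   # same hardware pass as 5400dpi
                                ("1350dpi", 2650, 3900),
                                (" 675dpi", 1325, 1950)]:
        cpu, t = predict(cells, steps)
        print(f"{label}: ~{cpu:.0f}% CPU, ~{t:.0f}s")

These predictions line up reasonably well with the timings I measured above.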

 

So what ?

 

Well, it means that for 'Positive' scans at 2700, 1350 and 675dpi, the Elite 5400 averages the values of 4 cells to get 1 pixel in the image. You get lower noise (maybe equivalent to 4x multi-pass) and a better transcription of the data in the slide (from 4 points, not just 1).
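
To illustrate what averaging 4 cells into 1 pixel does to random noise, here is a toy numpy simulation (purely illustrative, with a made-up noise level; it only shows the expected ~2x reduction from a 2x2 average):

    import numpy as np

    rng = np.random.default_rng(0)

    # A flat grey patch as it might come off the CCD at the real (higher) resolution,
    # with some random sensor noise added.
    patch = 0.5 + rng.normal(0.0, 0.02, size=(200, 200))

    # 2x2 box average: 4 cells -> 1 output pixel, as the theory suggests.
    h, w = patch.shape
    averaged = patch.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    print(f"noise before 2x2 averaging: {patch.std():.4f}")
    print(f"noise after 2x2 averaging : {averaged.std():.4f}")  # roughly halved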

 

I know that the title of this thread is not related to this theory, but really, this is my real-time testing and thinking...

 

Bruce, I know that I am in the 'measurbator' category, and sometimes a rather proud member :) . That's life...

 

But as for me, I have just discovered that I don't need to scan at 5400dpi and resize to 2700dpi afterwards (given the technical softness of my pictures, 5400dpi is not that useful). This will really save some time on my upcoming hundreds (thousands) of scans...


More findings...

 

This theory seems to hold at 2700dpi in all modes ('Pos'/'Neg'; with/without GD; with/without ICE) => you probably don't need multi-pass at 2700dpi...

 

When you use ICE at 675dpi (POS and NEG), you get the same timings as at 1350dpi. Thus you get a 4x4 = 16 averaging. Why? Because it has been reported that ICE produces artefacts at very low dpi, and Minolta found this solution...

 

I cannot confirm the theory for 'Negatives' at 1350 and 675dpi, because the timings and CPU usage are less clearly related there.

 

From an exposure point of view: when you use GD you increase exposure time by 4.65x (absorption...); when you use GD+ICE you increase exposure time by 9.3x (= 2 x 4.65), for the 2 scans (RGB & IR) done in 1 pass.
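
Put another way, the penalties simply multiply. The arithmetic (the 4.65 factor is the one I measured; the x2 for ICE is the extra IR read in the same pass):

    GD_ABSORPTION = 4.65    # GD's diffuser absorbs light, so exposure takes 4.65x longer
    ICE_IR_READ = 2.0       # ICE adds an IR read of the same duration, in the same pass

    print("GD alone :", GD_ABSORPTION)                  # 4.65x
    print("GD + ICE :", GD_ABSORPTION * ICE_IR_READ)    # 9.3x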

 

I will also check how Vuescan deals with all this... (Yeah... I know, I am measurbating... But tomorrow I stop... maybe... :)


OK, Vuescan gives you exactly what you ask for: '5400dpi' uses the 5300 cells (x3 RGB) and 7800 steps; '2700dpi' uses 2650 cells (x3 RGB) and 3900 steps (half the time, nearly half the CPU). It does not do any averaging/oversampling...

 

I have to say that I much prefer this kind of interface (Vuescan) to the 'don't worry, we know what's best for you' approach of Minolta...

 

However, for my high-res scans it does not make much of a difference: I was ready to scan at 5400dpi and resize to 2700dpi. Now I know that with Minolta's software I just have to select '2700dpi' and I will get the same result. And for my 'excellent ones' I will switch to 5400dpi with multi-scanning, which Minolta can do as well.

 

My only concern with Minolta's software is that I would like to make low-res scans (650dpi) with ICE (no GD) for indexing my photos. And here Minolta gets it completely wrong: it forces GD and 16x averaging. Right when I need productivity, I get a scan 18.6x longer than what I should get (Vuescan). Please Minolta: let us decide what is good for us, and just add detailed explanations to your manual...
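
For what it's worth, the 18.6x figure decomposes neatly under the theory above (this is my own reconstruction, not anything official): ICE forces a real 2700dpi pass instead of the real 675dpi pass Vuescan would do, GD stretches exposure by 4.65x, and the extra IR read applies to both cases so it cancels out of the ratio.

    steps_forced = 3900    # real 2700dpi pass that Minolta's software forces with ICE
    steps_needed = 975     # real 675dpi pass that would be enough (what Vuescan does)
    gd_penalty = 4.65      # GD is forced on as well, stretching the exposure

    print(round((steps_forced / steps_needed) * gd_penalty, 2))   # -> 18.6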

 

Olivier


My 5400 shows a significant increase in resolution and detail as well as a decrease in pixel size as the resolution setting is changed from 2700ppi to 5400ppi. File size increases accordingly. The motivation for your post is rather unclear to me.

Here are some images. I corrected the colors to get the same visual aspect, plus a gamma of x0.45 (2.2 => 1.0 = linear) for the negs (in 16 bits). Please note: those areas would be very dark on a normal print, and the tree would be black with almost no visible noise. This is 'heavy' editing...

 

1: Vuescan as 'color negative' scanned at 1350dpi (no clipping, auto-white 0..1) Single-Scan.

 

2: Vuescan as 'color negative' scanned at 1350dpi (no clipping, auto-white 0..1), Single-Pass Multi-Scan (4x). You get slightly more detail, but it seems that noise is not reduced.

 

3: Minolta as 'color negative' at '1350dpi' (= real 2700dpi / 2), no correction, Single-Scan. You can clearly see that the deep shadows of the tree have been clipped. If you look at the shadow of the blue window on the right, you will notice that noise has been reduced. You also get slightly more detail.

 

4: Vuescan as 'color negative' scanned at 2700dpi then reduced by 2 in both directions (no clipping, auto-white 0..1), Single-Scan. This picture looks very similar to the Minolta one (3), except that the deep shadows have not been clipped (you can still see the details of the tree). Noise is smoother than on 1 and 2.

 

5: Minolta as 'color positive' at '1350dpi' (= real 2700dpi / 2), exposure maximized without clipping, no correction, Single-Scan, linear mode. Compared to (4), noise has been reduced further. You can read more about it in this thread: http://www.photo.net/bboard/q-and-a-fetch-msg?msg_id=006E3o

 

- - -

 

From those results, I conclude that single-pass multi-scan is useless, and that oversampling (scanning at a higher resolution and reducing the size afterwards) is much more efficient (well... if you want 5400dpi output, you will not be able to do so... :). I don't like the way Minolta's 'Negative' mode clips deep shadows, but this will not be a problem for most of my pictures; for difficult editing I will use the max 'Positive' method. Vuescan does great here.

 

Comparing (3) and (4), I also believe those pictures support the theory explained above (i.e., in Minolta's software, '675dpi', '1350dpi' and '2700dpi' scans are performed one resolution level higher and then downsized by 2 in both directions).

 

Olivier



It seems that a lot of people (cf. newsgroups) think that a faster CPU (4GHz???) would speed up scanning times a lot with the Elite 5400, especially with GD & ICE. The answer is NO!

 

If you have a 2600MHz CPU, you can already get the quickest scan at 5400dpi 16 bits (i.e., 'Positive', no ICE, no GD). All other situations are much less CPU intensive (including the GD & ICE scenario), and I believe a 1500MHz CPU is all you need to get the fastest 5400dpi 16-bit scan with GD & ICE. So why does it take so long? Because exposure becomes the bottleneck when GD is activated, and your 10GHz dual CPUs will not help.

 

For people who want more detailed information, here it is (Minolta's software only, over FireWire):

 

In all modes ('Color Positive', 'Color Negative', 'B&W Positive', 'B&W Negative'), but without GD or ICE, dpi settings work this way: '337dpi' (the minimum) gets you a real 675dpi scan resized to the '337dpi' equivalent. From '338' to '675' you get a real 1350dpi scan resized to what you asked for. From '676' to '1350' you get a real 2700dpi scan, resized. And from '1351' to '5400' you get a real 5400dpi scan, resized.
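
As a compact way to express it, here is the mapping in Python (my reading of the behaviour above: the real capture is always at least twice what you ask for, capped at the 5400dpi hardware maximum):

    def real_scan_dpi(requested):
        """Real capture resolution used for a requested output dpi (no GD/ICE)."""
        for real in (675, 1350, 2700, 5400):
            if real >= 2 * requested:
                return real
        return 5400   # anything above 2700 is captured at the hardware maximum

    for asked in (337, 675, 1350, 2700, 5400):
        print(f"asked {asked:>4}dpi -> real {real_scan_dpi(asked)}dpi scan")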

 

CPU usage is proportional to the number of CCD cells used for scanning (i.e., the dpi of the real scan along the height [24mm]) AND the selected bit depth (16 bits = 2x the CPU usage of 8 bits).

 

Exposure time for the scan is proportional to the number of steps used for scanning (i.e., the dpi of the real scan along the width [36mm]) and to the exposure chosen.

 

If your CPU cannot cope with the data flow, the scanner will wait after exposure => scanning time becomes longer than the normal exposure time.

 

I noticed that my Athlon 2500+ (Barton) runs at 100% and falls a bit short of the maximum speed of 60s for a Positive at 5400dpi, 16 bits.

 

So the MHz you need is roughly 2600MHz x (RealScanDpi_width / 5400) x (BitDepth / 16) for a Positive (default exposure). If you scan negatives, you may reduce this by about 30% (depending on density & exposure). Example: you will get the fastest time for a '2700dpi' scan of a negative at 8 bits with about 900MHz.
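
Here is the same rule of thumb written out, with the example above (the 2600MHz reference point and the ~30% negative discount are just my estimates; the 'real scan dpi' is the capture resolution described earlier, e.g. a '2700dpi' request is a real 5400dpi pass):

    def cpu_mhz_needed(real_scan_dpi, bit_depth, negative=False):
        """Rough CPU speed (MHz) needed to keep up with the scanner's data flow."""
        mhz = 2600 * (real_scan_dpi / 5400) * (bit_depth / 16)
        return 0.7 * mhz if negative else mhz     # ~30% less for negatives

    # '2700dpi' request = real 5400dpi pass; 8 bits; negative:
    print(round(cpu_mhz_needed(5400, 8, negative=True)))   # -> 910, i.e. the ~900MHz above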

 

When GD is activated, exposure time takes 4.65x longer (cf. absorption), so CPU requirements are divided by 4.65 (same example: you don't need more than about 200MHz for the fastest '2700dpi' 8-bit scan of a negative with GD).

 

When ICE is activated, things get a bit different. First, the minimum real dpi is set to 2700. Second, ICE takes some CPU power. It is very difficult to estimate how much, because the process reports "100%" automatically (my method: I had 2 programs running at high priority taking about 55% of the CPU, then started the scan at 5400dpi 16 bits with GD & ICE, and the process got 60% of the CPU with the same scanning time) => 5400dpi at 16 bits with GD & ICE needs about 1500MHz for the fastest results; 1000MHz should be fine for negatives. And compared to GD alone, the scanner makes a second scan (time x2) for the IR channel.

 

Take these findings with a grain of salt, and if you get timings/CPU figures that are completely different from this, please tell me :)

 

Additional notes:

 

USB2 may be slightly less CPU efficient than FireWire.

 

Vuescan seems to be a bit more CPU efficient for the same output. So it may be more productive, in particular on very slow computers: you can scan directly to RAW and do all the post-processing later (at night?). It will not change much on faster computers, as exposure will be the bottleneck (and don't forget that a '2700dpi' scan in Minolta's software is really a real 5400dpi scan...).

 

I said that I didn't find any improvement in noise (for color negatives) with single-pass multi-scanning. I am still puzzled... However, it reminds me of the noise reduction systems in digital cameras, where an additional shot is taken with the shutter closed to record the CCD noise and subtract it from the real, longer shot, i.e., the noise in the longer exposure keeps the same 'pattern'. This may explain why an additional immediate scan would not reduce noise...

 

RAM requirements: when you scan, Minolta's software takes the RAM needed for the output (i.e., 233MB for '5400dpi' 16 bits / 58.4MB for '2700dpi' 16 bits...). But editing a file requires a lot more (3x the image size is considered a minimum; 5x is better). For example: I use 2700dpi 16-bit images and scan while editing = 160MB (for the OS & programs) + 60MB (scan) + 5x60MB (editing in PWP) = 520MB. I am OK with my 512MB, using a bit of swap file. If I decide to go for 5400dpi 16 bits, I will need 160 + 240 + 5x240 = 1600MB => 1.5GB plus a bit of swap...
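
The same budget as a small helper, using the rule of thumb above (3x the image size is a minimum for editing, 5x is more comfortable; the 160MB for OS and programs is just my own figure):

    def ram_needed_mb(scan_mb, edit_factor=5, os_and_apps_mb=160):
        """Rough RAM budget for scanning while editing an image of the same size."""
        return os_and_apps_mb + scan_mb + edit_factor * scan_mb

    print(ram_needed_mb(60))    # '2700dpi' 16 bits -> 520 MB
    print(ram_needed_mb(240))   # '5400dpi' 16 bits -> 1600 MB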

 

A last word: I like Minolta's software for its ICE quality. However, I would like more control over the scanner: less clipping of negatives, the possibility to use ICE without GD, and total control over the oversampling in x and y (example 1: scan at 2700dpi x2 in height and 2700dpi in width to get a '2700dpi' scan with 2x averaging at twice the usual '2700dpi' speed; example 2: scan at 675dpi x4 in height and 675dpi in width with ICE (no GD) to get a clean index scan in 30s instead of 6min now :( ). I think I will send some feedback to Minolta to see if those improvements can appear in a 1.1.2 version soon. Please give your own suggestions, or support mine :)

 

Olivier


Update: I noticed some changes with the new version of Minolta's software (1.1.2).

 

The software still oversamples by 2x2 for the lower resolutions (337-2700dpi), but exposure time has been slightly decreased for '5400' and '2700' scans, and more than halved for '1350' and lower. I suppose Minolta did not increase the lamp power, but decided to lower the real exposure level of the CCD (this makes my CPU/time comparisons more complicated...).

 

So it has almost no effect on quality for '5400' and '2700' scans, whereas the smaller scans get much quicker (and quality decreases).

 

This makes sense (at least for me) => Good news.

 

Olivier

