
Hello. I started to investigate film negative scanning about 10 years ago and have now developed a workflow that is really fast but at the same time technically correct. This workflow works with linear TIFF files from film scanners as well as with DSLR reproduction setups.

So let's start with quick tips. In short, the workflow looks like this:

Debayer the RAW file and convert it from the camera input ICC profile to a custom-made ProPhotoRGB ICC profile with L* gamma -> Invert -> Apply per-channel RGB AutoLevels -> Recover the data clipped by RGB AutoLevels -> Adjust tonal curve and contrast.
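
For readers who want to see the numeric logic of these steps, here is a minimal numpy sketch of the Invert -> per-channel AutoLevels part, assuming the image is already a float array in the L* gamma working profile (the helper names invert and auto_levels are my own, not from any of the apps mentioned below):

```python
import numpy as np

def invert(img):
    # Negative -> positive: flip each channel around the 0..1 encoding range.
    return 1.0 - img

def auto_levels(img, clip=0.001):
    # Per-channel AutoLevels: stretch each channel so that the clip
    # percentiles land on 0.0 and 1.0. Values outside that range end up
    # below 0 or above 1 - in float they are kept, not destroyed.
    out = np.empty_like(img)
    for c in range(img.shape[2]):
        lo = np.quantile(img[..., c], clip)
        hi = np.quantile(img[..., c], 1.0 - clip)
        out[..., c] = (img[..., c] - lo) / (hi - lo)
    return out

# img: float32 array (H, W, 3) in the L* gamma working space, values 0..1
img = np.random.rand(32, 32, 3).astype(np.float32)  # stand-in for a real scan
positive = auto_levels(invert(img), clip=0.001)
```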

 

1. White balance the camera to your light source and don't change the WB in the RAW file after that. You may notice that this is the first essential task that every real scanner does when you turn it on: it calibrates the sensor WB to the light source.

When you start to adjust WB in RAW or attempt to pick the orange mask in RAW, this causes a shift in the input profile and colors become incorrect after Invert. Colors will shift a lot depending on the color space and gamma you use.

The same goes for blue filters applied after WB was set. Those filters may slightly amplify the blue channel, but they will also change the original ratio between colors. I described this problem in detail here: negadoctor invert mistake · Issue #6770 · darktable-org/darktable
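
To illustrate why a per-channel gain applied before the invert changes the relations between colors, here is a tiny numeric sketch (the simple 1 - x inversion is used purely for illustration and is not the exact math of any particular app):

```python
import numpy as np

# Two sample pixels, blue channel only, values from a negative scan (0..1).
blue = np.array([0.20, 0.60])
gain = 1.2  # a WB-style / "blue filter" boost applied to the channel

# Case A: boost the channel first, then invert.
a = 1.0 - blue * gain          # [0.76, 0.28]

# Case B: invert first, then apply the same boost.
b = (1.0 - blue) * gain        # [0.96, 0.48]

# In case B the boost cancels out of the ratio between the two pixels,
# so the original relation (0.80 / 0.40 = 2.0) is preserved.
# In case A the relation changes (0.76 / 0.28 ~ 2.71).
print(a[0] / a[1], b[0] / b[1])
```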

 

2. Do the invert and the per-channel RGB AutoLevels in an L* gamma working profile. To generate working ICC profiles with L* gamma, use the Synthetic ICC app from the DisplayCAL (Display Calibration and Characterization powered by ArgyllCMS) package, or use the RawTherapee ICC Profile Creator, or just use Elle Stone's Well-Behaved ICC Profiles (L* is named "labl" there): ellelstone/elles_icc_profiles.

Most free as well as commercial tools usually do the invert with the basic sRGB tonal curve. This causes huge shifts in tonality and requires aggressive manual curve adjustments with too many non-technical "guesses" and "fixes". The working gamma strongly affects the highlight rolloff, so film scans inverted in sRGB gamma start to look "too digital". Also, in sRGB gamma Auto Levels may produce a very different result depending on whether it is applied before or after the invert.

Don't forget to convert the final image from L* gamma to sRGB gamma for final 8-bit web delivery.
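
For reference, the tone curve used by such "L* gamma" profiles is the standard CIE L* (lightness) function. Here is a minimal sketch of the encode/decode pair using the standard CIE constants; this is only an illustration, the tools above generate the actual profiles for you:

```python
import numpy as np

# CIE constants (exact rational forms).
EPSILON = 216.0 / 24389.0   # ~0.008856
KAPPA = 24389.0 / 27.0      # ~903.3

def lstar_encode(y):
    """Linear relative luminance (0..1) -> L* encoding normalized to 0..1."""
    y = np.asarray(y, dtype=np.float64)
    lstar = np.where(y > EPSILON, 116.0 * np.cbrt(y) - 16.0, KAPPA * y)
    return lstar / 100.0

def lstar_decode(v):
    """L* encoding (0..1) -> linear relative luminance (0..1)."""
    lstar = np.asarray(v, dtype=np.float64) * 100.0
    return np.where(lstar > KAPPA * EPSILON,
                    ((lstar + 16.0) / 116.0) ** 3,
                    lstar / KAPPA)

# Middle grey (18% linear) lands near the middle of the L* encoding:
print(lstar_encode(0.18))   # ~0.495
```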

 

3. Optionally, calibrate your camera or scanner. To create a custom input ICC profile, use an IT8.7 scanner calibration target like these Affordable IT 8.7 (ISO 12641) Scanner Color Calibration Targets, and use the Rough Profiler app http://www.jpereira.net/descargar-rough-profiler-software-pefil-color-icc-free to generate a Lab cLUT based ICC profile. This step is more or less optional for DSLR cameras, because in most cases developers provide reasonably good input profiles for them.

Manual calibration also helps a lot to fix problems with low-quality LED backlight sources.

 

4. For the best possible color in a single click you also need:

- A quality, uniform light source.

5500K-6500K high-CRI LEDs should be OK. Halogen lamps combined with 80A filters may work, but produce a huge amount of IR pollution. Digital sensors are sensitive to IR light, so final images may have a brownish-magenta tint that can affect RGB AutoLevels and produce incorrect color. (This is a known problem with digital cameras, but I didn't test it in real life with film scans.)

- A custom-made input ICC profile always makes things magically look better. Film negative scans start to look and feel as if they were slightly mixed with slide film colors. Make sure you generate a Lab cLUT type of ICC profile, not a basic simplified matrix type of ICC profile.

- Avoid light leaks, lens flares or reflections.

- Scratches and dust on the film may affect RGB AutoLevels and so produce incorrect color, so make sure you inspect and fix all artifacts.

- Without proper white balancing of the RAW file to the light source there will be shifts or clipping in colors.

- During camera scanning don't clip the red channel in the orange mask. Clipping will destroy the original relations between colors. Expose to the right, but always keep some free empty space at the right side of the histogram.

- Export images from RAW processors in ProPhotoRGB or BestRGB color spaces.

BestRGB fits the native film color space well, but may slightly clip in some areas and so usually produces less saturated, dimmer colors.

ProPhotoRGB is slightly larger than the native film color space and will fit all possible data. It usually produces a well-saturated look and slightly more vivid blue colors.

Rec2020 and ACEScg are also more or less OK in size, but these color spaces use 6500K and 6000K white points instead of the 5000K commonly used in image editing color spaces. Also, these color spaces are mostly used in video editors and are less well supported in image editing apps.

Do not use sRGB or AdobeRGB as working color spaces, because they are far smaller than the native film color space. They will clip a lot of colors (reds will always look too orange, and yellows become too brown).

Do not use the ACES color space. It is too huge compared to the native film color space and will produce shifts in some colors (too-vivid blues and too-magenta reds).

 

Here is an example of an auto-processed test RAW image.

(source file was shared by ColSebastianMoran on another forum)

8CGJOLV.jpg

 

And here is a side-by-side comparison based on another DSLR RAW sample from another forum. Only Invert and Auto Levels were applied here. As you can see, L* gamma produces the most realistic and symmetrical tonality and usually requires a very small amount of additional manual correction. It also produces the best separation between bright skin tones and extreme highlights.

 

1sBC9aA.jpg

 

And here are some examples of processing in 16 vs 32 bit:

This is how Output Levels work in 16 bit (clipped data is lost):

MNgcjwY.gif

 

This is how Output Levels work in 32 bit (clipped data is recovered):

QPqWqZT.gif
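
The difference shown in these animations can be reproduced with a few lines of numpy. This is only a conceptual sketch, assuming a levels step that maps [black, white] to [0, 1] and comparing a 32 bit float pipeline with a 16 bit one that clips to [0, 1] at every step (the helper names levels and output_levels are made up for illustration):

```python
import numpy as np

def levels(img, black, white):
    # Input levels: map [black, white] onto [0, 1]. Values outside the
    # range end up below 0 or above 1.
    return (img - black) / (white - black)

def output_levels(img, out_black, out_white):
    # Output levels: squeeze [0, 1] back into [out_black, out_white].
    return img * (out_white - out_black) + out_black

src = np.linspace(0.0, 1.0, 11, dtype=np.float32)   # stand-in tonal ramp

# 32 bit float path: values pushed outside 0..1 by the aggressive levels
# step survive, so squeezing the range back with Output Levels restores them.
f32 = levels(src, 0.1, 0.9)
f32_recovered = output_levels(f32, 0.1, 0.9)         # back to the original ramp

# 16 bit integer path: the intermediate result is stored clipped to 0..1,
# so the clipped shadows and highlights stay flattened after Output Levels.
i16 = np.clip(levels(src, 0.1, 0.9), 0.0, 1.0)
i16_recovered = output_levels(i16, 0.1, 0.9)

print(np.round(f32_recovered, 3))
print(np.round(i16_recovered, 3))
```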


And here is my step-by-step workflow:

 

Calibration:

- To create a custom input ICC profile, use an IT8.7 scanner calibration target: Affordable IT 8.7 (ISO 12641) Scanner Color Calibration Targets

- Use the Rough Profiler app (ArgyllCMS GUI, software perfiles color ICC IT8 colorchecker, fotografia cientifica - Jose Pereira) to generate a Lab cLUT based ICC profile.

 

Scanning:

- Scan film negatives in positive mode to 16 bit with linear gamma. Scanner color management should be turned off.

- Use manual exposure.

- Use scanner autofocus on a manually selected point.

- Do NOT remove the orange mask during scanning with custom gain adjustments for each channel. This is the same as adjusting WB in RAW and will cause the incorrect input color space problem described here: negadoctor invert mistake · Issue #6770 · darktable-org/darktable

 

DSLR RAW scan processing:

I recommend using Iridient Developer (Iridient Digital). It can disable the built-in camera factory correction curve and export an untouched TIFF with linear gamma, the same as scanners do. It can also do internal color management and export the TIFF to any ICC profile that you select. The debayer and sharpness options are also very nice.

Instead of Iridient Developer you could probably use MakeTIFF, export to linear gamma and then convert to L* gamma in PhotoLine. But I have no idea how MakeTIFF deals with color spaces. If it only transforms gamma and skips color spaces, you probably need to find a proper input ICC profile for your camera or build your own input ICC profile based on an IT8.7 target scan.

 

- Open the RAW file in Iridient Developer.

- Remove the built-in camera contrast curve.

- Do not use a large amount of chroma noise reduction, because it may change some colors, and even small changes in source colors may produce large color shifts later with AutoLevels.

- Debayer and convert from the camera input ICC profile to the L* gamma ProPhotoRGB working ICC profile.

 

Development (can be recorded as a one-click action):

I recommend the PhotoLine image editor because it is probably the only existing graphics editor that allows you to manipulate 8 or 16 bit images inside a 32 bit depth project (similar to DaVinci Resolve). Also, its Curves and Levels tools are way better than in any other app I have ever used. PhotoLine can also read RAW files and disable the built-in camera factory correction curve, but its debayer and sharpness options are not perfect yet.

It is always better to download the latest beta version from the forum (which is formally always a release with bugfixes) here: Betatester unter sich - PhotoLine Forum. It is developed by a small team and the original UI icons are not too pretty, so I can recommend my specially designed UI icons theme, available for free here: PhotoLine UI Icons Customization Project by shijan on DeviantArt

 

- Open the file in PhotoLine.

- Rotate (optional)

- Switch to "Document mode"

- Set the document to 32 bit depth

- If the image was already transformed to the ProPhotoRGB with L* gamma ICC profile, the document should automatically use the same ICC profile as well.

(If the image was in linear gamma, you need to manually assign the ProPhotoRGB with L* gamma ICC profile to the document and the camera input ICC profile with linear gamma to the image layer. This will provide color management from the input ICC profile with linear gamma to the working ICC profile with L* gamma. In Document mode, PhotoLine allows non-destructive (non-baked) color management.

L* gamma is the core of this workflow. L* gamma is the only way to invert a negative without color shifts while preserving the original tonality. 32 bit depth allows quick and lossless recovery of clipped color data during editing.

To generate ICC profiles with L* gamma, use the Synthetic ICC app from the DisplayCAL (Display Calibration and Characterization powered by ArgyllCMS) package, or use the RawTherapee ICC Profile Creator, or just use Elle Stone's Well-Behaved ICC Profiles: ellelstone/elles_icc_profiles)

- Crop the black frame (the original pixels are not deleted here, so you can always recover the original frame later if needed), or just select an area without the black frame.

- Add a Curves adjustment layer and do the invert.

- Add a Levels adjustment layer and use the "Auto Mode for All Channels" auto correction to crop the huge amount of empty space from the RGB channels (you can experiment here with the clipping level settings to find the best variation of AutoLevels colors).

 

gge3OCu.jpg

 

- Turn on "Mark Extreme values" to see clipped highlights and shadows.

- In the same Levels tool, adjust Output Levels to recover the clipped dynamic range in highlights and shadows.

(Do NOT touch the Gamma slider at this step, because this will damage the original relations between colors. Do not attempt to use Output Levels like this in apps that process the image only in 16 bit. Recovery of clipped color data is only possible in apps that can operate in 32 bit depth.)

 

JtqZTYn.jpg

 

- Add "Exposure" adjustment layer and adjust Luminance.

(Do NOT use the Luminance slider in the Curves tool. By mistake it has the same name as Luminance in the Exposure tool, but it actually adjusts the image in a very different way. The problem was described here: Large List of Suggestions and Requests for PhotoLine UI - Seite 4 - PhotoLine Forum)

 

Uo5hYhr.jpg

 

- Add "White Point" adjustment layer to fine tune color temperature. Select in settings "Setup Grey Point" and "Fix White Point" and pick some grey point somewhere in your photo.

 

ot0ekOy.jpg

 

- The image will look slightly low in contrast and slightly desaturated now, so you need to add another adjustment layer and adjust contrast with an S-curve, or use 1D LUTs with contrast presets as I usually do.

(Instead of separate Luminance and Contrast layers, some people prefer to use a single Curves layer to shape tonality and contrast together.)

 

xBseVvi.jpg

 

- Export to sRGB JPEG for web or to ProPhotoRGB TIFF for printing.

 

And to summarize, here is a quick video for Film Negative Invert in L* Gamma in PhotoLine:

 


The same L* gamma workflow is possible in Photoshop in 16 bit, but the AutoLevels adjustment will not be as quick and easy there. Also, there is no dedicated grey point tool in Photoshop, so you need to set it with Curves.

 

So here is how to deal with AutoLevels in Photoshop:

 

- Invert

- Add Levels

- Hold "Alt" key and press button "Auto" to open tool preferences.

- Select "Enhance Per Channel Contrast"

- Set the Shadows/Highlights Clip somewhere from 0.01 to 0.1 (you can experiment here with the clip level settings to find the best variation of colors).

- Select "Save as Defaults" so same settings will be applied next time when you simply click to button "Auto" without holding "Alt".

- Do NOT check "Snap Neutral Midtones" because it will change original relations between colors.

 

BSOgF0I.jpg

 

- Now you need to recover the clipped data. In Photoshop this can only be done by adjusting the Input Levels in each channel by the same amount.

(Do NOT touch "Gamma" or "Output Levels" sliders at this step, because this will damage original relations between colors)

 

xtNj4j5.jpg

WQCEMov.jpg

HOL8QwX.jpg

 

- Add a Curves layer and pick a grey point somewhere.

(This is same as "Snap Neutral Midtones" but you appy it in proper place and you can manually control it.)

 

VVxJ7sH.jpg

 

- Add another adjustment layer and adjust the tonal curve and contrast as you like (same as in the PhotoLine workflow).

 

2rI3ZUF.jpg

 

And to summarize, here is a quick video for Film Negative Invert in L* Gamma in Photoshop:

 


I also want to add some info about the so-called custom-made inversion log curves. I guess they first reached the film scanning masses in the early Photoshop era as the Film-Negative package by Timo Autiokari. That website was closed a long time ago, but the original page "Scanning Negative Film using a Flatbed scanner" is still archived here: Wayback Machine

To be honest, I still have no idea how exactly people generate those curves and what log curve they use as a reference, but formally this method is technically incorrect and produces some problems:

- Photoshop curves in .amp format are a very legacy feature and use low-resolution data. Formally, once applied, they produce a low-quality 8-bit starting point with further posterization and banding problems. It is very easy to see the gaps if you take a look at the histogram or at a high-resolution visualization (see the sketch below).

pBfv29w.jpg

- As you may know, Photoshop itself uses 15 bit internal processing instead of real 16 bit, so this may also cause some additional tonality data loss during these extreme transformations.

- An inverted log curve violates the normal ICC profile based color management inside apps.

- As you may know, contrast and saturation are linked, so in normal digital workflows increased contrast always produces increased saturation in the image. But when you use an inverted log curve and for some reason apply AutoLevels before the invert, the relation between contrast and saturation becomes the opposite, and you need to increase saturation manually with additional tools. Sometimes this flipped relation between contrast and saturation may produce interesting artistic effects.

Overall this method may have its place, but it just needs better-quality curves and tools.
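
To see the kind of gaps mentioned in the first point above, one can push a 16 bit ramp through a 256-entry (8 bit resolution) lookup curve and count how many distinct output levels survive. A minimal sketch; the 256-entry table here only stands in for a low-resolution curve and is not an actual .amp file:

```python
import numpy as np

# A smooth 16 bit tonal ramp: 65536 distinct input levels.
ramp16 = np.arange(65536, dtype=np.uint16)

# A low-resolution "curve": only 256 table entries (8 bit resolution),
# shaped as a plain identity ramp for simplicity.
lut8 = np.linspace(0, 65535, 256).astype(np.uint16)

# Apply the curve by indexing with the top 8 bits of each input value.
mapped = lut8[ramp16 >> 8]

print(np.unique(ramp16).size)   # 65536 distinct levels before
print(np.unique(mapped).size)   # 256 distinct levels after -> histogram gaps
```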

 

Here is also a comparative visualization of different inversion methods:

 

1qjffTj.jpg

S4BxUVf.jpg

VFd2TrW.jpg

tyW8vvq.jpg


Can you explain why a 32 bit workflow is needed, when no commonly available scanner has an A/D converter that outputs more than 16 bits, and most digital cameras are limited to a 14 bit A/D converter?

 

The idea of being able to 'add' bit depth somehow defies logic, unless interpolation is used to artificially fill in histogram gaps.

 

If you start off with a maximum of 65535 levels of brightness, how can you possibly add more? Sure, the internal calculation maths needs to be done to a higher precision than 16 bits, but I still don't see the need for a 32 bit 'space' or file capacity.

 

Nor why an optical means (filtration) of cancelling, or partially cancelling the orange/magenta mask should result in a colour imbalance. The whole point of the mask is to create a colour imbalance that compensates for weaknesses in the coupler dyes and sensitivity of the film. No digital or optical bias will change that.

 

The mask is the very reason why the RGB histograms don't align in the first place.

Edited by rodeo_joe|1

Too late to edit.

A glance at the published curves of any number of colour negative films shows that the total density variation should be no more than 3.3D, unless the film is improperly exposed or processed. This represents a contrast, or brightness range, of only 2000:1, which is well within the 65535:1 capacity of a 16 bit file, and allows for a discrimination of 32 tonal levels between the darkest parts of the negative (the highlights) when digitised to 16 bits.

 

Furthermore, if optical means - i.e. light-source colour - are used to remove the mask-induced density differences, then only a density variation of around 2.3D (~200:1) in the individual RGB channels need be accommodated digitally.
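
For reference, here is the simple arithmetic behind these figures (an optical density D corresponds to a brightness ratio of 10^D):

```python
# Density-to-contrast arithmetic behind the figures above.
full_range = 10 ** 3.3       # ~1995:1 brightness range for a 3.3D negative
per_channel = 10 ** 2.3      # ~200:1 per channel once the mask offset is removed
codes_16bit = 65536          # code values available in a 16 bit file
print(round(full_range), round(per_channel), codes_16bit // round(full_range))
# -> 1995 200 32  (roughly 32 code values left for the densest part of the negative)
```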

 

This optical compensation is done automatically when chemically printing a colour negative; by a combination of enlarger light-source filtration, and by the inbuilt spectral sensitivity variation of the printing paper.

 

Surely all we're seeking to do is to emulate that opto-chemical process? Why then, is pre-inversion removal of the mask such an issue? Since it's exactly analogous to what happens when darkroom printing a colour negative.

Edited by rodeo_joe|1

Yes, most cameras and scanners are 14 bit. Processing in 32 bit just allows "clipless" processing of that 14 bit data. It just makes recovery of clipped data faster: you only need to adjust "Output Levels" and the clipped highlights and shadows magically return to the original.

In 16 bit you cannot adjust "Output Levels" in the same way, because instead of recovering the original image it will just bring back pure black and pure white solid colors. To recover clipped data in 16 bit apps, you need to adjust the "Input Levels" for each channel by the same amount, as described in the Photoshop example (post #3). This is just slower and requires more manual work.

 

There are GIF animations in the first post that show the difference between processing in 16 vs 32 bit.

 

This is how Output Levels work in 16 bit (clipped data is lost):

MNgcjwY.gif

 

This is how Output Levels work in 32 bit (clipped data is recovered):

QPqWqZT.gif

Edited by dmitry_shijan

In 16 bit you cannot adjust "Output Levels" in the same way, because instead of recovering the original image it will just bring back pure black and pure white colors.

No. Not unless the histograms are already clipped during scanning or digitising.

You cannot put back tonal information that wasn't captured in the first place. And in the same way, you cannot lose information simply by shifting its digital place location - multiplication or division by powers of 2. Not unless the data are carelessly handled.

 

Posterisation - loss of tonal discrimination - is a different matter and entirely different from clipping.

Edited by rodeo_joe|1

If something is already clipped during scanning or digitising, it can't be recovered. I didn't talk about some magical additional data recovery from source images. I am talking about existing data that is clipped by the AutoLevels tool during editing and then recovered. When you apply AutoLevels you always clip some data, because this is how AutoLevels works. In 32 bit you can bring back that original data in one click with the "Output Levels" slider. In 8 or 16 bit you need more manual work with the "Input Levels" sliders in each channel.

Please make sure you read the info before posting statements like these.


Those filters may slightly amplify the blue channel, but they will also change the original ratio between colors.

A filter removes things. It doesn't amplify what's passed through. Therefore an optical blue filter attenuates the red and green channels, leaving the blue largely untouched.

 

You have to think of a colour negative as a collection of B&W negatives, each taken through one of a tricolour set of RGB filters, but each having a complementary colour to the original filter. Therefore the contrast of each of those negatives can be adjusted by optical filtering. Get that filtering correct and the contrast of each negative is the same - meaning that their digitised histograms will align and little or no digital 'fiddling' will be required. Either before or after inversion.

 

Unfortunately, what a digital camera's white balance does, does not exactly imitate an optical filter. It changes the gain of the RGB channels, and after digitisation. Thereby increasing noise and posterisation in the boosted channel(s). Optical filtration of the light-source doesn't have that drawback.

I am talking about existing data that is clipped by the AutoLevels tool

That's what I mean by 'careless data handling'. Adjust the levels manually and you won't get them clipped by a dumb Auto-Levels algorithm. It has nothing to do with the number of bits; just poor handling of the ones you've got!

Edited by rodeo_joe|1

I can't discuss optical blue and CMY filters much yet, because I mostly work with files from Minolta film scanners.

 

The main problem is that it is very hard and very slow to adjust levels manually. AutoLevels has a "Clip" parameter in its settings, and in most cases, to detect the best colors, you need to increase the clipping to 0.1 or sometimes even more. As a result the colors are detected correctly, but at the same time some amount of highlights and shadows always becomes clipped.

I can really understand your point here. Editing in 32 bit in PhotoLine is just a unique option that allows some things to be done faster. There is no increase in file size, because a 16 bit source file inside a 32 bit PLD document stays 16 bit; only the tool processing is done in 32 bit.

 

Here is another, better example of how Output Levels work in 8 or 16 bit (clipped data is lost and can't be recovered with "Output Levels"):

4cxcGgV.gif

 

This is how Output Levels work in 32 bit (clipped data is recovered with "Output Levels"):

DnvtcjA.gif

Edited by dmitry_shijan

One little update. I just checked MakeTiff+ExifTool and see that it can't do proper color management and can't transform from the camera input ICC profile to the working ICC profile. It only outputs a TIFF in linear gamma as-is and provides a very strange and technically incorrect option to simply assign some basic ICC profile to the exported file. As expected, in this situation, without a proper camera input ICC profile, the colors are always off.

To use MakeTiff+ExifTool you need to manually create a Lab cLUT input ICC profile based on an IT8.7 chart, assign that ICC profile to the exported image, and then transform to the ProPhoto L* gamma working ICC profile.

 

k0okXog.jpg


Also, MakeTiff always sets the color temperature and tint to some very extreme position. This is why files processed with MakeTiff always look so green. I have no idea why it is designed like this and why it doesn't preserve the camera WB setting, or doesn't use some more common 5000K or 6500K starting point.

 

HCQvAmi.jpg


The main problem is that it is very hard and very slow to adjust levels manually.

But if you then need to manually 'recover' clipped data, where's the time saving? And it really doesn't take that long to marquee the end points of 3 histograms.

 

I actually hardly ever bother with the levels tool anyway. The curves tool can set the endpoints of histograms just as easily, and while you're in there a grey-point can be picked as well.

This is why files processed with MakeTiff always look so green.

It has nothing to do with white balance. It's because Bayer array sensors have an overdose of green-filtered photosites. The green channel data have to be effectively halved to get normal colour.

Edited by rodeo_joe|1

This is how Output Levels work in 32 bit (clipped data is recovered with "Output Levels"):

Looks like there's a bug in whatever program that is. A clipped histogram should stay a clipped histogram. No matter what you do to the black level afterwards. And that's only a preview screen we're seeing. None of those settings have been applied.


It is the same as discussing the "other opinion" about flat earth. You post some imaginary tech fake that is based on nothing and ask to discuss it as another opinion. And it seems you do this constantly in other threads as well and try to confuse people.

1. There is no relation between the Bayer pattern having more green pixels and the color of the debayered RGB image.

2. Tools in any app that can process data in 32 bit depth work in the same "clipless" way: DaVinci Resolve, PhotoLine and many others. The same goes for audio apps that process effects in 32 bit depth in the same clipless way.

Edited by dmitry_shijan

Well, pictures speak louder than words.

Here's a couple of negs picked at random. One inverted from setting a custom camera white balance to neutralise the orange mask, and another camera copy taken with a blue filter to neutralise the mask.

 

First the straight negative copies.

Comparison-neg.thumb.jpg.ba7c63b5c6722fbe646f410215eacfe7.jpg

And a straight inversion of the above: Comparison-pos.thumb.jpg.893c60e2995c76662a28ffc4d052911c.jpg

I think you'll agree that the filtered version will need much less afterwork to correct it.

Here are the finished results - optical filter - Optical-filtered.thumb.jpg.79979be096f06941768b2a91f4edcf7c.jpg

 

Custom camera WB -

Custom-WB.thumb.jpg.b3fad4165a141bce3c49624127bb6593.jpg

Not much to choose, but the optical filtered version has a truer sky blue IMO, and more vivid reds.

Edited by rodeo_joe|1

Another, more colourful example from simply optically filtering away the mask colour.

 

After inversion only.

After-inversion.thumb.jpg.755776cb2077808f466e1b625fb4a5df.jpg

 

And two clicks later in PhotoShop - Autocolor followed by a grey-point selection.

Autocolor-final.thumb.jpg.b5143b498b8793b6c293d33f83944182.jpg

 

So, no techy BS or graphs. Just straight examples of what a simple blue filter can achieve.

 

Those examples were from very old negatives BTW. The digitised results are far superior to the chemical proof prints that were returned from the processor.

Edited by rodeo_joe|1

OK, after all that fighting against AutoLevels and 32 bit processing, it all ended up with Photoshop Autocolor :)

Your examples only show that there is some difference between WB and optical blue filters. I wrote the same thing in the first post - WB in RAW is not a good way to neutralise the mask - so I have no idea what "BS or graphs" you are talking about.

To be honest, I have no idea which image from your examples looks better. Compared side by side, the image with the blue filter just has more of a cyan tint, so the blue sky is cooler, but the grass and train are more brownish. In the image without the optical filter the sky looks more transparent, and the green grass looks fresher and greener.

 

A lot of things are unknown from your examples:

How did you process the RAW file?

What light source did you use?

What color space and gamma did you use to invert and process the images?

Did you add the blue filter and then set the camera WB, or set the WB and add the blue filter after that?

Did you use a custom-made camera input ICC profile (and if yes, what type of profile?), or did you use the basic camera profile bundled with the RAW editor?

 

All these little things constantly affect colors. For example, a Lab cLUT based input ICC profile produces very accurate colors but requires a more careful workflow, because it may clip the color space if something is set up wrong. At the same time, matrix-based input ICC profiles are less accurate in color reproduction but more flexible and simpler to work with. In most cases RAW editors use matrix-based dual-illuminant input profiles, so RAW files are less affected by color shift problems during WB adjustments.

 

Also, in many situations a film negative image that looks less neutralised at the start may produce better colors after editing, because the original relations between colors were preserved in that image.

 

Examples from the link in my first post, scanned with a Minolta DiMAGE Scan. I deliberately edited these images in the ACES color space to illustrate how far colors may shift in the largest color spaces:

 

WB adjusted in scanner (orange mask removed, but relations between colors are lost):

eV4XtT6.jpg

 

WB untouched in scanner (orange mask untouched, but relations between colors are preserved):

jwAk4qp.jpg


WB in RAW is not a good way to neutralise the mask...

There is no white balance in RAW. It's just metadata from the camera settings attached to the file.

 

How did you process the RAW file?

What light source did you use?

What color space and gamma did you use to invert and process the images?

File opened in ACR and imported into Photoshop with no changes, just to get a 16 bit (actually 14 bits from the camera) file to work on.

 

Light source was flash, filtered by an 80A - full CT - filter square.

 

Camera balance was set to 5500K, but that's irrelevant with a RAW file and could be adjusted at ACR import.

 

Colour space was sRGB, just to show that nothing fancy is needed if the RGB histograms are all aligned by optical filtering.

(When the histograms don't all lie on the same section of the gamma curve, then each channel needs a separate correction curve applied, and that's when things get complicated)

 

The whole idea is to simplify colour negative processing, right?

 

I was just showing that the addition of a filter to the light-source/camera can go a long way toward that simplification.

 

I still don't fully understand why the red channel has a tendency to become overly saturated by using a filter. It needs further investigation. However, I see use of a mask correction optical filter as a step forward in the workflow.

Even if that requires the creation of an action in Photoshop to apply some red desaturation or a slight hue rotation. As long as the effect is consistent, the remedy needed is irrelevant.

 

I've tried 'linearising' the RAW file using MakeTIFF, but it just lengthens the workflow with no real gain in end result.

 

And at the end of the day - it's film! Using imperfect dyes and having a particular 'look' baked into it. So does Velvia capture perfect colour? Or Ektachrome or Agfachrome or any other reversal material? No!

Then why should we expect perfect colour from a negative film?

Isn't just a pleasing result enough?

