
RGB Tricolor filters for separation photography


matt_t_butler


Could you tell us what filters you ended up using, and the factors you needed to apply?

 

This was the result when using the Wratten camera-separation set (#25 Red, #58 Green, #47 Blue) on a digital camera, using the #47 Blue (the least transmission) as the base exposure and adjusting the Red and Green to match when shooting a mid-grey target to around 90.

 

[Image: RGB set #25/#58/#47 grey-target exposure test]

 

 

This was only an approximate guide to a base balanced exposure - each scene was adjusted during photography using the camera's RGB histogram; different scenes and times of day gave differing exposure values for each channel. (The Green channel also contains most of the luminosity information.)
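The balancing step described above can be sketched in a few lines. This is a hypothetical illustration only (the grey-card readings are invented, and it assumes a linear sensor response): given the mean value measured through each filter at the blue-filter base exposure, it reports how many stops to open or close to bring each channel to around 90.

```python
# Sketch: per-filter exposure adjustment (in stops) to bring a mid-grey
# target to ~90/255 in each channel, using the #47 blue filter as the base.
# The mean readings below are invented examples, not measured values.
import math

TARGET = 90.0  # desired mid-grey channel value (0-255 scale)

def stops_adjustment(measured_mean, target=TARGET):
    """Stops to open (+) or close (-) to move a linear-sensor mean to target."""
    return math.log2(target / measured_mean)

# Hypothetical grey-card means through each Wratten filter, all taken
# at the base exposure chosen for the #47 blue:
readings = {"#25 red": 210.0, "#58 green": 150.0, "#47 blue": 90.0}

for name, mean in readings.items():
    print(f"{name}: adjust by {stops_adjustment(mean):+.2f} stops")
```

A reading of 180 would call for closing exactly one stop; the blue base channel, already at 90, needs no adjustment.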

Matt B

(The Green channel also contains most of the luminosity information.)

 

- I think this is a myth - a conflation of human eye response with the fact that the Bayer array has twice as many green-filtered photosites as red and blue.

 

If you play with the HSL parameters of most natural scenes, it's the red channel that has most effect. There's very little pure green or blue in most scenes - even grass responds more to the yellow HSL sliders than it does to green.

 

Anyway, thanks for the info on the filter factors. Looking forward to seeing more timelapse shorts. I don't know how you find the patience to put them together.


- I think this is a myth - a conflation of human eye response with the fact that the Bayer array has twice as many green-filtered photosites as red and blue.

 

Perhaps you're correct; by eye the Green record appears 'cleaner' - probably due to the 2x green pixels.

(Having a lot of green subject matter in the shot blocks information in the R and B records.)

 

In theory the 'mid greys' of the concrete fence (bottom RHS) and the different cloud densities in the scene below should appear equally exposed in each record.

 

Some photographers did add an additional B&W record to the RGB layers to snap up the contrast.

 

..... and yes, they don't call it time lapse for nothing - a lot of time gets lapsed out of one's life.

 

[Image: Empire reference]

 

[Image: Empire RED separation]

 

[Images: Empire GREEN and BLUE separations]

Matt B

(Regarding the eye's green sensitivity.)

 

- I think this is a myth - a conflation of human eye response with the fact that the Bayer array has twice as many green-filtered photosites as red and blue.

 

Well, there are a few different things.

 

Definitely the dark adapted sensitivity is higher in green. That is why the #3 safelight is green.

(Well, it couldn't be blue, but also the eye isn't so sensitive to blue, anyway.)

 

The matrix for converting RGB video to YIQ, Y being the black and white equivalent,

uses 0.299*R + 0.587*G + 0.114*B
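Those weights are easy to check with a line of arithmetic; this small sketch (the function name is mine, not from any standard library) applies them to an RGB triple:

```python
# The Rec.601 luma weights quoted above, applied to an RGB triple.
def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

print(luma(255, 255, 255))  # white: the weights sum to 1.0, giving 255.0
print(luma(0, 255, 0))      # pure green alone carries ~59% of full luma
```

The green weight (0.587) dominating is the numerical form of the eye-sensitivity point being discussed.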

 

The lumen, a unit of light intensity normalized for the eye's sensitivity:

 

Lumen (unit) - Wikipedia

 

has a maximal value at the peak of the eye's sensitivity at 555nm, in the green.

 

There is another question that could come in, though, which is the eye's resolution

in different colors.

 

YIQ - Wikipedia

 

Before creating the NTSC color TV standard, there were studies on eye resolution,

and it turns out that for some color differences, the eye has more resolution than

others.

 

From the YIQ article, it is orange-blue where resolution is higher, and red-green where

it is lower.

 

The NTSC color subcarrier has more bandwidth for the orange-blue axis (I) than

for the red-green axis (Q). (And very few television sets decode this.)

 

The standard was designed to allow one to decode to (R-Y) and (B-Y)

instead of I and Q, and avoid some matrix terms.

 

It would be interesting to have a sensor array based on YIQ instead of RGB.


-- glen


It would be interesting to have a sensor array based on YIQ instead of RGB.

 

- Kodak tried a CMY matrix way back when. The colour saturation was poor compared to Bayer.

 

And NTSC stands for "Never The Same Color twice" doesn't it?

 

Personally I would like to get rid of the stupid surplus of green filters in the Bayer array, but a 3 filter geometry is hard to design without some offset of sampling from orthogonal. But that doesn't seem to matter with film 'grain'.

 

Instead I propose an RCYB matrix, with cyan and yellow filters replacing green in the Bayer array. This way there are no filtration 'gaps' for spectral colours to fall down, and the light efficiency is instantly doubled.

 

The de-mosaicing becomes a little more complex, with green being extrapolated from C minus B, and Y minus R, but you could then end up with 2xR, 2xG and 2xB signals from each sampling of 4 photosites.

 

R and B are the direct filtered data

G1 = Y - R

G2 = C - B

B' = C - G1

R' = Y - G2

The maths could be done in real time by analogue signal addition and subtraction before digitising - simple!

At least in theory.
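The reconstruction above can be written out directly. A minimal sketch, assuming idealised filters where C = G + B and Y = R + G exactly (real filter curves overlap, so a colour matrix would still be needed in practice):

```python
# Sketch of the proposed RCYB reconstruction arithmetic, on idealised
# signals where C = G + B and Y = R + G hold exactly.
def rcyb_to_rgb(r, c, y, b):
    """Recover two estimates per channel from one R/C/Y/B sampling."""
    g1 = y - r      # green recovered from the yellow photosite
    g2 = c - b      # green recovered from the cyan photosite
    b2 = c - g1     # second blue estimate
    r2 = y - g2     # second red estimate
    # Average the paired estimates for the final channels:
    return ((r + r2) / 2, (g1 + g2) / 2, (b + b2) / 2)

# With ideal filters the original triple comes straight back:
r, g, b = 0.8, 0.5, 0.2
print(rcyb_to_rgb(r, c=g + b, y=r + g, b=b))  # recovers (0.8, 0.5, 0.2) up to float rounding
```

With ideal filters the two estimates per channel agree exactly; with real filters they would differ, which is where the averaging (and the claimed noise benefit) comes in.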

 

The oft-seen non sequitur argument in favour of the Bayer array is that a sensor needs to capture more green because the eye is most sensitive to green! That's like saying that someone very sensitive to pain needs more pain in their life.

 

Illogic and craziness abound!


- Kodak tried a CMY matrix way back when. The colour saturation was poor compared to Bayer.

 

And NTSC stands for "Never The Same Color twice" doesn't it?

 

Personally I would like to get rid of the stupid surplus of green filters in the Bayer array, but a 3 filter geometry is hard to design without some offset of sampling from orthogonal. But that doesn't seem to matter with film 'grain'.

 

I suspect that the math for digital processing in the camera is a little harder for 3 filters, and that might have been important

in the early years, but probably not now.

 

I do think the four colors needed to do YIQ would be interesting, though. Note that the actual colors are not R, G, or B,

but somewhere else on the chromaticity diagram.

 

 

Instead I propose an RCYB matrix, with cyan and yellow filters replacing green in the Bayer array. This way there are no filtration 'gaps' for spectral colours to fall down, and the light efficiency is instantly doubled.

 

The de-mosaicing becomes a little more complex, with green being extrapolated from C minus B, and Y minus R, but you could then end up with 2xR, 2xG and 2xB signals from each sampling of 4 photosites.

 

 

Not so obvious to me, does the actual sensitivity depend on the lightest or darkest filters?

 

 

R and B are the direct filtered data

G1 = Y - R

G2 = C - B

B' = C - G1

R' = Y - G2

The maths could be done in real time by analogue signal addition and subtraction before digitising - simple!

At least in theory.

 

The oft-seen non sequitur argument in favour of the Bayer array is that a sensor needs to capture more green because the eye is most sensitive to green! That's like saying that someone very sensitive to pain needs more pain in their life.

 

Illogic and craziness abound!

 

One that isn't mentioned much, but I believe is true.

 

That there is natural variation in the sensitivity of individual sensor cells, which is corrected from a look-up table.

 

Darker filters will mean less signal, and so lower S/N.

 

Additional green means even higher S/N for green than for red or blue.

 

What I don't know now, is when NTSC did the resolution vs. color tests, if they did them all at the

same intensity (power) level. Seems to me that since red and blue are darker, they will be less

noticeable. Lots of questions, not so many answers.


-- glen


Additional green means even higher S/N for green than for red or blue.

 

- But the system I propose effectively gives two signals in the red and blue channels for each sampling also.

 

The logic is simple. Cyan and Yellow filters are spectrally broader than green, and therefore let in more light. And since Cyan covers the blue to green region, and Yellow covers the red to green region, the whole spectrum is covered. Subtract blue from cyan, and you have a green signal. Likewise if you subtract red from yellow you have another green signal.

 

Cross-subtract those green signals from yellow and cyan, and you recover additional red and blue signals. Voila! You've not only reduced the R and B noise by root2, but you've doubled the light efficiency of the sensor as well.
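The "root 2" figure is the standard result for averaging two independent estimates. A quick simulation checks it (this assumes equal, independent Gaussian noise on both estimates; in practice the subtracted channels share photosite noise, so the real gain would be smaller than √2):

```python
# Averaging two independent noisy estimates of the same signal should
# reduce the noise standard deviation by a factor of sqrt(2).
import math
import random
import statistics

random.seed(1)
sigma = 1.0
singles, averaged = [], []
for _ in range(20000):
    e1 = random.gauss(0.0, sigma)  # noise on the direct R (or B) photosite
    e2 = random.gauss(0.0, sigma)  # noise on the estimate recovered from Y (or C)
    singles.append(e1)
    averaged.append((e1 + e2) / 2)

ratio = statistics.stdev(singles) / statistics.stdev(averaged)
print(f"noise reduction factor ≈ {ratio:.2f} (sqrt(2) ≈ {math.sqrt(2):.2f})")
```

The simulated ratio lands close to 1.41, as expected.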

 

I would foresee problems with YIQ filtering, in that the necessary magenta filter (minus green) is a non-spectral colour that requires dyes which filter across a non-continuous imaginary colour space.

 

It's effectively just a hue rotation of the above RCYB filter matrix 'centred' about red, with magenta replacing cyan and orange replacing yellow. Except there's no real spectral centre for the colours, and the filters need to transition the imaginary purple-magenta region.


- But the system I propose effectively gives two signals in the red and blue channels for each sampling also.

 

The logic is simple. Cyan and Yellow filters are spectrally broader than green, and therefore let in more light. And since Cyan covers the blue to green region, and Yellow covers the red to green region, the whole spectrum is covered. Subtract blue from cyan, and you have a green signal. Likewise if you subtract red from yellow you have another green signal.

 

Cross-subtract those green signals from yellow and cyan, and you recover additional red and blue signals. Voila! You've not only reduced the R and B noise by root2, but you've doubled the light efficiency of the sensor as well.

 

I would foresee problems with YIQ filtering, in that the necessary magenta filter (minus green) is a non-spectral colour that requires dyes which filter across a non-continuous imaginary colour space.

 

It's effectively just a hue rotation of the above RCYB filter matrix 'centred' about red, with magenta replacing cyan and orange replacing yellow. Except there's no real spectral centre for the colours, and the filters need to transition the imaginary purple-magenta region.

 

I think that sounds right, but it always takes me a little while to be sure.

 

One that I found out only recently, when my daughter sent me one, is that there is a JPG format that is CMYK.

 

It displays fine on my computer, but uploading it to some places, it was refused. (I believe both Shutterfly and Snapfish refused it.)

 

I then found a program that would convert it, but then later she sent one in RGB.

 

Since there is already a CMYK form for JPG, it wouldn't seem so strange for cameras to use it.

 

Otherwise, since the whole idea behind YIQ is more resolution for I than for Q, you would want more sensors with I filters than Q filters.

 

It isn't so easy to know which filters are easy to make, and which are hard.

Since filters work by absorbing some wavelengths, I will guess that it isn't hard, but only when someone tries will we know.

Besides, magenta filters are common for color printing, and I haven't heard that they are hard to make.

 

Film dyes have the complication that they all have to be made from one oxidized color developer molecule.

Other dyes don't have that problem.

-- glen


The issue with magenta filters is that they have no directly measurable bandwidth or centre-frequency. That all has to be synthesised from a minus-green analysis. Whereas the transmission of any filter that falls within the real visual spectrum can simply be measured.

 

RGB filters of well-known characteristics already exist, and cyan and yellow filters could equally be readily made and characterised. But magenta and purple colours don't exist as spectral electromagnetic frequencies, making their characterisation more problematic.

 

I'm not saying it's impossible - just why start with a square wheel when round ones already exist?


It is not very scientific - one would even say purely academic - but here is the same location (the NYC High Line) photographed sequentially through a set of RGB and then CMY filters on a digital camera with a regular Bayer array. Images composited in FCPX.

 

[Image: RGB composite]

 

[Image: CMY composite]

Matt B

  • 7 months later...

Everything about the availability of RGB camera filters that you were afraid to ask …

(Originally posted July 13, 2019)

 

Revised Information as of September 2019

(Some companies previously listed no longer supply the filters)

The numbers listed refer to Wratten numbers. LINK: Wratten number - Wikipedia

 

Wratten (original Eastman Kodak, 1909 to early 2000s - available second-hand through eBay)

 

Tricolour photography - suggested set for ‘one-shot tricolour cameras’ with B&W negatives -

#25 Red; #58 Green; #47 Blue - Early nomenclature ‘A’ (Red); ‘B2’ (Green); ‘C5’ (Blue)

 

Tricolour set for B&W separation camera negatives - #25 Red; #61 Green; #47 Blue;

 

Tricolour set for colour separation copy from colour transparencies

#29 Red; #61 Green; #47B Blue

(All the above were the Kodak recommendations but photographers devised their own sets depending on personal style,

film types and Daylight or Tungsten lighting set ups.)

Tricolour photography is based on Trichromy LINK : Trichromy - Wikipedia

 

Harris Shutter 1971 (ref: Kodak Publication AE-90)

Mounted gelatin filters in a drop shutter #25 (Red); #61 (Green); #38A (Blue)

 

Wratten 2 (available through contemporary Kodak)

#25, #26, #29 Red; #58, #61, #99 Green; #47, #98 Blue

(#99 is a combination of #61+#16 Yellow/Orange)

(#98 is a combination of #47B+#2B Pale Yellow)

 

Tiffen

Glass filters

#25, #29 Red; #58, #61 Green; #47, #47B Blue

LINK: https://tiffen.com/film-enhancement/

Lee Filters for 100mm system

Polyester Tricolour Red #25; Green #58; Blue #47B (0.1mm thickness)

These tricolor polyester filters have been designed for tricolor photography.

LINK: Tricolour - Part of the LEE Camera Filter Range

 

Kenko

The SP Color Set contains Red, Green, Blue filters for ‘factorization photography’ - similar to

a set by Prisma released in the 1970s.

LINK: SP Color Set- Kenko Global Site (Now gives Error 404)

NO LONGER MANUFACTURED

 

Cokin

Cokin P Filter kit has a Red P003, Green P004 not suitable for tricolour work. (Needs clarification)

Black & White

(Cokin was acquired by Kenko in 2011)

 

Formatt

Various filters in #25 Red; #61 Green; #47 Blue

LINK: Camera Filters — Formatt-Hitech

Formatt Hitech 67mm 47 Dark Blue Camera Filter HT67BW47 B&H

NO LONGER AVAILABLE

(Formatt was acquired by Kenko in 2014)

 

Nikon

R60 (Red #25); X1 (Green #11); B12 (Blue #80B)

The X1 and B12 are unsuitable for true tricolour work

B+W

090M is equivalent to #25 Red; 091M is a #29; STILL AVAILABLE

061 is #13 Green; 081 is a medium Blue.

All are listed in their handbook but the 061 and the 081 are unsuitable for tricolour work


Matt B

  • 2 weeks later...

This is a process that I have always wanted to try, however I do have a question. If performing this process, you do include the filter factor in making your exposure, right?

 

I assumed this would be the case, since you want a correctly exposed negative; however, while reading up on the subject I found one source saying that you should not correct for the light loss through the filter, and I could not find anything to counter this.


Robert 'Bob' Harris, the Eastman Kodak photo-educator who devised the 'Harris Shutter', suggested as a guide - when using Wratten Red #25, Green #61 and Blue #38A for the individual RGB exposures on the SAME negative - to meter normally, then open one stop from the 'normal' reading (the adjusted exposure) for each filter when shooting colour negative.

This setting will give a slight colour cast to your image that can be corrected in printing (or Photoshop).

This is a 'rule of thumb' exposure guide - my experience using the technique with a digital camera is to expose for the denser blue filter first to set a base exposure, then close down two stops for the red and one stop for the green. This was using a Canon 5D2.

If using film, I suggest, as Robert Harris did, bracketing one stop either side of the adjusted exposure to find the settings that work for you.
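The stop arithmetic in these guides is easy to mechanise. A hypothetical helper (the metered shutter speeds are illustrative values, not measurements from the thread):

```python
# Convert stop adjustments into shutter times from a metered base exposure.
def adjust_shutter(base_seconds, stops):
    """Each +1 stop doubles the exposure time; each -1 stop halves it."""
    return base_seconds * 2 ** stops

# Harris's film guide: open one stop from the normal reading for each filter.
metered = 1 / 125          # hypothetical unfiltered meter reading
harris = adjust_shutter(metered, +1)   # 2/125 s, roughly 1/60

# Digital variant described above: blue sets the base, then close two stops
# for red and one stop for green.
blue = 1 / 30              # hypothetical base chosen for the dense #47 blue
red = adjust_shutter(blue, -2)         # 1/120 s
green = adjust_shutter(blue, -1)       # 1/60 s
print(harris, red, green)
```

Bracketing a stop either side, as suggested for film, is then just calling the helper with stops ± 1.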

Matt B

  • 1 year later...

I have been playing around with the process a bit - thanks for the info on here, it's really useful. I have used this for stills and it's really neat, but I am starting to try it in video and am wondering a few things.

 

I see there is time lapse on here, but I am wondering how to blend the video in Final Cut, or whether you have used this for video that isn't time lapse. I can wrap my head around Photoshop and the use of filters, but I am trying to figure out the equivalent in Final Cut. I have tried blending them a few ways but am not too happy with the results. I am also wondering whether you have transitioned from a regular shot to an RGB-separated shot in the same sequence without too obvious a cut.

 

I have been using the Lee filters, and they are OK - thin and flimsy, but they get the job done without too much cost.

 

Mainly I am trying to see if there is a difference between a shot built from three differently filtered video passes on site versus one normal shot with the colour separation added afterwards in post.


  • 4 weeks later...
Ah, I found it in Premiere:

 

but not really in Final Cut.

 

It has been a while, but from memory the workflow in Final Cut Pro is to 1) lay the three separate R, G and B exposures above one another on the edit timeline.

Then 2) in the 'Effects' pane select 'Compositing' > 'Blend'.

If your individual RGBs have been correctly exposed, the colour balance should come out OK.

3) If not use the 'Opacity' slider to compensate.

('Transform' is selected as the stills format had to be adjusted for the 16 x 9 frame.)
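Outside an editor, the same three-layer composite is just a channel merge: each separation contributes one channel of the final image. A toy sketch with invented pixel values (file I/O and the FCPX specifics omitted):

```python
# Merge three single-channel separation exposures into RGB pixels.
# Pixels are plain nested lists here; real code would use an image library.
def composite_rgb(red_plane, green_plane, blue_plane):
    """Stack three greyscale planes (lists of rows) into rows of RGB triples."""
    return [
        [(r, g, b) for r, g, b in zip(rrow, grow, brow)]
        for rrow, grow, brow in zip(red_plane, green_plane, blue_plane)
    ]

# 2x2 toy planes standing in for the red-, green- and blue-filtered exposures:
R = [[200, 10], [0, 255]]
G = [[50, 10], [0, 255]]
B = [[30, 10], [0, 255]]
print(composite_rgb(R, G, B)[0][0])  # -> (200, 50, 30)
```

The 'Blend'/'Opacity' adjustments described above correspond to scaling each plane before the merge.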

 

I have not tried RGB separations from a 'full colour' normal frame or clip .......

 

[Image: RGB layers blended in FCPX]

Matt B
