
Bill C


  1. I think you may have misunderstood what he was saying. Perhaps he was speaking of self-made "process control strips," or something of that order. In such a case it's considered "safe" to wait that long to ensure that there's no more "latent image shift" in the material. The idea is that there's no need to specifically KNOW the latent image characteristics of the specific material if you just give it an excessively long time (several days) to stabilize.
As I mentioned, we specifically tested for latent image shift before doing serious testing on a new film. Since we were doing critical color testing on these films we wanted to be sure the latent image was in a very stable condition before beginning, and we didn't want to wait an excessive time. So we ran "sensitometric testing" with expose-to-process times running from about an hour, then two hours, then three hours, etc., then getting into larger intervals of a day or two, up to about 2 weeks. With the specific films I mentioned, they had reached a very stable condition in perhaps 4 to 5 hours at room temperature, so there is really no reason to wait longer. So for critical color testing we'd make sure to use this process delay.
You might think that, since we did such testing, it's important for individual photographers to also do so. Probably not. We used literally millions and millions of US dollars worth of photo materials per year. This sort of volume required contractual agreements and delivery schedules, so once committed it's hard to change. Consequently we wanted no chance of unforeseen future problems showing up. Thus the extensive testing ahead of time, and locking down test variables as much as possible. When we actually got to shooting tests we would use perhaps a half-dozen models spanning the range of complexions and hair color, along with an extreme exposure bracketing range. Everything was then printed, hand-balanced for critical color matching of skin tones, then evaluated in color booths.
So you can probably see why we didn't want a possible issue of latent image shift to enter into this. Likewise we did not test with out-of-spec developer temperatures, nor even slightly off-center "process control" conditions, etc. FWIW I doubt that a typical professional photographer would be able to see any effect (with these specific films) due to a reasonable variation in time from expose-to-process, say between 1 hour vs 2 weeks.
  2. Regarding "holding time" for a film before processing: I spent a lot of years with a large chain studio outfit, and this is something we specifically tested for when evaluating a new film under consideration for use. We used only professional portrait films, almost exclusively the low speed Kodak films of the day. (These were, over many years: VPSII, VPSIII, PORTRA 160NC, and PORTRA 160.) In those days studio film was shipped back to the processing lab, and the exposures might typically be somewhere between a day up to about a week old before processing. Occasionally film might be temporarily "lost" in shipping, or whatever, and go a couple of weeks. So we wanted to know how the film would behave in these situations. Additionally, when we did in-house testing, we wanted to know how long the film should be held before processing. With these specific films, being held at room temperature, something on the order of about 4 hours hold time was generally sufficient. Going from a long-ago shaky memory, the difference between this and overnight holding might have been around 0.01 to 0.02 density loss on a photographed grey card. To put this density shift into perspective, it would be a very rare photographer or lab who could see this kind of difference; even "process control strips" will vary that much. In case I'm not being clear, there is not much point in holding these particular films beyond 4 or 5 hours, or something along that order. Other films - I really don't know for sure, but would guess that high-quality films are gonna behave similarly. But this is strictly a guess.
  3. For more practical purposes, though, you might look at the modern monobath sold by Cinestill - Df96, I believe. See here: https://cinestillfilm.com/products/df96-developer-fix-b-w-monobath-single-step-solution-for-processing-at-home?variant=7367677247522 I don't know anything about it other than what their website says, but if you're dead set on a monobath, well... here's a commercial product. Personally I'd probably stay with separate developer and fixer, ones that I'm already familiar with.
  4. Hi, I posted this one before - from 1966, a 4-second monobath. Published in one of the SPSE journals:
  5. Well how COULD there be absolute units for something that is essentially a human perception? Well, I just described, roughly, how the "horseshoe" was arrived at. It seems to me that you're trying to make it be something that it's not, by requiring it to have units of saturation or whatever.
  6. Hmm... where to start? I was in a similar place, 1990s, with something of a "need" to become knowledgeable about digital color management. Which I spent considerable time at, and can describe how to generate the chromaticity diagram (aka horseshoe or shark fin). Essentially you multiply the CIE color-matching functions times the spectral makeup of each sample (your laser's wavelength, for example) to get the 3 CIEXYZ values (X, Y, and Z). Which are then "normalized" somehow. Then the x value for the chart is a proportion of CIE X to the sum of CIE X, Y, and Z. And similar for the y value. You don't have to include the z value cuz you know it will all add up to 1 (x plus y plus z equals 1). But in case it's not obvious, by using only the proportional values, the luminance aspect is lost. In essence this is a way to put what oughta be 3-D onto a 2D graph.
That said, I don't think it's very useful for what I think your purposes might be. The ability of humans to distinguish between "colors" on the chromaticity diagram is highest near the lower left of the chart, turning into larger and larger ellipses moving away from it. So it is not "perceptually uniform." The CIE sort of took care of this issue in the 1970s with the invention of both CIELAB and CIELUV, which ARE (roughly) perceptually uniform. (These can be calculated from CIEXYZ after being "normalized" and adapted to the proper light source.)
If someone wants to determine if two colors can be distinguished by a "standard observer," meaning a more or less average person without color deficiencies, I would suggest evaluating them in CIELAB. This gives three numbers, called L*, a*, and b* (pronounce the '*' as "star"). L* is seen as a lightness measurement; a* is red vs green, and b* is yellow vs blue. If you see this as a 3D shape it is possible to calculate the distance between any two specific "colors." (In the industry lingo any calculated "distance" between "colors" is referred to as delta E.)
The general rule is that a "just noticeable" difference in colors is about 1 unit. Now, I don't think this is a precise rule, rather just a rule of thumb to estimate how significant a color difference is. In the real world, average color "errors" of 2 or 3 delta E units would typically not be a problem even in very high quality color prints. Fwiw my use for these sorts of things has been with respect to color photography, not in measuring the limits of human perception, etc. These are things that most photographers won't care about, but if one spends much time making and troubleshooting color profiles, both for printers and digital cameras, a basic understanding of these things can be pretty helpful.
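The arithmetic described above can be sketched in a few lines of Python. The XYZ and Lab values below are illustrative numbers I've made up for the demo (the XYZ triple is roughly the sRGB red primary), not anything from the original post:

```python
import math

def xy_chromaticity(X, Y, Z):
    """Project CIEXYZ onto the 2-D chromaticity diagram.

    x and y are each value's share of the X+Y+Z total; z is omitted
    because x + y + z = 1. Using only proportions discards luminance.
    """
    s = X + Y + Z
    return X / s, Y / s

def delta_e_76(lab1, lab2):
    """CIE76 delta E: straight-line distance between two CIELAB colors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

x, y = xy_chromaticity(41.24, 21.26, 1.93)  # roughly the sRGB red primary
print(round(x, 3), round(y, 3))             # about 0.64, 0.33

# Two nearly identical grays in CIELAB (L*, a*, b*); a delta E of
# about 1 is the rule-of-thumb "just noticeable" difference
print(round(delta_e_76((50.0, 0.0, 0.0), (50.5, 0.5, 0.5)), 2))
```

Note this is the simple 1976 delta E; later CIE formulas (CIE94, CIEDE2000) weight the terms differently, but the "distance in Lab space" idea is the same.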
  7. Temporary images
  8. I've not read the ISO method for digital cameras, but have formerly read some technical papers related to the topic. To get an overall view, of a very technical nature, I would suggest doing a search for "Jack Holm" and "ISO speed." I think that he may have been on the ISO committee. If you have the background to follow it, here's an older IS&T paper: https://www.google.com/url?sa=t&source=web&rct=j&url=https://www.imaging.org/site/PDFS/Papers/1999/PICS-0-42/987.pdf&ved=2ahUKEwjZyor8sdj7AhUCj4kEHZttDV0QFnoECBMQAQ&usg=AOvVaw0Cnw8dCOTA3UsO9hiQbEcA
My rough understanding is that there are two general methods. The first, called base ISO, is roughly based on how much light energy it takes to "saturate" an image sensor. (Image sensors can "fill up" with a charge, and then cannot detect any further increase in light; they are then said to be saturated.) A particular Kodak sensor that I was familiar with was based on being able to reach 170% of a perfect white (diffuse) reflector. So, for example, an 18% gray reflector, when photographed, should put the sensor at about 18/170 of the saturation level. (One would need to know the actual amount of light energy being delivered to the sensor at that point.) This is a simplification.
The second method, for higher ISO speeds, is based on some "signal/noise" ratio. I don't know any specific details, but basically the camera electronics can amplify the sensor output until it eventually becomes "too noisy" to be useful. This is loosely like what paddler4 is saying.
PS, I should mention that the nominal correct exposure should remain roughly the same between a film camera and a digital camera at the same ISO speed setting. One big difference is that a "daylight-balanced" color film is ideally used at the correct color temperature of light. Whereas the digital camera can be adapted for different color temperatures, within certain "noise" limits.
  9. Hi Mike. Don't overlook that this is a beginner question. (I'm even a bit embarrassed that I made it as complicated as I did.) If one does as I suggested it is almost guaranteed that pretty good (to perhaps very good) color will be the result. If, on the other hand, one attempts to start out with RAW files here are some things they must deal with: 1) obtain and learn to use an image-processing program, and 2) have a display monitor that is "adequate," colorwise, for image processing. If #2 is not "known" to be the case then the monitor ought to be "profiled" (according to ICC standards) by using a hardware/software profiling package. Additionally the user must develop some considerable skill in judging color appearance on a monitor. So contrast this situation with my suggestion to 1) manually set white balance and 2) manually set exposure, which nearly guarantees a good-quality result straight out of the camera. I think there is almost no chance that a relative beginner will be able to do a better job than a properly set up in-camera jpeg. Fwiw I made a full-time living in photography for upwards of 40 years, with the majority of it in large lab work, primarily portraits. If there were color problems of an unusual nature, I, or my department, would be where they ultimately got solved. Anyway, my recommendation is based on real-world experience.
  10. Hi, do you have a digital camera yet? Regardless, here's how I'd suggest to start. Let the camera do the work via its built-in jpeg capability. With this you need to have two things pretty close - they are the exposure, which you probably understand somewhat by now, and something called the "white balance."
Let's cover the difficult one, white balance, first. Most people don't realize how much the color of light(s) varies because our eyes automatically adjust for it. But in photography it falls on you, more or less. Although digital cameras have a built-in automatic "white balance" capability it's not always the best. You can generally improve it, at least somewhat, by manually setting it. When the "white balance" is properly set, white things will look white (as opposed to, say, slightly pinkish, or perhaps yellowish, or some other color). And when the white things are white then other colors will also be right, more or less. So what I am suggesting is that for critical work you MANUALLY set the WHITE BALANCE (your camera manual will describe how to do it). Is this getting too complicated?
Now for the exposure... you probably know how to meter exposure for your film camera. The same sort of method will work for the digital camera. But... the digital camera typically has a more precise method for checking a test exposure. It's called a histogram, and can be displayed on the back of the camera. Check your camera's manual, but typically you display an image on the back of the camera, then press an "info" button, or similar, until the histogram is displayed. It is basically a graph, from left to right, with a bunch of bars. Here's the trick - if you take a close-up shot of white paper (or gray, or whatever) the histogram will be a single spike (mostly) showing where the exposure is. A spike on the left means "dark," and a spike on the right means "light." If you make manual changes to the exposure this spike will move from side to side.
So, you need to know where you want the spike to be. For a gray card it should be near the center. For white paper it should be near, but not quite to, the far right. Sorry if this is getting too complicated. But... if you can follow along you will be able to get well-exposed images, with good color, of set-up scenes.
Finally, let's put it together. Have a friend sit somewhere, with good light, for some photos. Have them hold a piece of white paper while you make a custom white balance for it (per the camera manual). Next, while they are still holding the white paper, use it to fine tune the exposure. (Use a "manual exposure" mode, per the camera manual, as well as a fixed "ISO speed.") Make a test exposure, then look at the histogram. For white paper you want the spike to be close to the right side - somewhere between 3/4 and all the way to the right. Increasing the exposure moves the spike to the right, etc. Once the white balance and exposure are set you can go ahead and shoot all the photos you want UNDER THE SAME LIGHTING CONDITION. More than likely you are gonna have the best color and exposure in your class.
Caution: immediately after the shooting session I recommend resetting the camera to automatic white balance and auto-exposure. (To make sure that when you have a once-in-a-lifetime chance to photograph a flying saucer landing you'll be ready.) Best of luck.
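The "where is the spike" check above is easy to simulate. This is a minimal sketch with fabricated pixel data (a real frame would come from the camera, and the camera computes its own histogram); it just bins pixel values and reports the spike position as a dark-to-light fraction:

```python
# Bin 8-bit pixel values and locate the histogram "spike" - the most
# populated bin - as a fraction from 0.0 (far left, dark) to 1.0 (far
# right, light). Pixel data below is made up for illustration.

def histogram_spike(pixels, bins=256, max_value=255):
    """Return the spike position as a 0.0 (dark) .. 1.0 (light) fraction."""
    counts = [0] * bins
    for p in pixels:
        counts[min(p * bins // (max_value + 1), bins - 1)] += 1
    return counts.index(max(counts)) / (bins - 1)

# Simulated close-up of white paper, exposed so values cluster around 230
paper = [228, 230, 231, 229, 230, 232, 230, 229] * 100
pos = histogram_spike(paper)
print(round(pos, 2))          # near the right side
print(0.75 <= pos < 1.0)      # in the "3/4 to not-quite-far-right" zone
```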
  11. I'd sorta concur with this, if you've got the time. I'd do it a little differently, though. Rather than store all the lighting types, I'd start with a custom WB of the ambient light (shoot a white or gray card, then set this image for the WB). Shoot same card again, viewing the color histogram to make sure it "took." (The r, g, and b spikes all in the same place.) Next, using the SAME WB (do not let the camera change it) use flash to shoot the same test card. Again view the color histogram. This time the spikes will most likely NOT overlap (unless the flash color exactly matches that of the ambient light). (If the ambient light is tungsten, and you did a WB on this, the flash shot, using that same WB, will now have the blue spike shifted to the right.)
What you ideally want to do now is to find a filter, by trial and error, to use over the flash in order to bring the blue spike more in line with the others. Once this is done, to some reasonable approximation, your ambient vs flash color differences will mostly disappear.
I would expect that you can mostly get by with perhaps 4 filters. I'd suggest getting a set of 4 of what they call CTO filters for your flash. Try Rosco or Lee, and get CTO in full, 3/4, half, and 1/4. (You'll have to buy a largish sheet and trim to size.) If cost is an issue, you might just start with 1/2 CTO just for a test (two sheets of 1/2 CTO = a full CTO). If it works out for you, you might wanna hold it back as your own secret method. Maybe you'll get better properties that pay better? Best of luck.
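The reason two 1/2 CTOs stack into a full CTO is usually explained in "mireds" (a million divided by the Kelvin temperature), since equal mired shifts correspond to roughly equal visual color shifts. A small sketch of that arithmetic; the 6500K-to-3200K conversion figure for a full CTO is a commonly quoted nominal value I'm assuming here, so check the data sheet for the actual gels you buy:

```python
# Gel strengths add (approximately) in mireds, not in Kelvin.

def mired(kelvin):
    """Convert a color temperature in Kelvin to mireds."""
    return 1e6 / kelvin

# Nominal full CTO: converts ~6500K daylight to ~3200K tungsten
full_cto_shift = mired(3200) - mired(6500)   # roughly +159 mireds
print(round(full_cto_shift))

# A half CTO is nominally half that shift, so two halves stack to a full:
half_cto_shift = full_cto_shift / 2
print(round(half_cto_shift + half_cto_shift) == round(full_cto_shift))
```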
  12. Yep, looks like you're onto something. If you can just find out what is causing the camera to shift. Fwiw I used to occasionally make documents for our company tech support line in order to help company shooters correct camera issues on the spot. Here's something that was really helpful. Many DSLR cameras have an "info screen" that shows a summary of the key settings, including white balance. I'd take a snapshot of that screen, when properly set, then number every item. Then below I'd list every item along with a simple way to change it. For a quick check, tech support asks the shooter to follow along with each icon - perhaps 15 or so of them. When something is found amiss, ok, here's how to fix it. In your case, just note where the white balance shows up. Whenever you wanna check it just hit the info screen.
  13. Probably not much of an issue. Certainly not with the 580EXII which has good UV filtration. But... assumptions about other hot shoe flash units? I've seen a couple that did NOT have good UV filtration (with respect to making brighteners in white clothing take on a bluish tinge). So I would say that a person looking at such a flash unit will NOT be able to tell, for sure, if it has UV filtration. (An actual shooting test can be used to determine this by comparing two white materials - one that HAS fluorescent brighteners and one that does not.) In the case of the Canon 580EXII, which I would say is considered top-line, it is not matching "photographic daylight." It actually has a bit higher color temperature (as did most of the hot-shoe flash units I looked at). Coincidentally the units that I mentioned earlier, THAT DID NOT have good UV filtration, actually DID have (roughly) daylight-balanced flash output. (I'm not gonna name the brand, though.) I should probably be clear that I don't personally see this non-daylight thing as much of an issue. Real "daylight," as opposed to "photographic daylight" (5500K) varies quite a bit, and color neg film easily accommodates such moderate differences. And digital cameras have little problem correcting white balance. HOWEVER, if one tries to intermix a high-color-temp flash (many hot shoe units, in my limited experience) with a daylight-balanced flash, perhaps for portrait use, there will likely be color issues.
  14. Hi, let me pose a question... what would you do in a difficult-to-meter situation? Say, for example, performers on stage, such that you cannot get close enough for precise metering? In such a case any under-exposure error means that you won't be able to get a good black on the print, right?