Discussion in 'Black and White' started by laurie_hallick, Oct 25, 2012.
How does the temperature of water and chemicals affect the development of film?
Warmer speeds it up; cooler slows it down.
Chemical reactions proceed faster at higher temperatures as a general rule.
The most widespread temperature recommendation for black and white film and paper developing is 68°F (20°C). This temperature was chosen because it was considered the "normal" room temperature in Europe in the early days of the last century.
While it is true that higher temperatures accelerate chemical reactions, chemical development is mainly governed by the rate at which fluids diffuse into the emulsion. The binder that holds the light-sensitive goodies onto the film base is gelatin. This material is chosen because it is flexible, transparent, and has low solubility in water. Additionally, gelatin is an animal product, and it generally brings with it some impurities that favorably elevate the final ISO of the product.
When film is immersed in the developer, which is mainly water, the gelatin swells much like a dry sponge. This swelling happens quickly. Under the microscope, the structure of gelatin resembles strands of transparent spaghetti. This structure is ideal, as it allows the processing fluids to enter and percolate about. The fluids carry in the chemicals of the process, do their job, and exit, powered by agitation.
The porosity of the gelatin is the main factor that controls developing time. Warmth encourages greater swelling, increasing porosity; cold reduces swelling and slows diffusion, lengthening developing time. Other factors are hardeners in the film recipe and temporary and permanent hardeners in the various fluids. The acidity or alkalinity of the developer is also a key factor: the higher the pH (more alkaline), the more rapid the development.
As others have implied, the main effect of higher temperatures on B&W film developing is faster developing times. However, there is another, potentially significant effect: a change in contrast. Generally, more development means more contrast. But even if you use two different temperatures (e.g., 68 deg. F / 20 deg. C and 75 deg. F / 24 deg. C) and shorten the developing time at the higher temperature so that the two negatives show the same density at some particular exposure, you may well not get the same density at other exposures.
Your question suggests that you're not overly familiar with exposure / density graphs. But to try to put it simply with an example: you take two rolls of Tri-X, put them in identical cameras on side-by-side tripods, and take simultaneous pictures with identical exposures. You develop one roll at 68 deg. F and the other at 75 deg. F, and for the roll developed at 75 deg. F you decrease the development time so that some particular highlight has the same density as the corresponding highlight on the roll developed at 68 deg. F. Even so, the midtones on the two rolls (and/or the even brighter highlights) may very well not have the same densities. If you print the two rolls the same way, the prints will have different contrasts.
Kodak's film data sheets give (or at least used to give) good information on this. You could readily see (once you learned to interpret the aforementioned exposure / density graphs) how using different developer temperatures, with correspondingly changed development times, affects overall contrast.
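To make the time/temperature trade-off concrete, here is a small sketch. It assumes an illustrative rule of thumb that development time roughly halves for each ~8°C of warming; that halving interval is my assumption for demonstration, not a manufacturer's figure. For real work, use the time/temperature chart in the film or developer data sheet.

```python
def adjusted_dev_time(base_time_min, base_temp_c, new_temp_c,
                      halving_interval_c=8.0):
    """Estimate a development time for a different developer temperature.

    Illustrative approximation only: assumes development time halves
    for each `halving_interval_c` degrees C of warming (an assumed
    value). Manufacturer time/temperature charts are authoritative.
    """
    factor = 0.5 ** ((new_temp_c - base_temp_c) / halving_interval_c)
    return base_time_min * factor

# Example: a film rated 10 min at 20 C, with the developer warmed to 24 C
print(round(adjusted_dev_time(10.0, 20.0, 24.0), 1))  # about 7.1 min
```

As the posts above note, a time shortened this way matches only one density point; the rest of the curve (and hence the contrast) can still differ between the two temperatures.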
You change the temperature by adding heat or cold. The effect is that when you warm the chemicals, reactions happen faster; likewise, when you lower the temperature, reactions happen slower.
Thanks for the responses!