Discussion in 'Film and Processing' started by alexandergambino, Oct 30, 2017.
I'm actually OK with someone using the term "analogue" when describing film or film cameras unless they're trying to make some technical argument regarding film based on the idea that it's "analogue".
"Film" is the factual and correct term, but when someone uses the word "analogue" it actually communicates more to me. What they are saying essentially is "not digital". Based on the context in which it was used you can glean more information about their age, attitude towards film or digital formats, etc.
A single digital pixel of maybe 5 microns square can, even at a crappy 8 bits/channel, represent over 16 million colours.
A similar area of film might contain 20 or 30 dye-cloud blobs throughout the emulsion depth, or a dozen or so opaque silver 'grains'.
So which is more 'analogue' and which more 'digital'?
Sorry, but use of the word analogue in conjunction with (especially B&W) film just shows a depth of ignorance, and nothing more.
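The "over 16 million colours" figure a few posts up is just the arithmetic of bit depth: three channels of 8 bits each gives 2^24 distinct values. A minimal sketch (the helper name is mine, purely for illustration):

```python
# Distinct colours representable at a given bit depth per channel.
# colour_count is a hypothetical helper, not any library's API.
def colour_count(bits_per_channel: int, channels: int = 3) -> int:
    return 2 ** (bits_per_channel * channels)

print(colour_count(8))   # 16777216 -> the "over 16 million colours" at 8 bits/channel
print(colour_count(14))  # a typical raw-sensor depth gives vastly more
```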
The only time I refer to anything as "analogue" on my film cameras is when the camera has a moving needle of some sort or another. In that case, the meter has an analogue display.
That's the extent of my use of the word analogue, however.
I think it arose out of the fact that anything not digital is assumed to be "analogue." As Joe hinted at, each photosite on a sensor responds in proportion to the number of photons that hit it. Up until that signal goes through an A/D converter, the signal from a sensor is purely analog.
By contrast, film grains and dye clouds have two states-"there" or "not there." A bunch of them in an area form increasingly more opaque areas in the negative/transparency, which can then either be viewed directly, projected onto a medium where more grains/dye clouds are exposed to give a viewable image on a piece of paper, or looked at by a CCD which views an attenuated light source, gives a signal in response to the amount of light passing, and then gets fed into an A/D converter.
During manufacture and spooling of photographic film, edge printing is applied. This will be a mix of frame numbers, dots and symbols, barcode, logo, and other marks. Edge printing is applied by contact printing a high contrast image onto the film with light. In other words, edge printing is photographically applied to the film, and as the film develops, so does the edge printing. If a finished film is totally clear, with no edge print visible, we can state categorically that the film was misprocessed. Conversely, the fact that edge printing is clearly legible and that the leader is opaque proves the processing was blameless.
I'm not arguing that "analogue" is technically correct but I don't think there's much doubt about what people are talking about when they say "analogue camera". And the fact that they chose the word "analogue" rather than "film", says something else that can go beyond just showing ignorance.
Like it or not, and accurate or not, "analog" has come to mean anything that is not digital in common usage. And it's not like there are a lot of alternative words to choose from. Would you prefer "nondigital"? Or "undigital"? There has to be a word for it, because digital is the new norm. Analog sounds better than obsolete, which is what many people think (not I, so don't go there).
When someone refers to a camera as being analog, they are talking about more than just the physical film. It's an all-encompassing term that includes the whole process from producing film to exposing it to developing and/or printing the results. It includes the mechanical or electromechanical device. It includes the implication that the process is old-fashioned, and it may include a tinge of derision or, conversely, a hint of admiration from those who appreciate craftsmanship. "Film" doesn't convey as much as "analog" because it is so specific to one aspect of the whole.
I see it as both ignorance and somewhat elitist. I can see the logic in saying that "if cameras which record on a piece of silicon are digital, cameras which use film must be analog" but there again that shows ignorance about the process. I see it as elitist because "film" is a word that is unambiguous and has served us well for over 100 years, but it comes across to me as wanting to sound more sophisticated.
I first took an interest in "real" photography(i.e. not snapshots) in 2005. I was in high school, and at the time, even consumer dSLRs(like the Nikon D70 and its Canon equivalent) were $1K or better. Instead, I opted for a Canon A-1, and the money I didn't spend on digital bought a lot of film and processing.
Even though I've had a dSLR since 2010 and now have a well equipped dSLR kit(several bodies including two full frames and bunches of lenses that will work on film and digital) I have never STOPPED shooting film. I shoot 35mm because it's convenient, fun, and I both enjoy farting around in the darkroom and seeing my work "pop" on a light table with Velvia. I shoot larger formats(645, 6x6, 6x7, 4x5) for the same darkroom and "wow" factor, although convenient certainly isn't in my vocabulary for 4x5(I should add the caveat that when I got a Grafmatic film holder, I did call it a big convenience even though I rarely use it).
So, to me, film is film and not an incorrect term that's less convenient to say. The only transparent-backed silver halide medium I've used that's not film is glass plates. I call those glass plates, or just plates.
You'll have to pardon my being pedantic. I'm a chemist, and this is one of those areas where I'm very "sticky" on definitions.
"Analog" at its core refers to something-whether an indicator, reading, or output being continuously variable. An analog clock or watch, for example, has a set of hands which point to scales to indicate the time. As I look down at my watch now, the second hand is around the 30 second mark, and the minute hand is roughly halfway between hour markers. I can read the time time to the nearest half second without much trouble, although it would be very unusual that I would need to do that. By contrast, even though a "digital" timepiece is often associated with electronic displays, there do exist plenty of mechanical digital timepieces. The display the time by looking at a rotating number wheel through an aperture in the dial such that only one number is visible at a time. BTW, the date display on my watch(and most watches) is like this-it doesn't even "wander" but rather "jumps" at 12:00 midnight.
If we want a more sophisticated sounding description, we could say "chemical based photography" or "silver based photography." In fact, I've noticed that the standard phrasing used for a B&W wet print in art galleries now is "silver gelatin"-a somewhat bulky and fancy-sounding name that is none the less an accurate descriptor of the medium.
Sorry Ben, but even your 'analog' clockwork watch probably jumps in quarter or fifth-second intervals. The standard rotary escapement mechanism holds the hands static while it rotates between ticks. At each release the hands jump forward by a minuscule amount, but they don't move smoothly and continuously.
However, I entirely agree that the use of the word 'analogue' in relation to photography shows a degree of pretence, affectation or snobbishness - as well as technical ignorance.
Once every 1/8 of a second(28,800 bph), but who's counting. I can usually see discrete second hand movement on 18,000 bph(1/5 second) movements, and of course also on 16,200 bph(older low grade American watches, 4.5 bps) and 14,400 bph(all but the best English watches, 4 bps). By 28,800 bph the movement becomes more or less continuous, but I can still see the hand "stutter."
(I am also a watchmaker so should have thought of that). With that said, synchronous motor AC clocks more or less move continuously, as does Seiko's spring drive. The Bulova Accutron and related tuning fork movements are close, but still move at imperceptibly small discrete steps 360 times per second.
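The beat rates being traded back and forth are just hours-to-seconds division: a movement running at some beats per hour steps the second hand every 3600/bph seconds. A one-line sketch (the function name is mine, for illustration only):

```python
# Beats per hour -> interval between visible second-hand steps.
# step_interval is a hypothetical helper; rates are the nominal ones discussed above.
def step_interval(bph: int) -> float:
    return 3600 / bph

print(step_interval(28800))  # 0.125  -> the 1/8-second jump
print(step_interval(18000))  # 0.2    -> the 1/5-second jump
```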
Sorry, Ben, but no one except other pedants EVER forgives anyone for being pedantic. And, as I said, it doesn't matter whether you like, or agree with, it, or not. Language is a living thing. It evolves at the whim of society. You can refuse to acknowledge the changes, but they occur, just the same.
The world-at-large is not going to adopt a clunky phrase like "chemical based photography", no matter how accurate it is, when there's a slick, and widely accepted, alternative like "analog". You can rail against it all you like but, the more you do, the more those who've moved on are going to scoff, and the more sure they're going to become that only pedants don't call it analog.
So, there again, why adopt the clunky(and there again factually inaccurate) word "analog" when "film" perfectly conveys the process?
Like I said, walk into pretty much any art gallery displaying black and white prints and you'll see "silver gelatin" listed as the medium. There again, that's both a factually correct and "sophisticated" descriptor of the image.
The usual analog clock now uses a 32768Hz quartz crystal, divided down to one pulse per second, such that the second hand moves only every second. The readout is analog, but the mechanism is digital. There is very little that is continuous when you get down far enough. We think of electrical current as continuous, but it is made up of discrete electrons. A big noise source in digital photography is the small number of electrons that make up each pixel in the sensor array. Yet the output of the sensor is a voltage, to be amplified and fed to an A/D converter. Analog magnetic tape has magnetic grains that are magnetized one way or the other. On the average, they represent the signal level, but individually, they are discrete magnetic domains.
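The 32768 Hz figure isn't arbitrary: it's 2^15, so a chain of fifteen divide-by-two stages brings it down to exactly one pulse per second. A minimal sketch of that division chain (purely illustrative, not anyone's actual circuit):

```python
# 32,768 Hz quartz divided down to 1 Hz by successive halving.
freq = 32768
stages = 0
while freq > 1:
    freq //= 2   # one binary divider (flip-flop) stage
    stages += 1

print(stages)        # 15 divider stages
print(32768 >> 15)   # 1 pulse per second, driving the stepper for the second hand
```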
Silver halide film has an analog readout mechanism, measuring the light transmitted (or reflected) as averaged over some number of grains.
"The world-at-large is not going to adopt a clunky phrase like"chemical based photography", no matter how accurate it is, when there's a slick, and widely accepted, alternative like "analog"."
That might be a valid point, if indeed it was 'the world at large' peddling this nonsense. However, I suggest that the great majority of people still call film photography 'film photography' and would have some difficulty relating that to the phrase 'analogue photography'.
It's only those with a vested interest in promoting the use of film that have introduced and push the use of 'analogue'. Which seems a bit strange and ironic, since on the one hand they want to promote an outdated and inferior medium, but at the same time re-brand it as hip and modern by slapping a newly invented label on it.
A rose (or something less pleasant smelling) by any other name.......
And the emperor's contemporary vestments are still invisible.
I don't know if the term "analog camera" becoming common is the result of some sort of clandestine marketing on behalf of the film industry or not, but if that was the goal, then they've succeeded.
One can choose to be bothered by it, - or not. It struck me as odd when I first started to hear it, in the same way as the expression "analog watch" did. It's strange to hear a different name applied to something you've known by a certain term all your life, but I've moved on. To me an analog watch is just a "watch". I don't think I'd ever say something like "I'm going to wear my analog watch tonight", but I don't care if someone else does.
This particular instance of social engineering isn't something that keeps me awake at night, but it is symptomatic of the way that a few wrong-thinking marketing types or propagandists can influence our use of language, and hence our own thinking.
Language is a powerful tool. Its use should be policed by all of us. And the way to do that is to be selective in what we propagate and allow to fall into our personal usage.
"Resistance is futile" - Believe that and all is lost!
To me it's more like picking your battles. This simply is not that important. Calling a film camera an "analog" camera does not change how it works or the results that you will get. If someone gets into film photography because they think it's trendy like vinyl records then they will figure out soon enough what it's all about and either continue with it or not.
Further it's not realistic to expect your average person to understand how film photography works well enough to know it's not actually "analog", - and most don't care.
Someone from the physics department offered this to me today. I suspect that this may actually be a true analog camera. Unfortunately, given the obvious home made nature of it, I'll have to reverse engineer it to see what-if anything-is needed to get a signal from it.
Hi, I'd say best guess is an NTSC video output. If you note the little indented area around the front of the sensor package, that probably once held a blur filter/IR cut package, which was maybe too thick for this camera (due to the shutter).
Back in the day when the Intel '486 was still king you could have used an AT&T "targa" board (later Cardinal "snap" board) to catch the incoming "frames" or "full fields" and digitize them into a ".tga" file. We used to run a couple thousand of them in our mass market portrait operation, piggybacked onto a film camera, to allow "proofing" on the spot. (Kodak had a special video camera that could momentarily pause video output, illuminate both video "fields" under flash, then resume the interlaced output; the Cardinal board could reassemble them into a single digital image.) It sure seemed complicated at the time; it was brand new technology to the regular photo guys.
I sure wouldn't spend any time on it nowadays. But whatever interests you... without the blur filter, get ready for aliasing out the wazoo!
Ps, I still work from the description that a digitized image has to be both "sampled" and "quantized;" the (presumably) CCD sensor "samples", but that's as far as old-style video went. Conventional photo film doesn't "sample," per se.
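That sampled-then-quantized distinction can be sketched in a few lines: sampling makes the signal discrete in time (or space) while the readings stay continuous; quantization then snaps each reading to a fixed set of levels. All names here are illustrative, not from any real imaging API:

```python
import math

def sample(signal, n):
    """Take n evenly spaced readings of a continuous function on [0, 1)."""
    return [signal(t / n) for t in range(n)]

def quantize(values, levels):
    """Snap each reading (assumed in [0, 1]) to one of `levels` discrete codes."""
    return [round(v * (levels - 1)) for v in values]

# A stand-in "analog" signal: continuous in both time and amplitude.
analog = lambda t: 0.5 + 0.5 * math.sin(2 * math.pi * t)

samples = sample(analog, 8)      # discrete in time, still continuous-valued (the CCD stage)
codes = quantize(samples, 256)   # discrete in value too: fully digital, 8-bit (the A/D stage)
print(codes[0])  # 128
```

By this description, the old video path above stopped after `sample`; the A/D converter supplies the `quantize` step, and film does neither.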