Everything posted by rodeo_joe1

  1. That's all very well, but the normalisation and XYZ values are geared toward reproducing the mysterious horseshoe. A self-fulfilling prophecy. There are still no absolute units or definition for colour saturation, except that which plonks it somewhere in the horseshoe. Clearly a pointless exercise if you're trying to determine the meaning of the horseshoe itself in some absolute terms. Luminosity is slightly irrelevant, except that it obviously affects the ability of the human eye to perceive and discriminate colour. It's also clear that a monochromatic Laser light must be 100% saturated, yet it still sits within that horseshoe? Illogical, Captain!
  2. Years ago I was asked to produce some B&W slides to be projected as a backdrop to a stage performance. I just shot the exterior scenes on FP4 and developed it normally, then made contact positive dupes onto blue sensitive slow copy film under an amber safelight in the darkroom. IMO the minimal extra work was actually easier than piddling about with an unknown reversal kit that probably wouldn't give good results without some experimenting and several tries. Plus there was more exposure latitude and contrast control with the copy film method. The play's producer was happy with the results anyway.
  3. Experience yes, as below. Suggested camera? Mirrorless and full-frame? That's going to be expensive, even used. How much do you want to spend? I've accumulated quite a collection of 50 to 58mm lenses over the years, and recently decided to find the 'best' one to use on my Mirrorless Sony A7r4. Sadly, not many of them stood up to the test of a 60 megapixel full-frame digital sensor. A Minolta P-F 58mm f/1.4 was about the weakest performer - hardly surprising since Minolta tried to squeeze an f/1.4 maximum aperture out of only a 6 element design, when every other maker uses 7 elements. So if you want sharpness corner to corner, don't expect to get it from an old film-era lens. Expect 'characterful' and a bit (or a lot!) soft round the edges, and maybe with some colour-fringing thrown in for good measure. Here's one of the better examples: A West German Zeiss 50mm f/1.8 Planar (7 element) made for Rollei's venture into the 35mm camera market. Corner resolution shown top, and centre resolution shown below. That's at f/4. Wide open the corners are much softer and a fair bit darker due to vignetting. The Minolta P-F lens is much, much worse, despite it having quite a few online fans. They must like 'characterful' quite a lot!
  4. Many years ago I was looking for Tamron Adaptall 2 adapters with the 'rabbit ear' metering forks. They were like hen's teeth back then, and commanded a premium price because of their rarity. My solution was to obtain a couple of bog-standard Tamron Adaptall Nikon-fit adapters and epoxy glue some thin metal home-made forks to them. They worked just fine. I had some thin gauge stainless steel sheet that I cut and hand filed to the same pattern as Nikon's metering fork, but a bit longer to allow some bonding area. However, the metering forks don't need much strength, so thin aluminium or plastic sheet would probably work just as well. I came across one of those modified adapters recently and the epoxied rabbit ears were still firmly attached. IIRC I had to scrape a bit of the shiny black anodising off the back of the aperture scale to get the home-made SS forks to stick firmly.
  5. Then what use is it? Because colour is a human construct. It does not exist outside of our perception (obviously the electromagnetic radiation that stimulates that perception is real, but concepts like 'red', 'orange', 'purple', etc. are human constructs placed on a very small region of the EM spectrum). Therefore any objective mapping of colour must be limited to what the human eye/brain can perceive. Otherwise it's not 'colour', but only something we can detect artificially through instrumentation.
  6. Thanks for the replies so far. My real interest is in how the perimeter (locus) of the CIE horseshoe was originally arrived at, and in what objective units it's measured or defined. It presumably defines a limit of saturation. How is that saturation defined, and in what units? The hue scale in wavelength is obvious, but again with no objective indication of how refined human differentiation of hue difference is. For example: I suspect that no-one can distinguish a difference in colour between the spectral Sodium D-lines, which are less than 1 nanometer different in wavelength. So how far apart in wavelength do two similar hues have to be before the 'average' human eye can distinguish them? It's this lack of absolute parametric definition that I find curious about the horseshoe. Because without objective units it's just a doodle made up in 1931 - before Lasers, before LEDs, before cheap diffraction gratings, before dichroic filters. Even before the perfection of the Photo-multiplier tube. Basically before any of the tools we'd expect to currently use in colorimetry were available. So does it really define the limits of human vision, or is it just a doodled illustration of it? P.S. I posted this here deliberately to avoid an obnoxious, patronising and insulting intrusion from a certain self-styled expert. Who often fails to provide clear answers.
  7. There's also the word 'saturation' and a large exclamation mark. But I fear the experiment has been completely screwed up by PN's re-coding or compressing of my original PNG files. I downloaded the yellow file above, only to find that the brightness difference had been reduced to one pixel level and the lettering had zero difference in saturation. Hmmff! Anyway, try it for yourself in PS or another editor. Create a rectangle of R180+G180+B0, then select an area and alter only the saturation to 89 or 90% in the HSL boxes. It's difficult, if not impossible to detect that 10 or 11% saturation change. In any colour you care to mix.
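The mixing recipe above can be checked numerically with Python's standard-library colorsys module. One caveat: colorsys works in HSV, which is close to (but not guaranteed identical to) Photoshop's HSB picker, so treat the numbers as illustrative:

```python
# Reproduce the experiment's colour mixing with the stdlib colorsys module.
# colorsys uses HSV, which is similar to -- but may not exactly match --
# Photoshop's HSB boxes, so the numbers are illustrative.
import colorsys

# 'Pure' yellow: R180 + G180 + B0, normalised to 0..1
h, s, v = colorsys.rgb_to_hsv(180 / 255, 180 / 255, 0 / 255)
print(s)  # 1.0 -- fully saturated in HSV terms

# Drop only the saturation to 89% and convert back to 8-bit RGB
r, g, b = colorsys.hsv_to_rgb(h, 0.89, v)
desat = tuple(round(c * 255) for c in (r, g, b))
print(desat)  # (180, 180, 20)
```

The entire 11% saturation drop lands in the blue channel, shifting it by only 20 levels out of 255, which goes some way to explaining why the lettering is so hard to spot.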
  8. As I said before, there's little point in shooting RAW if you're going to convert the file to JPEG at your earliest opportunity and then continue editing it! Keep your files in a high bit-depth and uncompressed format (like TIFF) until all editing is finished. Computer memory and disk space are cheap these days. There's absolutely no need to compress files or reduce bit-depth between edits. The ability to apply virtual filters to a digital colour image during B&W conversion is also invaluable. It's much more flexible than shooting B&W film - even with a box of lens filters at your disposal.
  9. P.P.S. After uploading, I can just detect the lettering in 3 of the rectangles when viewed on my OLED-screen phone. But then I know where to look.
  10. Preamble (skip if easily bored): I must admit to always being puzzled by the CIE 'horseshoe' as to what its perimeter boundary actually means. I always understood it to delineate the limit of human perception of colour - our ability to differentiate different hues or frequencies of colour and their saturation. Now, colour-saturation can be defined in several ways. As a starting point: a monochromatic colour (single frequency - i.e. Laser light) must have 100% saturation, and as the bandwidth of that hue is widened, the colour must become less saturated. Agreed? So what parameter defines the human eye's ability to differentiate colours and colour saturation, and hence the 'horseshoe' perimeter? Is it a simple frequency difference? For example: can we differentiate a 1nm wavelength change? 5nm? 10nm? Or is it the 'dilution' of colour purity by contamination with white light or other frequencies of colour? And what units do we put on that ability to differentiate colours? Because those objective units appear to be lacking in the perimeter of the CIE 'horseshoe'. Anyway, The Experiment: Below are rectangles of 'pure' RGB colours, with words hidden in them. The words are in a desaturated and slightly different hue. The eyedropper tool in Photoshop easily picks up the difference - my eyesight, not so much. So the experiment is simple. Who can see the words? Or even detect the subtle colour variation? I admit to totally failing to detect a difference, but then I doctored the text for that very purpose. However the saturation does vary by 11 percent according to Photoshop's HSL tool. If, as I suspect, nobody can detect the desaturated areas, then that throws doubt on the veracity of that 'horseshoe', which after all was created over 90 years ago in 1931! Because if nobody can detect a 10% variation in saturation within the sRGB triangle, then what is the perimeter of the CIE horseshoe doing being well outside of it? Can anyone explain that apparent anomaly?
FWIW, I have no impairment of colour perception and consider myself to have an excellent ability to differentiate between colours (for a male person). P. S. I hope the PNG files I've uploaded don't get mangled to JPEGs by PN's system.
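As an aside, the relationship between the sRGB triangle and the horseshoe can be put into numbers: chromaticity coordinates (x, y) come from the standard linear-sRGB-to-XYZ (D65) matrix. A rough sketch, with the published matrix values rounded to four places:

```python
# Where the sRGB primaries land on the CIE 1931 (x, y) diagram.
# Matrix is the standard linear sRGB -> XYZ (D65) one, rounded to 4 places.
M = [
    [0.4124, 0.3576, 0.1805],  # X row
    [0.2126, 0.7152, 0.0722],  # Y row
    [0.0193, 0.1192, 0.9505],  # Z row
]

def chromaticity(r, g, b):
    """(x, y) chromaticity of a linear-RGB colour, rounded to 2 places."""
    X, Y, Z = (row[0] * r + row[1] * g + row[2] * b for row in M)
    total = X + Y + Z
    return round(X / total, 2), round(Y / total, 2)

print(chromaticity(1, 0, 0))  # red primary   -> (0.64, 0.33)
print(chromaticity(0, 1, 0))  # green primary -> (0.3, 0.6)
print(chromaticity(0, 0, 1))  # blue primary  -> (0.15, 0.06)
```

Those three corners sit well inside the spectral locus (the monochromatic greens around 520 nm plot at roughly x=0.07, y=0.83), which is precisely the gap between the triangle and the horseshoe that the experiment above is probing.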
  11. Not sure where that information comes from. - Maybe from old Leaf or other MF digital purveyors, desperate to make their 'MF' (not!) very slightly bigger CCD sensors look attractive in the face of competition from much cheaper full-frame CMOS sensors and cameras. Whatever, the practical fact is that most CCD sensors were stretched to defeat noise at 400 ISO, with 1600 ISO being barely acceptable. Modern CMOS sensors eat 1600 ISO for breakfast, dinner and supper without breaking a sweat. I remember accidentally leaving the ISO at 1600 on my D800 and not noticing for a whole day. And that was in the fairly early days of CMOS sensors. Now so-called 'backside illuminated' (snigger) CMOS sensors have upped the ante further by improving the fill-factor - the ratio of light-capturing area to that of interconnect and other non light-sensitive ancillary circuit components. CCD is definitely yesterday's technology, but it helped get us where we are now.
  12. I think the intention of the medium/large format additions here was to provide a specific forum for digital MF/LF, but that's not at all clear. Nor are many of the other categories that have now been moved or created. Basically the taxonomy of the whole site needs a rethink and restructuring, but the opportunity for that seems to have been missed and a lot of sticking plasters applied where drastic surgery is needed! Could I also politely direct the OP here, where they'll find a more visited, lively and more appropriate forum.
  13. Addendum to above after editing window: After using ACR the RAW file remains untouched, except that it has a 'sidecar' file attached with an XMP extension. This file basically just remembers the last adjustments you made in ACR and automatically applies them the next time that raw file is opened. The parameters can be subsequently changed while absolutely no alteration is made to the precious Raw file. I suspect that those same parameters can be transferred to other RAW files by copying and changing the name of the XMP file, but I haven't tried that. Must experiment next time I use ACR and Photoshop! 🤔
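The 'copy the sidecar' idea above can be sketched in a few lines of Python. This is untested against ACR itself; it assumes ACR's convention of pairing DSC_0001.xmp with DSC_0001.NEF by base name, and the filenames are hypothetical:

```python
# Sketch of reusing one ACR sidecar across a folder of raw files.
# Untested against ACR itself; assumes sidecars are matched to raw files
# by base name (DSC_0001.xmp next to DSC_0001.NEF). Filenames hypothetical.
import shutil
from pathlib import Path

def spread_sidecar(reference_xmp, folder, raw_ext=".NEF"):
    """Copy one reference XMP next to every raw file in the folder."""
    ref = Path(reference_xmp)
    for raw in sorted(Path(folder).glob(f"*{raw_ext}")):
        target = raw.with_suffix(".xmp")
        if target != ref:  # don't overwrite the reference itself
            shutil.copyfile(ref, target)
```

In theory, spread_sidecar("DSC_0001.xmp", ".") would then make ACR open every NEF in the folder with the same adjustments - worth an experiment before trusting it on anything precious.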
  14. IMO a JPEG should be a file type of last resort. In other words, do not convert your file to JPEG until you've absolutely finished editing it to your complete satisfaction, and only then if a small file size is absolutely necessary. So, to transfer between DPP and Photoshop, use TIFF, which will preserve a 16-bit, uncompressed, unsharpened image for further editing. JPEG applies automatic lossy compression (which is vulnerable to data corruption) and can only save at 8-bit depth, which limits the tonal and colour adjustments that can subsequently be made. It also applies default sharpening and introduces edge artefacts that damage definition. In short, JPEGs are a bit crap! No, make that very crap. If you transfer files between editors, use a lossless, high bit-depth format like TIFF or DNG. There's still the issue of keeping the colour space compatible between editors, but at least you'll have more than 8 bits' worth of colour to play with. *With JPEGs, once you've baked in sRGB or AdobeRGB there's no going back without some loss of colour fidelity.* If you have any recent version of Photoshop, there's absolutely no reason not to use the Adobe Camera Raw (ACR) plugin that comes with it to open and process your RAW files. ACR handles a wide variety of RAW formats from almost every brand of camera, although it may need updating if your camera is a very recent model. ACR allows many adjustments, like colour-temperature and tint, selection of a neutral grey-point, exposure adjustment and much more, including target colour space and bit-depth - which should be set to 16 bits; there's no point shooting RAW only to use 8 bits! After which the file automatically opens in Photoshop for further editing, printing, saving in multiple different versions or file formats or whatever.
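The 8-bit penalty is easy to demonstrate with a toy round-trip. This is schematic arithmetic, not any editor's actual quantisation pipeline: darken every level by one stop, store the result at a given bit depth, then brighten it back:

```python
# Toy demonstration of why to keep 16 bits between edits. Schematic only --
# not how any real editor quantises. Halve every 8-bit level, store it at
# the given intermediate bit depth, then double it again.
def roundtrip(level8, intermediate_bits):
    scale = (2 ** intermediate_bits - 1) / 255   # 8-bit level -> working depth
    stored = round(level8 * scale * 0.5)         # darken one stop, quantise
    return round(stored * 2 / scale)             # brighten back, re-quantise

lost_8 = sum(roundtrip(v, 8) != v for v in range(256))
lost_16 = sum(roundtrip(v, 16) != v for v in range(256))
print(lost_8, lost_16)  # 128 0
```

At 8 bits, half of the 256 input levels come back changed after just one darken/brighten cycle; with a 16-bit intermediate, every level survives intact.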
  15. Sorry, but in what way do a Sony a850 and a Minolta lens qualify as Medium/Large format? I know this is a recent addition to the DSLR fora, and that whoever restructured the site appears to have no concept of a sensible taxonomy, but let's not make this a dumping ground for random posts.
  16. I've thought that about almost every rangefinder camera I've picked up. Why do they fit such a tiny squint-hole, with an even more minute and difficult to see coloured blob or split rectangle in the middle? The only decently-sized rangefinder I can remember looking through was the one on the monster Mamiya Press camera. Yes, I wear spectacles, but the amount of dioptre adjustment is usually totally inadequate for my unaided eyesight. Or the eye-hole is metal-rimmed and apparently deliberately designed to scratch eyeglasses. Not the case with the EVF on my Sony mirrorless camera, which I can see with crystal clarity thanks to its wide dioptre adjustment. And the Sony body is as small as a good many film rangefinders.
  17. I think you might have your answer right there.
  18. The Agifold nameplate looks as if it covers a cutout in the top-plate. I wonder if the designer(s) left space for a selenium cell and meter? Slight nitpick: AGI was an acronym for Aeronautical and General Instruments, not Industries. I've mentioned owning Agiflex I and II cameras before. The camera bodies were quite well made, but the lens optical quality was barely acceptable. So I'd be interested in seeing what the image quality of that Agifold is like. P. S. I'd almost forgotten also owning an AGI Agiscope enlarger, bought out of pocket money when I was about 13 years old. Apart from an aluminium column, it was a lightweight all-plastic thing - including the non-interchangeable lens! Needless to say, the IQ of the prints wasn't that great either.
  19. The AF-D is actually the 3rd generation of Nikon AF lenses, and given the lack of a hard infinity-stop and short focus throw, it's one of the last lenses I'd choose for astro photography. The Chinese-made AF-D also suffers from rather sloppy build quality and quality control, making the all-metal pre-AI or AI 50mm f/2 Nikkor H a much better choice. The manual focus AI-S 50mm f/1.8 Nikkor is also pretty good at f/2 and smaller apertures. The AI or AI-S f/1.4 Nikkor S-C has even better corner definition and a flatter field - not at f/1.4 though, it needs stopping down to f/4 for near-optimum corner-to-corner definition. Better yet, if you can find one, is an MC ('green' coated) Chinon 55mm f/1.4 lens. Yes, Chinon! My sample has better field flatness and corner definition, and from wide-open, than any other 50-58mm lens I've yet been able to lay hands on. FWIW Chinon's 55mm f/1.7 version isn't anywhere near so good.
  20. I was being metaphorical with the 'deadly poisonous' bit. I meant deadly poisonous to lenses, shutters and other delicate mechanisms! Should've made that clearer. Even so - spraying it on your joints??? Probably doesn't stop your joints clicking and creaking, but the lingering smell just prevents other people getting close enough to hear it. 🤭
  21. The shutter looks mostly closed in the OP's picture, and what usually stops a closed shutter from tripping the mirror return, etc. is a lack of spring tension. Sometimes you can get to the tensioning 'screws' through the base, sometimes not. One or other, or both of them, may be obscured by the cocking or speed train mechanism. Whatever the case, I don't recommend that a novice start stripping their camera and fiddling with the shutter tension. If indeed that's the fault. Could also be a sticky or dirty shutter track, or any of 5 or 6 other points of failure. The repair is usually fairly trivial - if you know what you're doing and which bits to strip and which to leave severely alone!
  22. I just wonder exactly how many recently-produced negatives get wet-printed at all these days. Because I suspect the vast majority only get scanned and shared online.
  23. Quite frankly, any meter that's voltage sensitive within +/- 0.5v is a very poor design. Disposable batteries are not constant voltage devices, and any electronics designer worthy of the title should take account of that. The Zener diode and other such regulating devices have been around since the mid 1960s.
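To put numbers on that: a shunt Zener reference is a one-resistor fix. A back-of-envelope sketch with purely illustrative values (the 5.6 V Zener, 9 V battery and the current figures are assumptions, not taken from any actual meter circuit):

```python
# Back-of-envelope shunt-Zener regulator sizing -- illustrative values only,
# not taken from any real meter's circuit.
V_Z = 5.6        # assumed Zener voltage feeding the meter bridge
I_METER = 0.0005 # assumed 0.5 mA meter-circuit draw
I_Z_MIN = 0.001  # keep >= 1 mA through the Zener to stay in regulation

def max_series_resistor(v_batt_min):
    """Largest series R (ohms) that keeps the Zener regulating at the
    lowest battery voltage we want to tolerate."""
    return (v_batt_min - V_Z) / (I_METER + I_Z_MIN)

# A 9 V battery is considered exhausted at around 6.5 V; size R for that:
r = max_series_resistor(6.5)
print(round(r))  # 600
```

With that resistor in place the bridge sees 5.6 V whether the battery reads 9 V fresh or 6.5 V nearly flat - exactly the insensitivity to battery voltage that any competent meter design should have.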
  24. Incidentally, I found Ken Rockwell's take on the Zone system, ermm, interesting(?). I especially liked the "Ansel and I.." bit, making it sound as if they were good buddies who worked together. 😂 FWIW, if you are going to use the Zone system, you should know that Zone 8 (described by Adams as 'white with texture', at or below 100% Lambertian reflectivity) is only 2.5 stops above Zone 5, which Ansel swears is equal to 13% reflectivity. Draw your own conclusions from that fact.