Everything posted by paddler4

  1. Also, lots of files related to the OS could be out of date or corrupted. You could run the Windows System File Checker and let it repair what it finds: https://support.microsoft.com/en-us/topic/use-the-system-file-checker-tool-to-repair-missing-or-corrupted-system-files-79aa86cb-ca52-166a-92a3-966e85d4094e. That might not help, but it's easy enough to try.
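     For reference, the command the linked article walks you through is run from an elevated Command Prompt: sfc /scannow. If SFC reports damage it can't repair, the usual follow-up is DISM /Online /Cleanup-Image /RestoreHealth, then running SFC again.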
  2. My old Windows machine became cranky because the graphics card was not sufficient for current Adobe programs. I upgraded to a new computer with Windows 10 in June 2022 and stayed on it until September 2023 because that was still what my university was using. I had zero problems with Adobe under Windows 10 after getting a new computer. The transition to Windows 11 was seamless.
  3. You might get more answers in the wet darkroom forum. This is the forum for digital darkroom work.
  4. It depends on what you shoot. For me, having the 70-105mm range and the IS trumps the difference in optical quality. I had both the EF versions and agree that the II is not greatly better than the I, but it did avoid zoom creep. I now use an RF 24-105, and it's one of my two most used lenses.
  5. In this case, more than good enough, if it's in mint condition. The original 70-200 f/4 L was a superb lens.
  6. Given how easy uninstalling and reinstalling Adobe products is now that it's a subscription model, sometimes the simplest thing to do with a problem that won't go away is to uninstall and reinstall.
  7. By 1:4, do you mean f/4? If so: I've owned both the first-generation and the second-generation 70-200 f/4 L lenses. Both are excellent. The current II is truly a superb lens. In fact, I bought it because there was a rumor that it was one of the EF lenses that would be discontinued, and I wanted to buy one before they disappeared. That apparently didn't happen; they are still available at retailers. It's so good that I kept it when I switched to mirrorless and use it with an RF adapter. It is NOT the case that the f/2.8 is a better lens. It's just one stop faster. And the cost of that, when I bought mine (the RF specs are different), was that the f/2.8 was twice the price, twice the weight, and a lot bulkier. On a telephoto, I never need the slightly narrower DOF f/2.8 offers, and in the very rare cases where I need the extra stop, I just boost ISO by one stop. On modern cameras, a one-stop increase in ISO is not a big deal. So for me, the f/4 was clearly the superior choice. I've had one or the other of these lenses for a long time, probably well over a decade, and I've never once regretted not buying the f/2.8.
  8. An electronic, non-global shutter can create rolling-shutter artifacts when the subject is fast-moving. The point of a global shutter is to allow a fully electronic shutter without rolling-shutter artifacts. I don't have this problem, but it can also be avoided by using electronic first-curtain shutter (EFCS), which is fast enough for anything I do. See https://photographylife.com/mechanical-electronic-shutter-efcs
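     To put rough, illustrative numbers on it (made up for the example, not specs for any particular camera): if a sensor's electronic readout takes about 1/60 s and a subject crosses the frame at 1,200 pixels per second, it shifts 1200 / 60 = 20 pixels between the time the first row is read and the time the last row is read, which is what produces the slanted verticals. Mechanical and EFCS exposures scan the frame in a few milliseconds rather than tens of milliseconds, so the skew becomes negligible.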
  9. I don't know why people are getting so excited by this. I think the article's title is foolish and misleading. A global shutter will change photography for a very, very small subset of photographers, and for them, this is a big deal. For the rest of us, it provides no benefit at all and exacts a price in other aspects of image quality. I have zero use for a global shutter, and while I know a lot of photographers, I can't think of one who would benefit from this. For people who photograph car races and the like, maybe. And if you read about this, you'll see that Sony had to make other compromises to accomplish this. Bottom line is that if this were offered as an option for the camera I recently bought, I wouldn't have bought it. Remember what Ansel Adams said: the most important piece of photographic equipment is the 12 inches behind the viewfinder.
  10. Certainly true. Lots of Windows and Linux machines don't match Adobe requirements. I had to specify a GPU when I ordered my current desktop. OTOH, Adobe is virtually the only software vendor for which I have ever found that to be true, and I switched away from Apple after the Apple II. The only other example I can think of is that one of the software packages I use holds your entire database in memory and therefore required specific amounts of memory to handle large databases, but that had nothing to do with the PC architecture. People in my group ended up using a compute server farm running Linux rather than their Macs or PCs for really big jobs.
  11. I can only speak from my own experience, and that hasn't been my experience. I can't recall the last crash I had with LR Classic (always kept up to date) on either my desktop or my laptop. Neither is super high-end, but both are only a few years old, and the desktop meets Adobe's listed requirements. It seems to me that one has to take market share into account, as Robin noted. Also, a lot of the PC GPU problems are from people who haven't checked that their machines meet Adobe's requirements.
  12. I'd post this in the digital darkroom forum. The Dog is the person to give you the best answer. However, not being an expert, I'll take a stab at it: 1. Yes. 2. I suspect this depends on the monitor, but I would calibrate it to sRGB, which is a standard, unlike the characteristics of your particular monitor. I use a NEC wide-gamut monitor, the same model Dog uses, and NEC advises first calibrating the monitor and then using the full native gamut. I think that's because a printer's gamut isn't exactly Adobe RGB, and the idea is to get the best possible rendition and then let the ICC profile and soft proofing narrow it if need be. But really, ask Dog. He's the expert.
  13. I don't take huge numbers of photographs with my digital cameras, but I take a lot more than I did with my film cameras. And that's a good thing. I feel free to experiment and take chances without worrying about cost or the number of frames I had left in my camera or my bag. And I can essentially change "films" with the press of a button. (No more stopping everything to take out a changing bag, carefully rewinding just enough, inserting another roll, etc.) The complexity of modern digital cameras is unavoidable: the manufacturers have to spread the costs of development over a wide variety of potential buyers. I don't take videos with my digital cameras, so all of the increasingly sophisticated video capabilities are wasted. I don't even understand some of them. Doesn't bother me in the slightest, because I simply ignore those features. They never get in the way of doing what I want to do. The only time I find the complexity a nuisance is when I buy a new camera and have to spend a few hours figuring out how it works, how to customize it for my needs, and what I can ignore. What's omitted from this discussion, if I'm not missing something, is the vastly greater control digital gives us. I can do things that I was incapable of doing with analog (e.g., focus stacking), and I can do others vastly more easily and better (tonality adjustments, sharpening, color adjustments, yada yada). I loved my old FTb, which I still have, but I haven't shot a single roll of film since I bought my first DSLR years ago. Re people who use digital to shoot far too many images: not my problem. Unlike, say, the many people who misuse the capabilities of cars, they pose no danger.
  14. Doesn't take any math to recalibrate a monitor. Just a little bit of patience. I use an X-Rite, but it has the same issue: by default, it prompts me too early. In the days of CRTs, calibration apparently went off kilter fairly quickly. It doesn't seem to with modern displays, at least the ones I've had. So, I set the software to remind me less often, and then I ignore it until I'm ready to spend the time doing it.
  15. That's what I have, and I love it, but they are hard to find--discontinued quite a long time ago. Sharp bought NEC several years ago, and they have substantially reduced the number of wide-gamut monitors they sell. John--You didn't say what your goal is. In particular, do you print a lot? If you do (I do), then a wide-gamut monitor like the NEC is a real help. If you display only online, I don't think you'll find it much of an advantage, since most monitors that people viewing your photos will use are at best sRGB. If you want wide gamut and don't want to pay Eizo-level prices, you might want to look at BenQ. From what I have read and from the reports I've had from users, their quality control has sometimes left something to be desired, but if you get one that's working properly, they are very good for the price.
  16. Not my field, but my impression is that Linux is primarily used for cloud infrastructure, which I think it dominates, and for heavy-duty professional work, e.g., statistical analysis of huge datasets, and not all that much for personal computing. I just looked up a couple of the 2023 estimates for desktop market share: Linux 3.2% and 2.8%; macOS 21.2% and 14.6%; Windows 68.9% and 70.4%. So not a lot of reason for a company like Adobe to put money into recoding its suite for Linux. The workaround I noted earlier ran via a Windows emulation under Linux, not native Linux.
  17. I looked, and there are workarounds to get Adobe products to run under Linux, e.g., https://www.maketecheasier.com/install-adobe-creative-cloud-linux/. However, it's nothing I would want to do.
  18. A very sensible article. I think he could go further. The latest craze, global shutters, isn't just absolutely useless for most of us; it's actually detrimental because of its negative effect on DR. Whether any of the new doodads are worthwhile depends on what you do. If you do landscape photography or urban night photography or studio portraits, AI subject detection is useless. If you do candids of kids, which I do, it's actually useful because they don't stay still. I have a much higher keeper rate for candids with my R6 II than I had with my 5D IV. However, for most other types of photography I do, that feature is worthless. His point about the silly resolution race is spot on. Unless you print large or crop severely, it doesn't much matter how far above 20 MP you go. I have printed up to 17 x 22 with cameras that have 22, 24, and 30 MP, and not a single person has ever commented that the photos lack detail. I did exhibit one 11 x 19 that was an 8 MP crop from a 12 MP image, and that one really did take a lot of postprocessing work, but the only comments I've received about the final product have been very positive. The rule I have tried to enforce on myself--I have to admit, I have enough GAS that I have at times violated it--is that if I can't say what a new piece of gear will allow me to do appreciably better, I shouldn't buy it. Ricochetrider: you must be using an ad-vulnerable browser without an ad-blocker extension. I read the column in Vivaldi with no ad blocker added, but with Disconnect.me installed, and the page was quite clean.
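      To put rough numbers on the resolution point: a 24 MP sensor is about 6000 x 4000 pixels, so a 17 x 22 print works out to roughly 6000 / 22 ≈ 273 pixels per inch on the long side, comfortably above the ~240 ppi often cited as plenty for prints viewed at arm's length. You'd have to print much larger or crop heavily before pixel count became the limiting factor.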
  19. That's a plus. It's easy enough to handle in Windows, but I've had to explain it to people many times. I don't think so, but I never tried. I used Linux primarily for heavy-duty statistical work (it was the front end of a large compute server network) and related tasks, never for photo editing.
  20. Re OS upgrades: Indeed, it used to be a pain with Windows and much easier with Macs. However, things are changing. My university finally gave the green light to upgrade from Windows 10 to Windows 11, and it was almost entirely seamless. I had to do almost nothing. I think I had to reinstall one of my three printers, and there may have been one or two other tweaks, but almost everything worked just as it had before. I was surprised and delighted. There are still things that are much easier with Macs, like changing to a new computer. Even if you aren't a power user, there are some differences in interface that are initially a PITA whichever way you change. For example, I had three GUIs to deal with: Windows, my wife's Mac, and Linux Gnome. Window management works differently in all three of them. Not a big deal, but I spent some time fumbling when the actions that were automatic for me didn't work. A few weeks, however, and that problem vanishes. There may be one thing about using Adobe products that is different between the two platforms, but a Mac user would have to say. In Windows, at least with Canon printers, you access the printer driver's settings via the Properties link in the Windows print dialog. For example, if you want to avoid double profiling, you have to turn off the printer's control of colors, and that's done via the print dialog. I don't know if that's the same on a Mac.
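      Concretely, that means choosing "Photoshop Manages Colors" (or the equivalent in Lightroom) and the paper's ICC profile in the application's print dialog, then opening the driver's Properties and setting its color matching to None/Off. If both the application and the driver manage color, you get the double-profiled, typically oversaturated prints people complain about.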
  21. GPUs are only one issue. The web was full of people complaining about initial compatibility problems with at least four macOS updates. The sole OS-related update problem I've had under Windows was the need to specify a better GPU when I bought my most recent computer. But I think this is not really the core point. Most people using Adobe products will be fine on either OS, with the occasional annoying hiccup. And inside the Adobe programs, there is essentially no difference, apart from things like print dialogs. So my advice is to consider the hiccups only one factor in making the decision. Others are the cost of learning an unfamiliar OS if one changes, the cost of the gear itself, and the availability of other software for each OS. These will lead to different decisions for different people. For me, given that I'm an OS power user, the issue of changing OSs looms large. If I were on a Mac now, I would be very resistant to changing to Windows, just as I am now very resistant to changing from Windows to the Mac. But different strokes for different folks.
  22. I don't care either, but search <adobe photoshop problems mac os> without quotes. Not a short list.
  23. Buying a brand-name GPU that meets their specs is all that is needed. Other than having to specify that, I have had exactly zero problems running Adobe software on three or four different Windows machines. I have never had driver problems related to Adobe.
  24. Yes. Adobe has on its website the GPU requirements to get things like the neural filters to work. I have an Nvidia GeForce GTX 1660 Super, which is not super high-end and works flawlessly with Adobe products. If you don't have a GPU that meets their requirements (my old computer didn't), then you will find, as I did, that some features won't work. You can find the requirements spelled out in detail at https://helpx.adobe.com/photoshop/system-requirements.html. I simply specified a Dell computer that met or exceeded the specs. Years ago, my wife brought home a first-generation MacBook Air. I was blown away--that is, until I tried using it for a while and started confronting all of the little odds and ends I rely on that I would have to re-learn, and for some of which I would need different software. Simple things, most of them, but they take time. And solving problems when they arise is a whole additional thing to learn. E.g., when a print driver installation failed on my wife's Mac recently, I was helpless, even though I can generally solve these problems on Windows. I had to call a Mac expert friend, who explained why the default Mac installation wouldn't work in that case and how to fix it. Having spent the time learning my way around Linux and the Gnome GUI for statistical work, I decided that two OSes were more than enough for me. I really think the issue is how much you stray from simple, plain-vanilla use of the OS. The more you rely on the details, the more pain there is in switching. I'm a real "power user", so the cost to me would be substantial. None of which is to say you shouldn't switch. I wouldn't presume to say one way or the other. I am just pointing out a real cost of switching IN EITHER DIRECTION that you ought to add to your calculations.
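      If you want to check what Photoshop is actually seeing, Preferences > Performance shows the detected graphics processor and whether it's enabled, and Help > System Info produces a report that lists the GPU and driver. That's usually enough to tell whether a missing feature is a GPU-requirements problem.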