Discussion in 'Digital Darkroom' started by herma, Jan 27, 2012.

  1. I am still recovering from a motherboard failure that happened in December. It's been weeks and I am still re-installing software and re-assembling all the parts: 2 monitors, multiple external drives, etc., a major PITA.
    Last year I bought a Dell U2410 as my main monitor (resolution is now 1920x1200). My new computer (HP Pavilion HPE) lets me use an HDMI cable (one that supports 1080p) to connect the monitor, whereas before I had to connect with a DVI-D cable. (I don't know if that makes a difference.)
    My new system has an NVIDIA GeForce GT 520 graphics card with CUDA (whatever that means), and I am using Windows 7. I am also using an 8-year-old Dell 2001FP as my second monitor, connected with the DVI-D cable. I have a Spyder 3 and I am ready to start calibrating the monitors on my new system.
    My observation before even starting the calibration is that the colors on my 8 year old Dell are so much more pleasing than on the new U2410. As a preset on the U2410 monitor, I've picked Adobe, because it looked the best of all presets. I have turned the brightness down to 25%, because I remember those settings from my old system.
    I have a test image file and a test print from Mpix to help me find a starting point. I am trying to make my monitor match the test print as much as possible, but it is really hard. I hate the way the U2410 displays the test image. The colors are washed out and the blacks are too wimpy. Now I really don't know where to start.
    I assume that just running the Spyder 3 software and calibration on the baseline ugliness that is being displayed is not going to make the colors on the U2410 look good.
    Any tips would be appreciated. I would not know where else I could get some ideas.
  2. I'm thinking you should calibrate the monitors first, and then compare. No two monitors right off the shelf are going to look the same...
  3. The U2410 is an excellent monitor, used by many pro photographers I know, and by some retouchers I know as a second monitor or even as their primary one... Until calibration, most if not all monitors look like s***, so start by using a good calibration device and judge afterwards.
    Also, LCD monitors show less vibrant color than the older CRTs; the glass covering a CRT makes everything look more vibrant, let's say, but far from a real-life print if not toned down.
    Don't forget to do this before calibration:
    1. Turn off / remove Adobe Gamma from your system.
    2. Put your new monitor back to its factory defaults.
    3. Install any drivers you have BEFORE connecting anything.
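    For what it's worth, Adobe Gamma (step 1 above) works by loading a tone-curve lookup table into the video card, which is also roughly what a hardware calibrator like the Spyder does with measured data; that is why a leftover Adobe Gamma entry fights with the calibrator. As a rough illustration only (not any tool's actual code), a simple gamma-2.2 lookup table can be computed like this:

```python
# Illustrative sketch of the 8-bit gamma lookup table a calibration
# tool loads into the video card (NOT Adobe Gamma's actual code).

def gamma_lut(gamma=2.2, levels=256):
    """Map each 8-bit input level through a power-law encoding curve."""
    lut = []
    for i in range(levels):
        normalized = i / (levels - 1)            # 0.0 .. 1.0
        corrected = normalized ** (1.0 / gamma)  # encode for a gamma-2.2 display
        lut.append(round(corrected * (levels - 1)))
    return lut

lut = gamma_lut()
print(lut[0], lut[128], lut[255])  # prints: 0 186 255
```

    The midtone (128) being pushed up to 186 shows why a wrong or doubled-up gamma curve visibly washes out or crushes an image.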
  4. "My new computer (HP Pavilion HPE) lets me use an HDMI cable (one that supports 1080p) to connect the monitor, whereas before I had to connect with a DVI-D cable. (I don't know if that makes a difference.)"

    I have an HP Pavilion HPE and had spent months with HP customer service trying to figure out what was wrong with the computer, since it kept crashing. Finally, after about 4 months of haggling, I sent it back to HP. They replaced the motherboard and everything is fine now, though once in a while it still crashes. According to HP, something in Windows 7 is causing some type of incompatibility, and it will cost $150 to upgrade. Yeah, right!
    I have an NEC 221W monitor that is connected to the computer by an Analog D-sub 15 pin cable. I tried using the DVI connection, but that did not seem to work on my computer. My monitor did not come with an HDMI cable as far as I know, so maybe I need to buy one. I even called HP customer service to get some instruction and they told me to use the blue 15 pin cable.
    The resolution on my screen is very accurate, but I like the smoothness and the vibrancy of my 7-year-old Dell CRT a little bit better. Although my monitor came with SpectraView calibration software, I didn't have to do much calibrating. Right out of the box the colors were fine, so you could be having another issue, because your colors are way off.
  5. Both HDMI and DVI carry digital signals, and trying to use an analog D-sub (VGA) socket and cable without a proper converter is a futile activity.
    This can be confusing, since some vendors sell direct VGA-to-DVI or VGA-to-HDMI cables, to the despair of owners after purchase.
    The HDMI outputs that I have encountered topped out at 1920x1080; no higher resolution was provided. HDMI seems to be limited and tailored to the HDTV standard: Blu-ray players and the HDMI inputs on the latest large LCD HDTVs.
    From DVI, or even from the old analog VGA output, you can usually configure the video adapter to allow much higher resolutions, if your monitor is capable of them. That is something you might not get out of the HDMI interface.
    Since HDMI and DVI use digital signal formats, there are simple adapters that let you use an HDMI cable on a DVI output socket.
    There were recent complaints that some of the newest laptops provide only an HDMI socket, with no VGA or DVI socket at all.
    Even though their built-in NVIDIA or ATI graphics are capable of much higher resolutions, the lone HDMI socket is all they get.
  7. I can't really help you much with the look of your screens, other than to note that every time I've changed video cards and/or monitors, what I see on the screen looks at least slightly different (assuming I've made some manual adjustments, possibly guided by something like Adobe Gamma but not something as sophisticated as a proper profiling product). But I can confirm that what you're seeing is not a result of HDMI vs. DVI-D. HDMI quite intentionally started out using the same electrical and digital specs for its video as DVI-D used, so a simple adapter that has the appropriate mapping between input and output pins can convert between the two for typical computer display purposes*.
    Given a hypothetical video card with both DVI-D and HDMI outputs, and a hypothetical monitor with both DVI-D and HDMI inputs, you should see the same thing on screen regardless of whether you connect the two via DVI-D, HDMI, or a mixture of the two with an appropriate adapter.
    *: There are areas where one standard supports something the other doesn't. The most obvious one is that HDMI supports audio while DVI doesn't. Both standards support many resolutions beyond 1080p, but not necessarily in ways that are compatible with each other, so if you have a video card and a monitor that both support ultra-high resolutions but use different connectors, you may be stuck. But for up to 1920x1080 or 1920x1200 output from a computer to a computer monitor, they're essentially interchangeable.
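    The "resolutions beyond 1080p" point can be sanity-checked with back-of-envelope pixel-clock arithmetic. Single-link DVI is specified at a 165 MHz maximum pixel clock; the blanking overheads in this sketch are rough assumptions, not exact VESA timings:

```python
# Back-of-envelope check of whether a video mode fits within
# single-link DVI's 165 MHz pixel-clock limit. The blanking
# overhead factors are rough assumptions, not exact timings.

SINGLE_LINK_DVI_MHZ = 165.0

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead):
    """Approximate pixel rate including blanking intervals."""
    return width * height * refresh_hz * blanking_overhead / 1e6

# Conventional (CRT-style) blanking adds roughly 35%; the
# "reduced blanking" timing used for LCDs adds roughly 12%.
for name, overhead in [("conventional", 1.35), ("reduced blanking", 1.12)]:
    clock = pixel_clock_mhz(1920, 1200, 60, overhead)
    fits = clock <= SINGLE_LINK_DVI_MHZ
    print(f"1920x1200@60, {name}: {clock:.0f} MHz, fits single-link: {fits}")
```

    This is why 1920x1200 over single-link DVI works in practice only with reduced-blanking timing, and why higher resolutions push you onto dual-link DVI or newer connectors.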
  8. Harry, it's funny you should mention the crashing. This new computer is indeed crashing quite frequently. From Windows "trying to find a solution" to a few blue screens of death.
    Another interesting find: through Windows > Settings > Screen resolution, the highest option was only 1920x1080, leaving me with an unused strip at the top and bottom. I was only able to change the resolution to 1920x1200 in the NVIDIA graphics card settings. Problem solved.
    Then I did what Patrick told me (other than the removing-Adobe-Gamma bit), reset the monitor to its ugly factory defaults, and ran Spyder 3: it is SO much better! I even Spydered the ol' Dell 2001, and now the test image on the 2 monitors and the print itself are pretty darn close.
    The only question I now have is: since the HDMI only supports 1080, what happened when I set it to 1200? Did I lose some quality? Or is that all a wash anyway?
  9. Herma, re the dreaded BSOD and other failure issues, check to make sure the RAM DDR modules are properly seated in their sockets...
    "The only question I now have is: Since the HDMI only supports 1080, what happened when I set it to 1200?"
    Depending on your monitor's settings, there are most likely 2 cases, and neither will please you.
    1. The picture will be stretched over 1200 lines, and the monitor will not work at its best, with a non-native pixel resolution projected on it.
    2. You will see dark bands of about 60 lines at the top and the bottom of the screen, but the picture in the center will be much sharper.
    On monitors capable of 1200 lines I do not use HDMI; I use the good old VGA output socket, which my Vaio laptops also have. I play back Blu-ray videos via HDMI on a large HDTV, and watch pictures and home movies via the HDMI connector directly from a camera or camcorder.
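    The arithmetic behind those two cases (a 1920x1080 signal on a 1920x1200 panel) is simple enough to sketch:

```python
# Arithmetic for displaying a 1920x1080 signal on a 1920x1200 panel.

panel_h, signal_h = 1200, 1080

# Case 2: no scaling -> the leftover lines split into two dark bands.
band = (panel_h - signal_h) // 2
print(f"{band} unused lines each at top and bottom")  # prints: 60 unused lines ...

# Case 1: stretch to fill -> each source line must cover more than
# one panel line, so pixels get interpolated (softened).
stretch = panel_h / signal_h
print(f"vertical stretch factor: {stretch:.3f}")  # prints: ... 1.111
```

    Either way, feeding the panel its native 1920x1200 (as the poster eventually did via the NVIDIA settings) avoids both the bands and the interpolation blur.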
  12. Here is what my manual says about HDMI, VGA, and DVI. Currently I'm using VGA, but now I'm wondering if I should be using DVI. According to the manual, VGA gives you more customizing options.
  13. For an LCD monitor, the best case is when your video adapter can produce the native screen resolution, matching the panel's maximum pixel capability exactly.
    If VGA and DVI both provide that optimal (native-resolution) signal, then perhaps you will not see any difference between them.
    DVI is the newer technology, and some claim it produces a more stable picture with less flickering, but I have not noticed any difference with either.
    The good old analog VGA was always kept updated and can carry the highest resolution the video card can produce, as can DVI (unlike the HDMI implementations).
  14. Simple answer: use the DVI connection at the monitor’s native resolution.
    (Forget about HDMI and VGA.)
  15. Hate to hijack this post, but I just switched the connection to my monitor from VGA to DVI-I. There is not that much of a difference, but the images on my LCD monitor do seem to have improved a bit. The images are a little more vibrant and there is slightly more contrast. The colors are also a bit truer to the way they came out of the camera.
    The reason I can say this is that I just compared all the images in my Photo.net portfolio using my NEC LCD monitor and my Dell CRT. The contrast, color, and vibrance between the two screens seem to match a lot better now, though not by much.
    I wasn't very satisfied before with the comparisons, because the contrast on my CRT was higher than on my LCD by about 1-1/2 stops; now the difference is about 1/2 a stop. I think I'll stick with the DVI-I connection for now, unless I run into problems, in which case I'll revert to VGA. Thanks!
  16. VGA is an outdated analog connection standard. Using it with any current display adds a completely unnecessary digital-to-analog and back-to-digital conversion that can only degrade signal quality.
    DVI keeps the signal digital all the way.
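    As a toy illustration of that degradation (not a model of any real DAC/ADC chain), one can treat the analog leg as small additive noise and requantize back to 8 bits:

```python
import random

random.seed(0)  # deterministic run

# Toy model only: VGA forces digital -> analog -> digital. Treat the
# analog leg as small additive noise (amplitude in units of one 8-bit
# step is an arbitrary assumption), then requantize back to 8 bits.
def roundtrip(level, noise_lsb=0.8):
    analog = level + random.uniform(-noise_lsb, noise_lsb)
    return min(255, max(0, round(analog)))

ramp = list(range(256))
errors = sum(1 for v in ramp if roundtrip(v) != v)
print(f"{errors}/256 gray levels altered by the round trip")
```

    A fully digital DVI link has no such analog leg, so every level arrives exactly as it was sent.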
