
HDMI vs DVI-D



I am still recovering from a motherboard failure that happened in December. It's been weeks and I am still working on re-installing and re-assembling all the parts: two monitors, multiple external drives, re-installing software, etc. A major PITA.

Last year I bought a Dell U2410 as my main monitor (its resolution is 1920x1200). My new computer (HP Pavilion HPE) lets me use an HDMI cable (one that supports 1080p) to connect the monitor, whereas before I had to connect with a DVI-D cable. (I don't know if that makes a difference.)

My new system has an NVIDIA GeForce GT 520 graphics card with CUDA (whatever that means), and I am running Windows 7. I am also using an 8-year-old Dell 2001FP as my second monitor, connected with the DVI-D cable. I have a Spyder 3 and am ready to start calibrating the monitors on my new system.

My observation, before even starting the calibration, is that the colors on my 8-year-old Dell are much more pleasing than on the new U2410. As a preset on the U2410 I've picked Adobe, because it looked the best of all the presets. I have turned the brightness down to 25%, because I remember that setting from my old system.

I have a test image file and a test print from Mpix to help me find a starting point. I am trying to make the monitor look as much like the test print as possible, but it is really hard. I hate the way the U2410 displays the test image. The colors are washed out and the blacks are too wimpy. Now I really don't know where to start.

I assume that just running the Spyder 3 software and calibration on top of the baseline ugliness being displayed is not going to make the colors on the U2410 look good.

Any tips would be appreciated. I don't know where else I could get some ideas.

Thanks.


Link to comment
Share on other sites

The U2410 is an excellent monitor used by many pro photographers I know, and by some retouchers I know as a second monitor or even as their primary one. Until calibration, most if not all monitors look like s***, so start by calibrating with a good device and judge it afterwards.

Also, LCD monitors have less vibrant color than the older CRTs; the glass covering the CRT makes everything look more vibrant, let's say, but still far from a real-life print if it is not toned down.

Don't forget to do the following before calibration:

1. Turn off / remove Adobe Gamma from your system.

2. Put your new monitor back to its factory defaults.

3. Install any drivers you have BEFORE connecting anything.


Link to comment
Share on other sites

<p>"<em>My new computer (HP Pavillion HPE) allows me to use a HDMI cable (one that supports 1080p) to connect the monitor, whearas before I had to connect with a DVI-d cable. (I don't know if that makes a difference) "</em><br /><em></em><br />I have an HPE Pavillion and had spent months with HP customer service trying to figure out what was wrong with the computer, since it kept crashing. Finally after about 4 months of haggling, I sent it back to HP. They replaced the mother board and everything is fine now, but once in a while it still crashes. According to HP something in Windows 7 is causing some type of incompatability and it will cost $150 to upgrade. Yeah right !</p>

I have an NEC 221W monitor that is connected to the computer by an analog D-sub 15-pin cable. I tried using the DVI connection, but that did not seem to work on my computer. My monitor did not come with an HDMI cable as far as I know, so maybe I need to buy one. I even called HP customer service for instructions and they told me to use the blue 15-pin cable.

The resolution on my screen is very accurate, but I like the smoothness and the vibrancy of my 7-year-old Dell CRT a little better. Although my monitor came with SpectraView calibration software, I didn't have to do much calibrating. Right out of the box the colors were fine, so you could be having another issue, because your colors are way off.

Link to comment
Share on other sites

Both HDMI and DVI-D are digital signals, so trying to use an analog D-sub (VGA) socket and cable without a proper converter is a futile activity.

This can be confusing, since there are vendors selling direct VGA-to-DVI or VGA-to-HDMI cables, to the despair of owners after purchase.

The HDMI outputs that I have encountered were good up to 1920 x 1080, but no higher resolution was provided. HDMI seems to be limited to, and tailored for, the HD TV standard: Blu-ray players and the HDMI inputs on the latest large LCD HD TVs.

From DVI, or even from the old analog VGA output, you can usually configure the video adapter to allow much higher resolutions, if your monitor is capable of them. That is something you may not get out of the HDMI interface.

Since HDMI and DVI-D are digital-format signals, there are simple adapters available so that one can use an HDMI cable on a DVI output socket.

There was a recent complaint that some of the newest laptops provide only an HDMI socket, with no VGA or DVI socket at all.

Even though their built-in NVIDIA or ATI graphics are capable of producing much higher resolutions, the lone HDMI socket is all they get.

Link to comment
Share on other sites

I can't really help you much with the look of your screens, other than to note that every time I've changed video cards and/or monitors, what I see on the screen looks at least slightly different (assuming I've made some manual adjustments, possibly guided by something like Adobe Gamma but not something as sophisticated as a proper profiling product). But I can confirm that what you're seeing is not a result of HDMI vs. DVI-D. HDMI quite intentionally started out using the same electrical and digital specs for its video as DVI-D used, so a simple adapter that has the appropriate mapping between input and output pins can convert between the two for typical computer display purposes*.

 

Given a hypothetical video card with both DVI-D and HDMI outputs, and a hypothetical monitor with both DVI-D and HDMI inputs, you should see the same thing on screen regardless of whether you connect the two via DVI-D, HDMI, or a mixture of the two with an appropriate adapter.

 

*: There are areas where one standard supports something the other doesn't. The most obvious one is that HDMI supports audio while DVI doesn't. Both standards support many resolutions beyond 1080p, but not necessarily in ways that are compatible with each other, so if you have a video card and a monitor that both support ultra-high resolutions but use different connectors, you may be stuck. But for up to 1920x1080 or 1920x1200 output from a computer to a computer monitor, they're essentially interchangeable.
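To put rough numbers on the 1920x1200 case, here is a minimal back-of-the-envelope sketch. It assumes the commonly used CVT reduced-blanking timing totals for that mode (the 2080 x 1235 figures below are assumptions, not values read from any particular monitor), and checks them against the 165 MHz TMDS clock cap shared by single-link DVI-D and early HDMI.

```python
# Back-of-the-envelope pixel-clock check for 1920x1200 @ 60 Hz.
# Assumed CVT reduced-blanking totals (2080 x 1235); real monitors may differ.

H_TOTAL, V_TOTAL = 2080, 1235      # active pixels plus blanking (assumed)
REFRESH_HZ = 60
SINGLE_LINK_LIMIT_MHZ = 165        # single-link DVI-D / early HDMI TMDS clock cap

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH_HZ / 1e6
print(f"Required pixel clock: {pixel_clock_mhz:.1f} MHz")             # ~154.1 MHz
print(f"Fits a single link: {pixel_clock_mhz <= SINGLE_LINK_LIMIT_MHZ}")
```

So the link itself has headroom for 1920x1200; whether a given card actually exposes that mode on its HDMI output is a driver question, which is exactly what comes up later in this thread.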

Link to comment
Share on other sites

Harry, it's funny you should mention the crashing. This new computer is indeed crashing quite frequently, from Windows "trying to find a solution" to a few blue screens of death.

Another interesting find: through Windows > Settings > Screen resolution, the highest option was only 1920x1080, leaving me with unused strips at the top and bottom. I was only able to change the resolution to 1920x1200 in the NVIDIA graphics card settings. Problem solved.

Then I did what Patrick told me (other than the removing-Adobe-Gamma bit): reset the monitor to its ugly factory defaults and ran the Spyder 3. It is SO much better! I even Spydered the ol' Dell 2001FP, and now the test image on the two monitors and the print itself are pretty darn close.

The only question I now have is: since HDMI only supports 1080, what happened when I set it to 1200? Did I lose some quality? Or is that all a wash anyway?

Link to comment
Share on other sites

"The only question I now have is: since HDMI only supports 1080, what happened when I set it to 1200?"

Depending on your monitor settings, there are most likely two cases, and neither will please you.

1. The picture will be stretched onto 1200 lines, and the monitor will not work at its best, since a non-native pixel resolution is being projected onto it.

2. You will see dark bands of some 60 lines at the top and bottom of the screen, but the picture in the center will be much sharper and better.
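For what it's worth, the arithmetic behind those two cases is simple. A sketch, assuming a 1920x1200 panel being fed a 1920x1080 signal:

```python
# A 1920x1200 panel receiving a 1920x1080 signal: stretch or letterbox.
panel_lines, signal_lines = 1200, 1080

# Case 1: the monitor scales the picture to fill the panel (non-native, softer).
stretch = panel_lines / signal_lines
print(f"Vertical stretch factor: {stretch:.3f}")            # ~1.111

# Case 2: the monitor maps pixels 1:1 and blanks the leftover lines.
band = (panel_lines - signal_lines) // 2
print(f"Dark band top and bottom: {band} lines each")       # 60 lines
```

Which case you get usually depends on the monitor's scaling setting (1:1 versus fill/aspect).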

On a monitor capable of 1200 lines I do not use HDMI; I use the good old VGA output socket, which both of my Vaio laptops have. I play back Blu-ray videos via HDMI on a large HD TV, and I watch pictures and home movies via the HDMI connector directly from a camera or camcorder.

Link to comment
Share on other sites

For an LCD monitor, the best case is when your video adapter can produce the native screen resolution, exactly matching the screen's maximum pixel capability.

If VGA and DVI both provide the optimal (native-resolution) signal, then perhaps you will not see any difference using either one.

DVI is the newer technology, and some claim it produces a more stable picture with less flickering, but I have not noticed any difference with either.

The good old analog VGA was always kept updated and can produce the highest resolution the video card can, as can DVI (unlike the HDMI implementations).

Link to comment
Share on other sites

Hate to hijack this post, but I just switched the connection to my monitor from VGA to DVI-I. There is not that much of a difference, but it does seem that the image on my LCD monitor improved a bit. The images are a little more vibrant and there is slightly more contrast. The colors are also a bit more accurate to the way they came out of the camera.

The reason I can say this is that I just compared all the images in my Photo.net portfolio on my NEC LCD monitor and on my Dell CRT. The contrast, color, and vibrance between the two screens seem to match a lot better now, though not by much.

I wasn't very satisfied with the comparison before, because the contrast on my CRT was about a stop and a half higher than on my LCD; now the difference is about half a stop. I think I'll stick with the DVI-I connection for now; if I run into problems I'll revert to VGA. Thanks!

Link to comment
Share on other sites

VGA is an outdated analog connection standard. Using it with any current display adds a completely unnecessary digital-to-analog and back-to-digital conversion that can only degrade the signal quality.

DVI keeps the signal digital all the way.
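As a toy illustration of why that extra round trip can matter (purely hypothetical numbers; real-world VGA degradation comes from cable quality, pixel-clock sampling jitter, and analog noise, none of which this models precisely), here is a sketch of an 8-bit pixel level passing through a slightly noisy analog stage and being re-digitized:

```python
import random

# Toy model: an 8-bit digital pixel level is converted to an analog voltage,
# picks up a little noise on the VGA cable, and is re-quantized by the
# monitor's ADC. The noise level is an arbitrary illustrative value.
random.seed(1)

def vga_round_trip(level, noise_lsb=1.5):
    volts = level / 255.0                               # idealized DAC
    volts += random.gauss(0, noise_lsb / 255.0)         # analog noise (assumed)
    return min(255, max(0, round(volts * 255)))         # monitor's ADC

levels = [0, 64, 128, 192, 255]
print([(v, vga_round_trip(v)) for v in levels])
# Some values come back shifted by a code or two; a DVI or HDMI link would
# deliver the original numbers untouched.
```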

Link to comment
Share on other sites
