DVI (and the digital part thereof) is the superior technology for LCDs. As stated before, there's no digital-to-analogue conversion (or vice versa) to worry about (unless your monitor converts the DVI signal to analogue and feeds that into its analogue receiver... but that would be silly).
There should be no loss on DVI due to the cable unless you have a seriously awful cable that's picking up some strong interference. The analogue receiver has to work out whether the received voltage was [0V, 0.05V, ... 0.75V, 0.8V, 0.85V, ... 5.0V] (the actual voltages are probably quite different, but you get the idea), whereas the digital receiver just has to decide whether it was 0V or 5V. It's like DTV - you either get the picture or you don't. (edit: okay, there's a small region where you get the picture but with massive corruption) It's a hard-fail system - until it fails it should work just fine.
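Here's a toy Python sketch of that noise-margin idea, if it helps - the voltage levels, thresholds and noise figure are all made up for illustration and have nothing to do with the real TMDS or VGA electrical specs:

[CODE]
import random

# Toy model only: made-up voltages and thresholds, not the real
# TMDS/VGA electrical specs - just to show the noise-margin idea.

def send_analogue(level, noise):
    """Encode one of 256 brightness levels as a voltage in 0..5V, plus noise."""
    voltage = level / 255 * 5.0
    return voltage + random.uniform(-noise, noise)

def receive_analogue(voltage):
    """Receiver has to pick the nearest of 256 closely spaced levels."""
    return round(max(0.0, min(5.0, voltage)) / 5.0 * 255)

def send_digital(bit, noise):
    """Encode a bit as 0V or 5V, with the same noise on the cable."""
    return (5.0 if bit else 0.0) + random.uniform(-noise, noise)

def receive_digital(voltage):
    """Receiver only has to decide: above or below 2.5V?"""
    return 1 if voltage > 2.5 else 0

noise = 0.3   # volts of random interference on the cable
level = 200   # original brightness value

recovered = receive_analogue(send_analogue(level, noise))
print("analogue:", level, "->", recovered)   # usually off by a few levels

bits = [1, 0, 1, 1, 0, 0, 1, 0]
rx_bits = [receive_digital(send_digital(b, noise)) for b in bits]
print("digital: ", bits, "->", rx_bits)      # identical until the noise gets huge
[/CODE]

Run it a few times: the analogue value usually comes back off by a few levels, while the digital bits come back untouched until the noise is enormous - which is exactly the "works perfectly, then falls off a cliff" behaviour.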
There's also probably some truth in the statement that a VGA-only monitor will have a better analogue picture than a DVI-and-VGA-capable monitor in a similar price bracket.
Besides, most graphics cards I see these days have only DVI out and provide a DVI-VGA adapter for those stuck with old tech. It's a lot easier to set up - no messing about with weird blurry patches on the screen, as it sends the right number of dots' worth of information.
On my monitor the DVI input scales far, far better than the VGA one, i.e. if I send out a signal at 1280x800 the DVI picture is a little fuzzy, as you'd expect (kind of like free AA in games!), but the VGA image is horrible. They probably expected me to use DVI...
Get DVI. No point not to nowadays.
A Belinea 10 15 30 (or whatever the number is - the non-DVI one) is probably better than an el-cheapo no-brand DVI-capable monitor; that was the inference, I think.
DVI is useful, but not at any cost, and panel quality is sometimes more important than interface.
In theory a DVI-only monitor should be cheaper than a Dsub-only monitor, which should be cheaper than a Dsub & DVI monitor, assuming all other components are the same. DVI is like PCIe in that it currently makes sense to get PCIe even if there's little performance improvement to be had with identical GPUs. If you have a DVI-out graphics card you might as well get a DVI-in monitor.
Unfortunately I think it's a case of the manufacturers knowing they can still bloat the prices because DVI is "better".
Buy your monitor based on the quality of the panel and also on what other features (USB hub, S-Video in, speakers, iPod dock, whatever) you want/need it to have. A cheaper monitor probably has a cheaper panel and/or AA chip for sub-native resolutions - do your research for a panel that suits you. Chances are the better monitors will have DVI inputs (as well as, or instead of, Dsub), though there are a few nice panels out there that are still Dsub only.
So in summary: the panel and scaling tech are probably more important than whether you use Dsub or DVI; however, DVI is a better and more stable way to transmit the image data. Whether it looks better on the same panel only you can decide. Personally I see a lot of gfx cards without Dsub these days - it follows that DVI will become the dominant technology (and once the shops stop ripping us off with £30 DVI cables it should be cheaper too!)
Oh I agree, I just assumed by 'good Dsub' they meant 'good Dsub monitor', not the interface.
A DVI port can be single link or dual link; a normal dual-DVI graphics card will have two dual-link DVI ports on the card. Dual link means that the one plug has two sets of data lines, so it can run at a higher data rate.
A single link will do WUXGA (1920 × 1200) @ 60 Hz
Whereas dual link will do WQXGA (2560 × 1600) @ 60 Hz - there's a quick back-of-the-envelope sum below the link if you want to see why.
http://en.wikipedia.org/wiki/Image:D...ctor_Types.svg
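If you want to sanity-check those numbers, here's a rough sum in Python. The 165 MHz single-link pixel-clock cap comes from the DVI spec; the blanking totals are approximate reduced-blanking figures I've assumed, so treat the exact MHz values as ballpark only:

[CODE]
# Rough back-of-the-envelope check of why WQXGA needs dual link.
# 165 MHz per link is the DVI spec limit; the horizontal/vertical
# totals below are approximate reduced-blanking figures (assumed).

SINGLE_LINK_MAX_MHZ = 165.0                  # one TMDS link
DUAL_LINK_MAX_MHZ = 2 * SINGLE_LINK_MAX_MHZ  # second set of data lines doubles it

def pixel_clock_mhz(h_total, v_total, refresh_hz=60):
    """Pixel clock = total pixels per frame (incl. blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

modes = {
    "WUXGA 1920x1200 @ 60 Hz": (2080, 1235),  # approx. reduced-blanking totals
    "WQXGA 2560x1600 @ 60 Hz": (2720, 1646),
}

for name, (h_total, v_total) in modes.items():
    clk = pixel_clock_mhz(h_total, v_total)
    if clk <= SINGLE_LINK_MAX_MHZ:
        verdict = "single link OK"
    elif clk <= DUAL_LINK_MAX_MHZ:
        verdict = "needs dual link"
    else:
        verdict = "too fast even for dual link"
    print(f"{name}: ~{clk:.0f} MHz pixel clock -> {verdict}")
[/CODE]

WUXGA comes out around 154 MHz, which squeaks under the single-link cap; WQXGA needs roughly 269 MHz, which is why it has to go over dual link.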
Analogue will only ever approach the quality of DVI; it will never be able to beat it.