OK, I've got a bit of a nerdy problem.
I've got 3 computers hooked up via a dual-head VGA KVM switch. It needs to be VGA as two of the computers are (very) old, and even if it were possible to find DVI video cards for them, I don't want to fork out massive amounts on a dual-DVI switch.
The problem is that the image from the switch is just slightly smudgy. It might even be the analog input of my LCD displays rather than the KVM; either way, it's annoying when I'm writing a lot of text.
Now, I spend about 90% of my time using one of the computers (the modern one), which is DVI-equipped, and the other 10% switching a lot between all three.
It's my understanding that DVI-I carries both digital and analog signals, and there are splitter cables that give you VGA and DVI from a single DVI-I output. However, will this work if I send DVI > monitor and VGA > KVM, then switch between the PC and KVM inputs (DVI and VGA) on the monitor?
My concern is that the automatic resolution detection (sorry, I don't know the technical term) will get confused, since the KVM's display emulation will be sending the graphics card different information than the monitor does. Can most (or any) graphics cards handle this? Will it cause any damage if not? Has anyone done this before?
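In case it helps anyone answer: I believe the detection mechanism is called EDID, a small data block the display (or a KVM's display emulator) sends to the graphics card over the DDC lines, so the card on the splitter would be seeing two different EDID blocks on the two legs. As a rough sketch of what's in there, here's how the vendor ID is packed into the start of an EDID block (the sample bytes below are just for illustration, not from my actual hardware):

```python
# Sketch: decode the 3-letter vendor ID from the start of an EDID block,
# the data a monitor (or a KVM's display emulator) hands the graphics
# card over DDC. Sample bytes below are illustrative, not real hardware.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def vendor_id(edid: bytes) -> str:
    """Return the PNP vendor ID: three 5-bit letters packed big-endian
    into bytes 8-9 of the EDID, with 1 meaning 'A'."""
    if edid[:8] != EDID_HEADER:
        raise ValueError("not a valid EDID block")
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))

# 0x10AC is the registered PNP ID for Dell ("DEL")
sample = EDID_HEADER + bytes([0x10, 0xAC])
print(vendor_id(sample))  # → DEL
```

So if the KVM emulates one monitor and the real display reports another, the card genuinely is getting two different stories, which is the root of my question.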
Thanks for any insights.