It's normal for LCD monitors, whether you're using DVI or VGA input, to have some tens of milliseconds of "input lag", as they buffer the incoming data in their panel driver hardware. This doesn't make the image blur, but it does make LCDs that much slower than a pure analogue monitor to get an image onto the screen. This can affect audio/video sync in movie playback, and make games feel slightly more sluggish too, but not everybody can notice the difference. I'm pretty good at spotting, and being annoyed by, minor lip-sync problems in video; most people don't seem to notice errors below 100ms.
(I also guarantee you that most people will notice the difference a great deal more if they think they've got a "slow monitor" than if they actually have got one, but have never heard of input lag. And yes, in case you were wondering, 30ms of monitor input lag does make something of a nonsense of gamers' rabid attempts to scrape those last few milliseconds off their ping time with special routers and TCP/IP tweaks.)
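To put that in perspective, here's a rough back-of-the-envelope sketch. The numbers are illustrative assumptions, not measurements of any particular setup, but they show how little a few milliseconds of ping improvement matters when the monitor alone is sitting on the frame for 30ms.

# Rough latency-budget sketch. All figures below are illustrative
# assumptions, not measurements of any real hardware or connection.
latency_ms = {
    "network ping": 25.0,
    "game/render frame time (60fps)": 16.7,
    "monitor input lag": 30.0,
    "LCD pixel response": 8.0,
}

total = sum(latency_ms.values())
print(f"Total delay: {total:.1f}ms")

# Suppose a fancy router shaves a hypothetical 5ms off the ping:
saving = 5.0
print(f"With {saving:.0f}ms less ping: {total - saving:.1f}ms "
      f"({100 * saving / total:.1f}% improvement)")

Run it and the "improvement" from the router works out to about six per cent of the total delay, most of which the monitor and the frame rate were contributing anyway.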
Input lag ought to be slightly worse for "VGA" input, since the monitor has to turn the analogue VGA signal into digital before it can even start buffering it. VGA also cannot, theoretically speaking, be as accurate to the video card's orders as digital-all-the-way DVI. In practice, though, it's seldom easy to tell the difference even with two monitors next to each other showing the same content. Unless your VGA cable's lousy, it's next to impossible to tell the difference if you're switching between inputs on the one screen, even at quite high resolutions.