Right, I'm having some technical issues and I need some advice.
I've got a Radeon X850XT dual DVI gfx card linked up to a 20" Dell widescreen monitor via DVI - it's the same setup I've had for a couple of years.
Now, when I turn the PC on I get nothing at all on the monitor until Windows loads up - the monitor doesn't detect a signal until the Windows login screen appears. Then the colours are faded, the graphics look washed out and there are streaking lines going across the screen.
Then, as soon as I switch to an application or game that runs in full screen mode, or run anything graphics intensive (a game in windowed mode, 3DMark, or even the Direct3D tests in dxdiag), the whole screen goes blank, the monitor turns off and the sound starts looping, forcing me to reboot.
Initially I thought the gfx card was overheating, but that isn't the case; it usually hovers around the 55-65°C mark.
I've replaced the DVI cable, reinstalled fresh gfx card drivers and reinstalled the monitor drivers, but none of it has had any effect.
I've also got a Linux box connected to the monitor's VGA input and that has no problems with distorted graphics at all, yet when I use a DVI-to-VGA adapter to connect the main PC to that same VGA input I get exactly the same issue.
Clearly, either my monitor or gfx card is on the blink, but I can't figure out which.