I recently migrated from XP to Windows 7 (sad to see the back of it..). Everything went smoothly apart from the resolution on my 22" monitor. No matter what I did, I could not get Win7 or the ATI drivers to display 1680x1050. I installed my monitor's .inf file and even created a custom one. Nothing helped.

I eventually used PowerStrip to set a custom resolution, and now it's flying along at its native resolution. The problem is, when trying to load any game I'll either get a crash (*GAME* Application has stopped working) or, in Supreme Commander for instance, I'll get:

"Unable to create Direct3D, Please ensure system has current video drivers."

Now, I have all .NET Frameworks installed (including 4), both the 2005 and 2008 Visual C++ runtimes, up-to-date DirectX files, and I'm running the 10.2 ATI drivers (older drivers make no difference).

Now the interesting part: if I change my primary monitor to the secondary monitor (19", with resolution correctly detected from EDID), I can run any game with no problems at all. (This can be done on the fly with no restart required.)

Any ideas why games will only run when the other monitor is selected as the primary monitor?

My PowerStrip custom resolution data is correct and is working perfectly. dxdiag shows no errors with either screen. I can't understand why it won't load a game.

Never had any of these problems in XP.


EDIT - I suspected the monitor's EDID data was corrupt, so I removed the EDID pins. However, this makes no difference. It might also be worth noting that Win7 detects the 22" monitor as a "Generic Non-PnP Monitor" - and even with the correct .inf installed, that doesn't change.
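For anyone who wants to check the EDID data itself rather than pulling pins: once you dump the raw 128-byte base block (tools like Phoenix EDID Designer can do this, and Windows caches it in the registry under the monitor's Device Parameters key), two quick integrity checks from the VESA EDID spec are the fixed 8-byte header and the block checksum (all 128 bytes must sum to 0 mod 256). A minimal sketch - the `validate_edid` helper is just my own illustration, not part of any tool:

```python
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def validate_edid(edid: bytes) -> list:
    """Return a list of problems found in a 128-byte EDID base block."""
    problems = []
    if len(edid) < 128:
        problems.append("block too short: %d bytes (expected 128)" % len(edid))
        return problems
    # Every EDID base block starts with the same fixed 8-byte magic header.
    if edid[:8] != EDID_HEADER:
        problems.append("bad header magic")
    # Per the VESA spec, all 128 bytes (including the final checksum
    # byte) must sum to 0 modulo 256.
    if sum(edid[:128]) % 256 != 0:
        problems.append("checksum mismatch")
    return problems
```

If this reports a bad header or checksum on the 22" monitor but a clean block on the 19", that would point to corrupt EDID rather than a driver issue.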