Definitely, I blame nVidia.
Why is their support for DirectX 10 so poor? (Actually, I know the answer to this, as I read an article. They're 'half' committing to DX10 but don't want to alienate DX9 users, who are still the main market and might go to ATI. And since ATI doesn't yet have the same DX10 price/performance as nVidia, there's no need for them to panic on that end either.)
DX 6/7/8/9 were all backwards compatible, but because DX10 isn't, it's one or the other. At the minute games are still being written against the DX9 API, so people are sticking with XP (mistrust of Vista? Luddite principle? Kidding!). As a result, card manufacturers aren't seeing a massive market that justifies fully developing their DX10 support yet, so they're putting more resources into their DX9 firmware and drivers instead.
It's a market reality, unfortunately, and change can be hard.
Why buy a DX10 card when there are no DX10 games, no DX10 drivers, and the operating system that supports it isn't popular, so you probably don't have it anyway? And why should card developers pour support into DX10 when (guessing) 80% of the market is still playing and using DX9? Even the 8800 GTS performs amazingly well on DX9.