TECH REPORT'S just-published summary of Valve's Half-Life 2 presentation at an ATI function is nothing short of grim when it comes to the company's projections of how Half-Life 2 will perform on Nvidia DX9-class hardware. While a certain amount of Radeon-pushing and GeForce FX-bashing can be expected - this is, after all, an ATI-sponsored event - the presentation goes well beyond the usual marketing-speak, which demonstrates just how serious the NV3X problems are. You'd normally expect slides filled with statements like "ATI is a superior platform for our game", not graphs showing ATI hardware pulling no less than double the frame rate of Nvidia's best, to say nothing of slide after slide discussing the problem.
As one of the first (and certainly one of the most-awaited) fully DirectX 9 games, Half-Life 2 will be one of the biggest hits of the year and will serve as a technology demo and benchmark for years to come. The new Half-Life 2 engine will doubtlessly be used in other games, meaning the problems Valve is having with Nvidia's DX9 hardware may well recur, especially if they are not unique to the Half-Life 2 engine—and we're inclined to think they aren't.
While it was no secret to anyone that NV30 performance wasn't all that great, NV35 is looking increasingly shaky as time goes by. First we had the 3DMark 2003 controversy, where Nvidia's poor Pixel Shader 2.0 performance first came to light. A lot of people blamed that one on a bad benchmark on FutureMark's part, and considering that 3DMark '03 isn't actually built on a game engine, the criticism had some teeth.
Then some enterprising MIT youngsters found a way to get Nvidia's very impressive "Dawn" demo running on ATI hardware (a supposedly "impossible" feat according to Nvidia), only to find that it actually ran faster on ATI's hardware than on Nvidia's. Add in the driver issues, the optimization problems, 3DCenter's recent in-depth look at NV3X hardware, and now this, and the tide of evidence is turning very much against Nvidia. It's not clear why the NV30 series has the problems that it does, but it's clear that either FutureMark and Valve are in close cahoots to make Nvidia look bad, or there are serious architectural problems with NV30 and NV35 that keep them from performing well. While some fanboys will no doubt jump at the former, an objective look at the data points toward the latter.
Just how bad are things? Pretty bad. Even the Radeon 9600 Pro outpaces the GeForce FX by over 50%, while the Radeon 9800 Pro doubles the FX's feeble 30 fps. And these numbers come from benchmarks with neither antialiasing nor anisotropic filtering enabled. The game isn't even playable on anything below an FX 5900.
What about Nvidia's claim that the FX benefits dramatically from optimized code? It might be true, but it won't save the situation. While GeForce FX 5900 Ultra performance did increase a fair amount with optimization, the 5200 and 5600 barely benefited and remain completely unplayable—though this may be because those two designs are, in fact, based on the even-worse-performing 5800 Ultra design. Optimization isn't really a solution anyway, since it took Valve five times as long to code the Nvidia-specific path as the general DX9 path. Valve's solution at the moment? Run the GeForce FX series as a DX8 part. Of course this means giving up all the DX9 eye-candy, but eye-candy isn't much use when you can't play the damn game.
With Half-Life 2 scheduled to ship before NV40, there's no getting around the fact that ATI is the only game in town when it comes to HL2. As for Nvidia, NV40 had better bring substantial performance increases—or it's going to get really ugly in the graphics war.