It was reported earlier this month that a new benchmark called Rydermark had unmasked shader precision stinginess in Nvidia hardware. According to The Inquirer, Nvidia forced developers to use 16-bit precision, with no option to move up to 24- or 32-bit precision. Notably, DirectX 9 requires at least 24-bit floating-point precision for full-precision pixel shaders, so rendering at 16-bit would not be DX9-compliant.
The Inq now has screenshots from the benchmark that seem to back up the Rydermark claims. The two screenshots show an ATI card and an Nvidia card rendering the same scene with noticeably different image quality, particularly in the water. However, it's not known which driver versions, or indeed which graphics cards, were used for the benchmarks, though both screenshots were taken at 1600x1200.