Originally Posted by
watercooled
I mean, I do understand that when you're reviewing for a website, you probably have hundreds of benchmarks to process along with hardware changes, and having that much information in front of you means you're less likely to spot weirdness which might seem like nothing at first glance. Independent reviewers and even users often pick up on things like this, and it would be nice if it got more attention in the media to better explain the results. I respect sites more for doing things like this than for publishing a ton of robotic test results, not least because it demonstrates the reviewers have a better understanding of what they're actually seeing.
I know a few sites like Hexus do this when the need arises, but it would be nice if more of them could take this sort of new information and run their own investigation into why some results are the way they are.
We've seen something similar happen with GPUs too, where in some cases we get benchmarks with release drivers, and that's it. Six months down the line, when you're looking to upgrade, those numbers could be completely invalid because of both software and driver updates, and few places re-run benchmarks (again, I do understand it's a time-consuming process, so it's probably not practical to do frequently). I suppose this is part of the reason we get rebrand launches from the likes of AMD: they effectively force sites to re-test the products with the latest drivers, which can show performance improvements even if nothing has changed on the hardware side.