Quote:
Nvidia's first Windows 10 driver has full WDDM 2.0 and DX12 compliance.
So Witcher 3 performance on Win10 will suck?
And 'bundled' with various cards means 'if we're a tight retailer we'll only give you a code if you nag us' :p
I'm brand agnostic when it comes to hardware, but really, Nvidia? The super-strong GeForce 780 plays Witcher 3 only about as well as cards half its price, the Radeon 280X and GeForce 960?
WTF? Can someone explain this to me?
Doesn't Nvidia want to give any future value to their cards? It feels like Kepler cards don't get much driver love anymore. I feel sorry for a friend who paid good money for a 780 last year.
According to: http://www.guru3d.com/news-story/the...enchmarks.html
And my old and once-proud GTX 460 is amusingly at the bottom of the chart. But that's OK, my £120 R9 285 should turn up today, which must be the first ATI graphics card in my main PC since the Rage 128 Pro in the late '90s.
I will see how it goes; if the Linux drivers turn out to be unworkable then I guess I'll spend again and my son gets a free graphics card upgrade. But I am getting fed up with Nvidia's shenanigans, so I thought it was time to vote with my wallet.
Which (ha) points to the original article here:
http://www.pcgameshardware.de/The-Wi...marks-1159196/
Well, let's see: that's just one benchmark. It certainly looks like the engine has some oddities, with different bottlenecks on different cards. It also looks like they've gone tessellation mad (hello, Crysis 2), so some of the caps AMD put in place in their drivers to counter that might be of great use here.
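For anyone wondering what that driver cap amounts to: conceptually it just clamps the tessellation factors a game requests before they reach the hardware. A rough C++ sketch, purely illustrative (the names are made up, there's no public API for this, it's a control panel setting):

Code:
#include <algorithm>

// Illustrative only: AMD's driver-side "tessellation cap" conceptually
// clamps the per-patch tessellation factors a game asks for to a
// user-chosen maximum (e.g. 16x), so over-tessellated scenes cost far
// less. CapTessFactor and its parameters are made-up names, not an API.
float CapTessFactor(float requested, float maxFactor = 16.0f)
{
    return std::min(requested, maxFactor);
}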
I'll see how my ye olde 7870 holds up later.
Hmm, digging into this some more, it seems like a lot of the game engine was rewritten after 2013 in order to release on consoles. I'm not trying to start a "game dumbed down to fit consoles" argument, but I suspect one of the consequences is a lack of optimisation for PC hardware, just due to time/budget. That might explain why they had to take GameWorks in the hope it would shortcut their development time.
There will almost certainly be lots of use of HSA (unified CPU/GPU memory) in the console versions, which has to be hacked around to work on PC; there's a sketch of the PC side below.
And that's before we see if there are changes in the PC vs PS4/XB1 versions of the GameWorks libraries (which I can only assume there are, judging from the compatibility matrix).
http://images.anandtech.com/doci/854...7%29_575px.jpg
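On the HSA point: on the consoles the CPU and GPU share the same memory, so the CPU can just write data the GPU reads directly. On a PC D3D11 port the same update turns into an explicit map-and-copy into a driver-managed buffer, typically every frame. A minimal C++ sketch of the PC side, assuming 'ctx', 'constantBuffer' and 'cpuData' already exist:

Code:
#include <d3d11.h>
#include <cstring>

// PC-side "hack around" for what unified console memory gives you for
// free: copy CPU data into a dynamic D3D11 buffer each time it changes.
void UploadToGpu(ID3D11DeviceContext* ctx, ID3D11Buffer* constantBuffer,
                 const void* cpuData, size_t size)
{
    D3D11_MAPPED_SUBRESOURCE mapped = {};
    // WRITE_DISCARD hands back fresh memory so the GPU isn't stalled
    // waiting on the copy of the buffer it is still reading.
    if (SUCCEEDED(ctx->Map(constantBuffer, 0, D3D11_MAP_WRITE_DISCARD,
                           0, &mapped)))
    {
        std::memcpy(mapped.pData, cpuData, size);
        ctx->Unmap(constantBuffer, 0);
    }
}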
Actually, there *is* console hardware. Just because it's based on existing PC hardware doesn't stop it being console hardware.
On a console you know the exact hardware and operating environment you're working in - that makes it much easier to wring every last drop of performance out of the system.
When you develop for PC you have no idea what system the game is going to run on - two gaming PCs can have absolutely no components in common at all, not even the operating system. That makes it a lot harder to optimise performance on PCs; you have to use very abstract tools and trust the underlying software systems (drivers, APIs etc.) to do the right thing. In this case they've chosen to partner with nvidia to develop the PC port, so it's hardly surprising that the game isn't optimised for AMD-based PCs.
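To make that concrete: about the most a PC game can do up front is ask DXGI what's in the machine and adapt, trusting the driver stack for everything below that. A small self-contained C++ sketch (Windows/MSVC):

Code:
#include <dxgi1_1.h>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

// Enumerate the graphics adapters present at runtime. This is roughly
// all a PC game knows about its hardware up front, versus a console
// where the exact GPU is fixed at design time.
int main()
{
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                  reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        // VendorId 0x10DE = Nvidia, 0x1002 = AMD, 0x8086 = Intel.
        wprintf(L"Adapter %u: %ls (vendor 0x%04X)\n",
                i, desc.Description, desc.VendorId);
        adapter->Release();
    }
    factory->Release();
    return 0;
}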
I'm also pretty sure that if you dropped the PC settings to match the consoles you'd get within the same band of performance too, allowing for the optimisation jim mentions above. My card is only a bit better than a console card, and I'm hopeful I'll be running at or above console detail levels.
Or that may be optimistic!
http://gamegpu.ru/rpg/rollevye/the-w...-test-gpu.html
http://gamegpu.ru/images/remote/http...cher3_1920.jpg
Looks like it really does suit Maxwell, though the 780 puts in a better showing there at least.