Issues a counter-complaint about AMD partner game titles.
AMD is just hurt because they're struggling with sales while competing with both Intel and Nvidia. Competition is a good thing, though!
I'd actually go with AMD on this, and I use an Nvidia card. AMD haven't said that Nvidia are making it impossible to work around, just that they're 'deliberately' making it considerably more difficult to write the driver optimisations, which in turn makes the game perform worse and affects gamers. With Nvidia sending in their own coders, they're 'hiding' what they've done from the game's staff, so when the developers and AMD work together they don't know what Nvidia have changed, meaning they could have to work through the entire game's code before they can do their optimisations.
Remember, Nvidia love proprietary code: they use CUDA over the cross-platform OpenCL (which performs far worse than CUDA), and they want to add G-Sync to monitors when open/cross-platform options are already out there.
Last edited by LSG501; 30-05-2014 at 04:19 PM.
I fear a future where gamers will require both AMD and Nvidia GPUs in order to play their favourite titles. Maybe a little extreme, but it will put people off PC gaming, and that's not good!
Yeah, I can only see this as a bad thing and don't like where it's headed.
What's even funnier is that AMD's GPU-based physics, i.e. TressFX, worked fine on my GTX 660, yet PhysX, which works on consoles, won't work on AMD cards.
On top of this, Nvidia used NVAPI for BF3, which gave it a performance advantage over AMD even in the early days of the GCN launch.
So yet again NV thumbs its nose at open standards and makes new proprietary ones.
AMD and others support open standards and improving the overall experience. It's VERY clear that NV wants you locked into their world, and hang the consequences. Think carefully about where you want to be with your next upgrade, folks.
You know what is even funnier? Even on Nvidia hardware the game doesn't perform as well as it should.
I mean, my SLI 780 Ti cards are dipping into the 10-20 FPS range every few seconds while driving, and on other occasions shooting up to 100 FPS. One can't really claim the devs did a good job optimising this title, or proclaim it runs best on Nvidia, because it doesn't.
Frame times are just as random: on foot my cards render a frame in roughly 14 ms, while driving it spikes to 80 ms+.
I believe that during development the devs lost track of which graphics cards they intended it to run on, making it unrealistic for the best graphics settings to run on normal consumer cards. By extension that also makes it extremely hard for AMD to get the best graphics out of their latest cards, considering they had no pre-release access to stress-test with.
Graphics-wise the game looks awesome, but the devs are building games just because they can, and Nvidia and AMD keep providing better cards, which removes the need to optimise. Which is a bad idea for gamers' wallets. :=)
He said
She said
Ho hum, as usual lots of finger pointing and no real conclusions as to who is in the wrong (if anyone).
I've heard if you put UPlay in offline mode, all the issues go away. You might want to try that.
Also, the reason Nvidia talks up CUDA more than OpenCL support is that CUDA is much, much easier for programmers to work with, and also runs much faster on Nvidia hardware than OpenCL does. Since CUDA is, in effect, just a language, there's little stopping AMD from writing a CUDA-compatible compiler that emits OpenCL bytecode except manpower. At one point Nvidia released a statement saying they weren't against AMD attempting this. It would also let AMD GPUs use PhysX, as that's essentially just an abstraction layer over CUDA that makes specific effects easier to implement.
I think it's just something being blown out of proportion. Nvidia game partnerships have existed for...longer than I can remember. AMD does the same thing too now as they've been expanding. To say one side is guilty of something, and the other is innocent, is misleading since they both have done the same things.
If we didn't have these "next-gen" consoles yet (current-gen now) and Watch_Dogs were a typical game, I doubt we would hear anything about it. I just see it as a chance for the PR boys to get easy attention, whether believed or not.
(Not to mention Watch_Dogs runs horribly on both AMD AND Nvidia hardware--so to me right now it's nigh impossible to say there's wrongdoing).
OpenCL is supported by many companies, including Intel, Google, Sun and Apple.
The reason why Adobe CS started moving over to OpenCL from CUDA is because Apple started pushing for it under OS X.
Plus, Nvidia has actively locked PhysX out of working on systems which also have an AMD card. Even when PhysX was supported on non-Nvidia systems, it was pushed onto the CPU, cratering performance. Yet the consoles, with their low-power CPUs and AMD GPUs, appear to be able to run PhysX.
There was a massive thread on Hexus a few years ago where reps from both companies were duking it out.
Yet AMD did not lock out their attempt at GPU physics, i.e. TressFX, and even though it took Nvidia a bit longer to optimise for it, within two weeks performance on both companies' GPUs was comparable.
Last edited by CAT-THE-FIFTH; 30-05-2014 at 05:53 PM.
So what about AMD Mantle? It's entirely closed-source and only available on AMD GPUs. Nvidia's GameWorks (which is just a library of sample code plus a little developer time to help implement it) still works on AMD GPUs and drivers, even if it's not fully optimised for them, whereas Mantle doesn't work at all, and AMD are not willing to open up the SDK to let Nvidia build their own support.
OK, AMD is now crying because they messed up. They forgot the Tomb Raider and BF4 days, when performance was crap on Nvidia at launch. What **** goes around comes around, AMD.
Last edited by peterb; 01-06-2014 at 05:03 PM. Reason: Language/swear filter
It's actually even worse than that: Nvidia actively de-optimised the CPU PhysX code to make it run as slowly as they could possibly make it go (http://semiaccurate.com/2010/07/07/n...les-physx-cpu/ and http://www.realworldtech.com/physx87/), keeping it single-threaded and using highly inefficient pre-SSE x87 code (SSE is supported on every single CPU fast enough to run a PhysX game).
It's one of the main reasons I won't even consider buying Nvidia nowadays; I refuse to support a company using dirty tactics of that nature...
Last edited by failquail; 31-05-2014 at 02:47 PM.
It's not a new thing, though. Nvidia have always been very aggressive and proprietary in everything they do, going back to the days when they triumphed over 3dfx with their very aggressive release cycles and constant hyping of products before release (the hyping is still there for Tegra).
My last Nvidia purchases were 8800 GT parts, which ultimately failed due to Nvidia not having the hardware-engineering competence to select the correct solder (which, for a hardware company, is kind of crazy). As Nvidia managed to sell countless millions of parts with an inherent design flaw and paid out very little compensation (there was a miserly settlement in the US but nothing in Europe), they are not a company whose products I ever intend to buy again.
The irony, of course, is that they consider themselves a 'premium' brand. But it seems that is only in terms of price, not in terms of standing behind their product. I guess as far as Nvidia's CEO is concerned, buyers of their hardware are only buying a rental, and anyone who expects their hardware to last more than 2-3 years is a fool.