Quote:
Issues a counter-complaint about AMD partner games titles.
AMD is just hurt because they're struggling with sales and competing with both Intel and Nvidia. However, competition is a good thing!
I'd actually go with AMD on this, and I use an Nvidia card. AMD haven't exactly said that Nvidia are making it impossible to work around, just that they're 'deliberately' making it considerably more difficult to make the optimisations in their drivers, which in turn makes the game perform worse and affects gamers. With Nvidia sending in their own coders, they are 'hiding' what they are doing from the game's staff, so when the game's staff and AMD work together they don't know what Nvidia have done, meaning they could have to work through the entire game code before they can do their optimisations.
Remember, Nvidia love proprietary code: they use CUDA over the cross-platform OpenCL (which performs far worse than CUDA on their hardware), and they want to add G-Sync to monitors when there are open/cross-platform options already out there.
I fear a future where gamers will require both AMD and Nvidia GPUs in order to play their favourite titles :( Maybe a little extreme, but it will put people off PC gaming and that's not good!
Yeah, I can only see this as a bad thing and don't like where it's headed.
What's even funnier is that AMD's GPU-based physics, i.e. TressFX, worked fine on my GTX 660, yet PhysX, which works on consoles, won't work on AMD cards.
On top of this Nvidia used NVAPI for BF3 which gave it a performance advantage over AMD even in the early days of the GCN launch.
So yet again NV thumbs its nose at open standards and makes new proprietary ones.
AMD and others support open standards and improving the overall experience. It's VERY clear that NV wants you locked into their world, and hang the consequences. Think carefully about where you want to be with your next upgrades, folks.
You know what's even funnier?
Even on Nvidia hardware the game doesn't perform as well as it should.
I mean, my SLI'd 780 Ti cards dip to 10-20FPS every few seconds while driving, and on other occasions shoot up to 100FPS. One can't really claim the devs did a good job optimising this title, or proclaim it runs best on Nvidia, which it doesn't.
Frame times are just as random: on foot my cards render a frame in roughly 14ms, while driving it spikes to 80ms+.
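(For reference, frame rate is just 1000 divided by the frame time in milliseconds: 14ms works out to about 71FPS, while 80ms is only 12.5FPS - which lines up with the dips above.)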
I believe that during development the devs lost track of which graphics cards they intended it to run on, making the best graphics settings unrealistic for normal consumer cards - and by default also making it extremely hard for AMD to get the best graphics working on their latest cards, considering they had no pre-release access to stress-test against.
Graphics-wise the game looks awesome, but the devs are building games just because they can, and Nvidia and AMD keep providing better cards, which removes the need to optimise. Which is a bad idea for the gamer's wallet :=)
He said
She said
Ho hum, as usual lots of finger pointing and no real conclusions as to who is in the wrong (if anyone).
I've heard if you put UPlay in offline mode, all the issues go away. You might want to try that.
Also, the reason why CUDA is bragged about by NVidia more than OpenCL support is because CUDA is much, much easier for programmers to work with, and also runs much faster on NVidia hardware than OpenCL. As, in effect, CUDA is just a language, there's little stopping AMD from writing a CUDA-compatible compiler which would give OpenCL bytecode except manpower. At one point NVidia released a statement saying they weren't against AMD attempting this. This would also allow AMD GPUs to use PhysX too, as that's essentially just an abstraction layer for CUDA to make it easier to make specific effects.
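For anyone wondering how close the two dialects actually are, here's a trivial sketch of my own (made-up names, nothing from either SDK) - the same vector-add kernel in CUDA, with the OpenCL C equivalents noted in comments. Most of the mapping is mechanical, which is why a source-level translator isn't a crazy idea:

Code:
// CUDA version of a trivial kernel. The OpenCL C equivalent would be:
//   __kernel void vec_add(__global const float* a, __global const float* b,
//                         __global float* c, int n)
__global__ void vec_add(const float* a, const float* b, float* c, int n)
{
    // CUDA spells the index out; OpenCL wraps it up as get_global_id(0)
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)               // guard against the final partial block
        c[i] = a[i] + b[i];
}

// Host-side launch with 256-thread blocks:
//   vec_add<<<(n + 255) / 256, 256>>>(a, b, c, n);
// OpenCL instead compiles the kernel at runtime and enqueues it with
// clEnqueueNDRangeKernel().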
I think it's just something being blown out of proportion. Nvidia game partnerships have existed for...longer than I can remember. AMD does the same thing too now as they've been expanding. To say one side is guilty of something, and the other is innocent, is misleading since they both have done the same things.
If we didn't have these "next-gen" consoles (current-gen now), and Watch_Dogs were a usual game, I doubt we would hear anything about it. I just see it as a chance for the PR boys to get easy attention, whether believed or not.
(Not to mention Watch_Dogs runs horribly on both AMD AND Nvidia hardware--so to me right now it's nigh impossible to say there's wrongdoing).
OpenCL is supported by many companies, including Intel, Google, Sun and Apple.
The reason Adobe CS started moving over from CUDA to OpenCL is that Apple started pushing for it under OS X.
Plus, Nvidia has actively locked out PhysX on systems which also have an AMD card. Even when PhysX was supported on non-Nvidia systems, it was pushed onto the CPU, cratering performance. Yet the consoles, with their low-power CPUs and AMD GPUs, appear to be able to run PhysX.
There was a massive thread on Hexus a few years ago where reps from both companies were duking it out.
Yet AMD did not lock out their attempt at GPU physics, i.e. TressFX, and even though it took Nvidia a bit longer to optimise for it, within two weeks performance on both companies' GPUs was comparable.
So what about AMD Mantle? It's entirely closed source, only available on AMD GPUs. Nvidia's GameWorks (which is just a library of sample code plus a little developer time to help implement it) still works on AMD GPUs and drivers, even if it's not entirely optimised for them, whereas Mantle doesn't work at all, and AMD are not willing to open up the SDK to allow Nvidia to build their own support.
OK, AMD are now crying because they messed up. They forget the Tomb Raider and BF4 days, when performance was crap on Nvidia at launch. What **** goes around comes around, AMD.
It's actually even worse than that: Nvidia actively de-optimised the CPU PhysX code to make it run as slowly as they could possibly make it go ( http://semiaccurate.com/2010/07/07/n...les-physx-cpu/ , http://www.realworldtech.com/physx87/ ), keeping it single-threaded and using pre-SSE, highly inefficient x87 code (SSE is supported on every single CPU fast enough to run a PhysX game).
It's one of the main reasons I won't even consider buying Nvidia nowadays; I refuse to support a company using dirty tactics of that nature...
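To show why the x87-versus-SSE thing matters so much, here's a toy sketch of mine (nothing to do with the actual PhysX source): the same loop written one float at a time, which is roughly what a scalar x87 build boils down to, versus SSE intrinsics doing four floats per instruction:

Code:
#include <xmmintrin.h>   // SSE intrinsics, supported on every CPU since ~1999

// One float per iteration - roughly what scalar x87 code ends up doing.
void scale_scalar(float* v, float s, int n)
{
    for (int i = 0; i < n; ++i)
        v[i] *= s;
}

// Four floats per iteration. To keep the sketch short it assumes n is a
// multiple of 4 and v is 16-byte aligned.
void scale_sse(float* v, float s, int n)
{
    __m128 sv = _mm_set1_ps(s);                  // broadcast s into all 4 lanes
    for (int i = 0; i < n; i += 4)
    {
        __m128 x = _mm_load_ps(v + i);           // load 4 floats
        _mm_store_ps(v + i, _mm_mul_ps(x, sv));  // multiply and store 4 at once
    }
}

And remember, per those articles the CPU path was also left on a single core, so proper threading would multiply the SSE gain again.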
It's not a new thing though. Nvidia have always been very aggressive, and proprietary in everything they do, going back to the days when they triumphed over 3dfx with their very aggressive release cycles and constant hyping of their products before release (the hyping is still there with Tegra).
My last Nvidia purchases were 8800GT parts, which ultimately failed due to Nvidia not having the hardware-engineering competence to select the correct solder (which for a hardware company is kind of crazy). As Nvidia managed to sell countless millions of parts with an inherent design flaw and paid out very little compensation (there was a miserly settlement in the US but nothing in Europe), they are not a company whose products I ever intend to buy again.
The irony is, of course, that they consider themselves a 'premium' brand, but it seems that is only in terms of price, not in terms of standing behind their product. I guess as far as Nvidia's CEO is concerned, buyers of their hardware are only buying a rental, and anyone who expects their hardware to last more than 2-3 years is a fool.
It's nothing that AMD, and ATI before them, haven't done as well, and to the same degree. The reboot of Tomb Raider wasn't that long ago, people... once again, AMD got caught at the release of a major game - one they had had plenty of time to work on - without road-ready drivers on the servers.
All this open source this, proprietary that garbage is just that - garbage. AMD got caught slacking, and they're trying to play the victim card.
This is from the editor over on Hardware.fr about the last Batman game (Hardware.fr is the main French review website):
http://i.imgur.com/GcXVeHF.jpg
So AMD has been complaining about this since last year.
A few months ago, devs at DICE and Ubisoft (the company behind Watch Dogs) said the following:
http://i.imgur.com/mYGuwxZ.png
LOL, yes, especially with Nvidia's history of CUDA, PhysX, G-Sync and multiple forms of proprietary tech they would rather the competition not use. People should be pointed to that infamous thread over here with the AMD and Nvidia PR people arguing about things like PhysX.
Things like the advanced lighting techniques developed by AMD for both DiRT and Sleeping Dogs, TressFX, the 64-bit extensions to the x86 instruction set, OpenCL, FreeSync etc. have shown that AMD as a company is willing to use standards everybody can use, or tech the competitors can use, even if it takes some time for said competitors to modify it or get performance up to speed.
I don't think AMD or Nvidia are expected to make their tech work on competitors' hardware (it should be up to the competitor to try and sort it out themselves), but OTOH trying to impede them from even getting it to work is another thing.
There is only one instance where I can see AMD not yet really opening up a tech they implemented to the competition, and even then it is really more about improving AMD card performance on slower CPUs like their own. In fact, NVAPI improved performance on Nvidia's cards to the extent that even GCN-based ones could not beat older-generation Nvidia cards by much in BF3 for yonks, and Intel has started to do something similar with its IGPs in some games.
It even gets better:
http://www.forbes.com/sites/jasoneva...nd-watch-dogs/
Yet this is what Nvidia said in March:
Quote:
Update: An Epic representative emailed me to clarify that "NVIDIA GameWorks is not built into UE4. The engine ships with PhysX." This is curious because on Nvidia's developer portal, the company states that "we’ve incorporated support for NVIDIA GameWorks directly into Unreal Engine 4 making it easier for UE4 licensees to take advantage of our technology."Nvidia is choosing their words carefully, but the intent seems to be touting the inclusion of the GameWorks libraries (not just PhysX) directly into the Unreal Engine 4 core, and Epic has made it abundantly clear to me that that's not the case.
http://blogs.nvidia.com/blog/2014/03/19/epic-games/
It seems Nvidia has started to open up the GameWorks code over the last month or so, in response to all of this, but things like some of the lighting techniques and PhysX are still "black boxes", and it remains to be seen whether FaceWorks and FlameWorks will be the same.
Quote:
Together with Epic, we’ve incorporated support for NVIDIA GameWorks directly into Unreal Engine 4 making it easier for UE4 licensees to take advantage of our technology. NVIDIA Gameworks libraries are designed to help developers create a wide variety of special effects, such as more realistic clothing or destruction, and now these effects are available to every developer with a UE4 license.
The complaining from AMD has actually paid off in some way it seems.
Edit!!
Well, Intel has started doing its own "game partnerships" too, and I believe that in certain games their IGPs perform much better than they otherwise would, thanks to support for certain Intel-specific features.
Rome 2 and GRID2 are the first of these.
AMD and Nvidia need to worry more about Intel joining the fray than all this silly in-fighting!!
Is there any actual effect on gamers other than having to wait a bit longer for optimised drivers if you have the 'wrong' GPU?
Nvidia hadn't signed a contract with Ubisoft to optimise the game; it was an open deal. AMD had the choice as well to work with Ubisoft but just never took up the offer.
For the most part, nobody has ever seriously questioned the quality of AMD kit, current issues with excess heat generation excepted... However, there's little to no debate about their poor history with driver support. So yeah, I seriously believe that. Perhaps slacking was too strong of a word - but it seems they were a step behind again, and given that the latest fiasco (Watch Dogs) was done on the latest gen consoles as well, both of which are AMD toys, there's no excuse for them to have screwed up on the PC side. Given that one little stat, AMD has little to no room to complain about NVidia making things non-competitive. They have the inside track on pretty much everything that isn't a PC exclusive for the next however many years it takes for Sony and Microsoft to release new consoles.
I have had dozens of graphics cards from both companies over nearly 12 years, like Kalniel and many others here.
Both companies have had issues at any given point. You really want to play that game? Let's go back to the FX5000 series and its appalling performance in HL2. What about Nvidia's issues with adaptive v-sync, or the fact that I have had numerous niggles with my GTX 660, all of which were documented on various forums?
Also, a lot of the "problems" with AMD/ATI cards were due to the TWIMTBP programme in the first place, meaning Nvidia had first dibs on games - and unlike you, some of us have been following this whole saga for a while now. Look at the original AC: Ubisoft put in DX10.1 support, which improved AMD/ATI card performance, as Nvidia did not support it. Ubisoft then removed the support, citing a lame-duck excuse about bugs which multiple websites tried and failed to reproduce.
Nvidia's answer to AMD's console wins was GameWorks, and three GameWorks titles have had issues with AMD cards. In fact, WD is probably not the worst of them by far - AC4 and the Batman games were much worse.
This is from the editor over on Hardware.fr about the last Batman game (Hardware.fr is the main French review website):
http://i.imgur.com/GcXVeHF.jpg
So AMD has been complaining about this since last year. BTW, the very same Batman game was on consoles too.
A few months ago, devs at DICE and Ubisoft (the company behind Watch Dogs) said the following:
http://i.imgur.com/mYGuwxZ.png
Wait, let's look at the devs who are talking there. Johan Andersson is a senior dev at DICE, who make the Battlefield series and have supported BOTH AMD and Nvidia tech, like Mantle and NVAPI.
Bart Wronski works for frikkin' Ubisoft themselves.
Michal Drobot works for Guerrilla Games, who are behind the Killzone series on the PS3 and PS4.
http://www.forbes.com/sites/jasoneva...nd-watch-dogs/
Yet this is what Nvidia said in March:
Quote:
Update: An Epic representative emailed me to clarify that "NVIDIA GameWorks is not built into UE4. The engine ships with PhysX." This is curious because on Nvidia's developer portal, the company states that "we’ve incorporated support for NVIDIA GameWorks directly into Unreal Engine 4 making it easier for UE4 licensees to take advantage of our technology."Nvidia is choosing their words carefully, but the intent seems to be touting the inclusion of the GameWorks libraries (not just PhysX) directly into the Unreal Engine 4 core, and Epic has made it abundantly clear to me that that's not the case.
http://blogs.nvidia.com/blog/2014/03/19/epic-games/
Epic Games, who are firmly in the Nvidia camp, even said that, and did a semi-backtrack on the whole GameWorks thing.
Quote:
Together with Epic, we’ve incorporated support for NVIDIA GameWorks directly into Unreal Engine 4 making it easier for UE4 licensees to take advantage of our technology. NVIDIA Gameworks libraries are designed to help developers create a wide variety of special effects, such as more realistic clothing or destruction, and now these effects are available to every developer with a UE4 license.
AMD making a big deal of all of it is a calculated PR move.
So AMD and the devs complaining since last year has worked out in their favour, especially now that GameWorks has started to be opened up over the last month or so.
Edit!!
PS: who do you think works for AMD now as a VP? Roy Taylor - many consider him "the father" of the TWIMTBP programme, and he was with Nvidia for yonks - meaning he probably has some insight into what Nvidia has done in the past! ;)
What issues? As far as I hear, AMD cards are very reliable, so they don't appear to be generating excess heat at all. If you're talking about the 290 series running at a higher operating temperature than the 280 series etc., then understand that if a card has been designed to run safely at a particular temperature, it's not an issue as long as it meets its specs! I loved my GeForce 6800GT - a really solid card, but also massively hot, especially for such a low-density process compared to recent chips. On the other hand, I've had several other GeForce cards that ran cooler die on me.
No debate as in they don't have a poor history? If so, I agree - the driver thing is a rumour largely touted by PR/fans, and it has unfortunately been picked up by less scrupulous reviewers who cite it in their +/- columns without actually checking it out. As Cat says, I've had both makers' cards for a number of years, and in recent times I've rated the AMD drivers as superior. Both makers have their issues, as you'd expect when you have such a broad specification and software landscape to work in.
Quote:
However, there's little to no debate about their poor history with driver support.
I'm currently running one computer with an oc'd 7870 and another with a stock geforce 560. The AMD drivers are more stable when gaming and give higher image quality, but have an occasional issue when resuming from standby. The nVidia drivers are less stable when gaming, and it's especially annoying heading back to older games which seem to prefer different drivers to modern games for nVidia.
Anti-competitive practices of the sort nVidia are infamous for are bad news, even for nVidia fans - the PC is a tough enough market without further fragmenting the user base. The PC could be the most dev-friendly system out there, but tricks like this are preventing that from happening.
Well, adding to that, all this fighting between Nvidia and AMD is not really a good idea when we have an 800lb gorilla looming in the shadows - Intel. They are spending more and more money on improving their graphics IP - even a decent amount of Nvidia's profits comes from Intel licensing IP from them, for another year or so at least. Intel has done the unthinkable and started collaborating with studios on games like GRID 2 and Rome 2, and IIRC this has led to better performance on Intel IGPs, as certain Intel-specific features are utilised.
Even though Iris Pro used a lot of die area to achieve its goals, you can see that with some refinement, and incorporation into lower-end parts, Intel could start to cause some problems in the next few years. Their focus is increasingly on graphics and compute now. They are even trying to dislodge Nvidia from their most profitable market, i.e. compute cards, with MIC.
Many disagree. I normally have at least one machine with nVidia and one with AMD (at the moment I have one nV PC and two AMD). The AMD machines always have more issues with drivers... always. For years AMD left the Blizzard mouse-cursor bug in their drivers... and I do mean YEARS. Performance fixes seem much slower from them; admittedly you can blame some of that on nVidia's practices, but ultimately they are responsible for the performance of their own kit and drivers... pointing the finger at another company is a complete and utter cop-out IMO.
I went through at least a 12-month period where the drivers always came with parts missing (erroring during install). Frame pacing still isn't 100% fixed... after how long? Occasionally you get the "black screen" issue during driver install... and sometimes the drivers get stuck in a state where you can then only install via the command line (what a PITA that is!).
For me there is no contest between their driver teams: nVidia make the more reliable drivers, hands down. The largest issue I have had with nVidia drivers in recent years is the occasional texture flicker in Battlefield. Of course, everyone will have a different experience, but to call it a rumour is a bit rich when many of us have battled with those drivers for far too long.
I disagree - I have a GTX 660 and a mate has a GTX 650 Ti Boost 2GB. I have other mates with HD7850, HD7770 and HD7870LE/XT cards. Linux is the only area where you could say Nvidia is much better.
At least in the last 18 months AMD have been better under Windows IMHO, as both of us with Nvidia cards have niggling faults.
Our personal experiences, OFC.
Between our group, we own the better part of at least 700 games.
Among my issues:
1.) SR3 and SR4 would just go to a black screen with any of the 320 drivers. Solved by the latest ones - a problem which some people have had for longer with older Nvidia cards.
2.) Tomb Raider would crash unless I disabled tessellation first (and then re-enabled it once the game started), and I only read about the fix on another forum. Has not happened with the latest drivers.
3.) Metro: Last Light had weird tearing artefacts with the drivers at launch. Solved with later drivers.
4.) Planetside 2 was working perfectly until I updated to the recent Nvidia "performance drivers". The driver kept resetting itself, and after that the game was a stuttery mess until I restarted it. I am not the first to have this issue, and it has been affecting some people for ages. Thankfully the latest driver seems to have stopped this.
5.) Firefox black-window crashes. I thought this was FF being rubbish; a forum member here pointed me to a forum thread which indicated it was a known driver issue. The latest drivers solved it, but I had it for yonks.
This was coming from an HD5850, which for the most part (like another mate's) gave me no problems; only its tessellation performance when pushed caused degraded performance (a hardware, not a software, issue).
My mates with the HD7870LE/XT, HD7770 and HD7850 had ZERO problems last time I checked in the past year. GCN-based cards at launch did have their share of niggles, though, but I certainly think AMD has put in more effort than Nvidia recently.
Plus, going back even further: HL2 worked fine on my 9500 PRO but was utter crap on my mate's FX5000-series card.
But OTOH, my 6800-series card had support for DX9c compared to the X800 series, which meant better image quality in games and so on.
Previously, my worst problem with drivers was when my HD3870 became a stuttery mess in the last level of Crysis, even at lower settings and resolutions, compared to my 8800GTS 512MB when the game launched (performance did improve with later updates).
Edit!!
PS:
We all play a lot of Blizzard games.
I would say we have all been fine so far, and that's after hundreds, maybe thousands, of hours in their games between us.
Nvidia supposedly has the performance advantage in them, but after talking to my mates I have not noted any problems with the AMD cards in those games.
Outside the extra performance of my GTX 660, my massively overclocked HD5850 seemed to be fine running SC2 and D3, for example.
It's all swings and roundabouts, except Nvidia niggles get less exposure - look at how the $200million+ bumps issue got little or no coverage from the tech press, despite being easily one of the worst hardware issues in years. The tech press went all quiet over that.
I had to personally help out people who did not know why their laptops were dying,so they could actually stick it to the bloody OEMs,who were trying to cheat them by issuing poxy fan speed updates for the laptops.
If it had more exposure,many people would have pushed for getting their laptops replaced instead of letting it go.
Heck, even the adaptive v-sync issues and the throttling issues with Kepler were quietly hidden away. It took French and German review sites to expose the problems with Nvidia's GPU Boost, while the English-language press barely mentioned it. Yet once AMD had different throttling problems, a big song and dance was made, since the sites had got free cards shipped to them by Nvidia.
AMD has its own problems too,but they are more likely to be highlighted.
I don't get why people are getting so wound up here. Video cards are a cut-throat business where companies will do all sorts to get a 5% performance boost. It has always been thus, even going back to the old 2D video cards (my Tseng ET4000 rocked at Doom :D).
AMD have improved enormously, but I think they still have a way to go. Amusing comment from a Mojang developer:
"It's comforting in a morbid way, though, to know that they're as spectacularly bad at following the OpenGL specification as they are at following the DirectX specification."
Found that while reading up on the recent beta driver AMD put out that didn't work with Minecraft. I know it was only a beta, but I have to wonder at the QA processes if they don't have some basic automated image-quality test against one of the biggest-selling PC games out there - something along the lines of the sketch below. Sounds like all the textures went transparent, so it's not as if it was subtle.
Also, amusingly, he goes on to say that the only vendor-specific code in MC is a hack to fake a command line that says "minecraft.exe", or else performance drops off on Intel platforms.
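For what it's worth, a "basic automated image quality test" really doesn't need to be clever. A back-of-an-envelope sketch (entirely mine; it assumes two raw 8-bit RGB framebuffer dumps of identical size) would have flagged every texture going transparent instantly:

Code:
#include <cstdio>
#include <cstdlib>
#include <vector>

// Compare a captured framebuffer dump against a known-good reference and
// fail if the average per-channel difference is suspiciously large.
int main(int argc, char** argv)
{
    if (argc != 3) {
        std::fprintf(stderr, "usage: %s reference.rgb capture.rgb\n", argv[0]);
        return 2;
    }

    auto slurp = [](const char* path) {
        std::vector<unsigned char> buf;
        if (FILE* f = std::fopen(path, "rb")) {
            int c;
            while ((c = std::fgetc(f)) != EOF)
                buf.push_back(static_cast<unsigned char>(c));
            std::fclose(f);
        }
        return buf;
    };

    std::vector<unsigned char> ref = slurp(argv[1]);
    std::vector<unsigned char> cap = slurp(argv[2]);
    if (ref.empty() || ref.size() != cap.size()) {
        std::fprintf(stderr, "missing file or size mismatch\n");
        return 2;
    }

    double total = 0;
    for (std::size_t i = 0; i < ref.size(); ++i)
        total += std::abs(static_cast<int>(ref[i]) - static_cast<int>(cap[i]));
    double mean = total / ref.size();   // 0..255 scale

    std::printf("mean abs per-channel diff: %.2f\n", mean);
    return mean > 8.0 ? 1 : 0;          // threshold is arbitrary for the sketch
}

Run something like that against a handful of fixed camera positions in the biggest games for every driver build, and a bug like the transparent-texture one never ships.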
I hope that is your personal experience and not a statement of fact! Because I can tell you now, that is so wrong I almost fell out of my chair reading it.
In the last year AMD's drivers have played catch-up. IMO they still are not there, and my post was about the accusations aimed at people who state nVidia drivers are better... and the drivers have been around a LOT longer than the last 12 months or so!
Wrong about what??
That your personal experience trumps anyone else's??
In the last year and a half there have been loads of issues with Nvidia drivers, and I laughed massively at what you said too.
Considering the number of builds I help people with, in real life and on forums, and the fact that I probably have a decent amount of experience with sub-£200 cards, your statement is wrong (your language) and just shows the Apple syndrome among too many people who own Nvidia cards.
Since the PR says Nvidia is the best, people will just accept Nvidia problems as not being Nvidia's fault. Yet when it comes to any AMD issue, it's definitely AMD's fault.
You can deflect all you want; every one of those issues has happened to people. It's happened to me and it's happened to others. I will link you to the threads on forums if you think I am making them up.
One of my mates is a Hexus member - Bagnaj97.
He is a Linux sysadmin and a massive geek, and he is partial to Nvidia, since AMD drivers are worse under Linux than Nvidia's and he would rather use Nvidia again for that reason.
Yet even he has had little or no problems with his HD5850 or HD7870LE/XT under Windows, AFAIK. He bought the HD7870LE/XT in Q2 last year.
He owns 700+ games himself.
Then consider that most of my mates work in science or tech anyway; under Windows they have had little or no issues when gaming in the last year or so.
Like I said, AMD did have issues at the launch of GCN - not surprising given how massive a change it was - however, it is not 2012 anymore. Kepler was re-jigged Fermi anyway, not a total change.
You need to take your blinkers off regarding Nvidia. They have problems under Windows, plenty of them, and I have owned dozens of sub-£200 cards myself over the last 12 years.
It makes me laugh when Nvidia's own failures in games get ignored, even going back a decade, while AMD's keep being brought up.
Plus, I never denied AMD had issues either! I've mentioned them many times! ;)
Edit!!
I am not going to get into a stupid circular argument with you, BTW. I have seen where these go on other forums.
I, Kalniel, and multiple mates (and the people I troubleshoot for) are all WRONG.
N-V-I-D-I-A, hell yeah!(tm), is the BESTEST under W-I-N-D-O-W-S, and has had NO problems, not like that CRAPPY AMD with the ****ty hardware and drivers which cause cards to explode.
Feel happy now?? :)
I just read that SpaceX are using Nvidia SoCs to drive the flight displays of the latest Dragon capsule:
http://arstechnica.com/science/2014/...space-capsule/
Part of me thinks "Hell, that's brave". I mean, while I do believe that Nvidia drivers are better than AMD's, I really don't think I would put my life in their hands.
Another part of me wonders if the frozen vacuum of space would actually be enough to stop my Tegra 3 based phone from overheating if you try and play a game on it :D
"Apple will save us all", or "Oh god another standard" depending on how you view the fruit vendor:
http://arstechnica.com/apple/2014/06...ment-platform/
Can't see it helping all the finger pointing and name calling :D
Well, I hope that if Nvidia get anywhere with their car push, their embedded products have better hardware reliability than their solder-flawed graphics cards and chipsets. And, more importantly, that they do proper validation and ageing testing. Cars are dangerous enough at the best of times; we don't need something critical failing at the wrong moment due to bad solder or packaging choices.
Weren't a lot of Vista's initial problems down to BSODs from Nvidia's drivers? They really seemed to struggle to get them stable for ages. Obviously Microsoft contributed too, and while Vista got a lot more stable after SP1, that wasn't just the service pack - the fact that Nvidia got their drivers sorted was a major factor too.
Could anyone explain why the link won't work? Or is it just me? o.O