going a bit off topic here guys...
Because of its extra memory, the 4870X2 actually runs Warhead substantially better at 2560 than the GTX295. This is why 2GB per GPU on the next gen would have been nice, but it's unlikely to happen.
Agreed on the extra features. People often say we don't need more performance unless we use 'silly' resolutions. Well, those of us who buy top-end cards ought to have 'silly' resolutions to match, and need the graphics grunt to take on today's demanding (but still not necessarily good-looking) game engines.
The idle power consumption of the 5800s is also a big deal for me.
Anyway, agreed with Biscuit - that's enough of a derail for now.
Maybe it's time that the powers that be stop trying to justify their actions (and I'm not pointing fingers here at any one manufacturer). Rather than driving the technology towards what the manufacturer believes Joe Public needs, maybe they should sit back and accept the criticism. We want good performance, low power and reasonably priced graphics card solutions.
Personally, I've not been burnt by the Hydra/Batman AA/PhysX debacle, so I can rather comfortably sit back and throw my oar in every once in a while. However, I am fed up with fanbois.
Back to the topic:
nVidia: IMHO, either enable PhysX with an ATi-powered graphics subsystem, or accept that some of your customers are not going to be happy about it.
All of this back and forth is becoming rather tiring, and you're not doing anyone any favours by trying to defend yourselves to people who, through no fault of their own, have wasted their hard-earned money.
Last edited by ajones; 17-11-2009 at 04:41 PM. Reason: grammar!
Corsair Air 540, Asus Prime X570-Pro, Win 10 Pro, AMD R9 3900X, Corsair HX 750, EVGA 1080 Ti, 2x Corsair 2TB MP600, 2x 2TB WD20EZRX, 4x8GB Corsair Dominator, custom watercooled (single loop, 2 rads)
Corsair 550D, Asus X470-Prime Pro, Win 10 Pro, AMD R7 2700, Corsair RM750i, Asus GTX780 Poseidon, 2x Sammy 500GB 970 EVO, 2x 2TB Seagate Barracuda, 2x8GB Corsair Vengeance, custom watercooled (single loop, 2 rads)
Synology DS918+ w/ 2xWDC Green 3TB + 2x Seagate Barracuda 6TB, N2200 w/ 2xSammy 1.5TB
backup:
Corsair 500R, Gigabyte GA-Z97MX Gaming 5, Win 10 Pro, i5 4690, Corsair HX750i, Sapphire Fury X, 256GB Sammy SM951 M.2 (System), WD SE16 640GB, 2x8GB Corsair Vengeance, Corsair H100i
No, we can't agree and you're not correct.
I am not by any means an ATI fanboy, but I find Nvidia's actions completely unacceptable. This is a wider principle than graphics cards, too - the opinion of one vendor that they can disable functionality based on the presence of another card is almost without precedent and something that needs stamping on, hard.
It's very simple: if an ATI card (or indeed a Matrox or anything else) for graphics and an Nvidia card for PhysX doesn't work, the person whose fault it is fixes the issue. Since PhysX is device-independent, this would usually be an Nvidia issue. If by some miracle (unlikely) it's ATI's problem, then they fix it. If it's uncertain, both ATI and Nvidia test the configuration until it works.
That is a consequence of writing software for Windows: you don't control all the hardware and software, yet you still have a responsibility to make things work together. The same goes for sound cards and Nvidia, network cards and Nvidia, SCSI cards and Nvidia, TV tuners and Nvidia, etc.
Even given that I don't accept you can disclaim your responsibility to write good drivers, if this seeming incompatibility is such an issue: please document a case where an ATI card conflicts with an Nvidia card using drivers from before Nvidia disabled the functionality.
Has anyone tried running a game with a non-ATI/Nvidia card as the primary device and an Nvidia card for PhysX?
The sad thing is, I am very happy with Nvidia hardware. I ran two passive 7600GTs (not in SLI - because of your - and AMD's - stupid multi-card motherboard lock-in) for years, and now run a 7600GT and a (second-hand) 8800GTX. The 8800GTX is an incredible piece of engineering - quiet, powerful and stable. I've got a Zalman 3D monitor using the Nvidia drivers, and who knows - I may even buy a PhysX game at some point.
Nevertheless, whilst Nvidia's current tactics and the PhysX lockout continue, I will not consider another Nvidia card - even if you release a DX11 card, and even if it's faster than ATI's - unless ATI fails to fix their driver issues before the 8800GTX is so grindingly slow I can't run anything (come on ATI, get your finger out, comment on your forums, accept that bugs have been replicated and actually fix them).
PK
Phage (17-11-2009)
"In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship."
For some time now we have been trying to communicate clearly our policy on running our GPU PhysX with a non-NVIDIA renderer. As you can imagine, just getting a message out on something like that can be difficult. Our partners mostly printed boxes for their GPUs months if not years ago, so that won't work.
We do have our website here that tries to be explicit:
http://www.nvidia.com/object/physx_faq.html
Also remember this was not possible under Vista, which prevented multiple GPU drivers from running at the same time. The only time this was enabled was in old XP drivers and an early Win7 beta driver from us.
Nevertheless, I know it's not great to buy something for a purpose that it turns out is not supported. I mentioned earlier in this thread that we will continue to improve our security on this, so if you are still within your return window I recommend you bring the PhysX GPU back. Hurts me to tell you that, but honestly I don't expect the community hack to work indefinitely.
That looks like a bug to me - and I have seen some data that shows we have a bug on some Crysis levels with 8xAA, which we are working on. BTW, we are not alone in having issues with that config - I have seen similar issues with the 5770 and some Xfire configs.
We do try to make these things work across all games and configs, but we do occasionally have some failures. I think that is also an illustration of my point about how frickin' hard this stuff is and why we don't add support for configs that are not critical. Each new config costs...big time.
I see your point that even our high-end card still struggles with Crysis. It is a very heavy title. On the other hand, it is one of the few titles that is that punishing. As a matter of fact, I look at a lot of this stuff and I can say that I only know of 4 titles that are similarly intense:
Crysis, Dark Athena, Hawx, WIC.
These are the only games I see that pull the 295 down below 45fps @ 25x16 with 4xAA. The other zillions do not, so of course it just becomes a trade-off. And IMHO, while Crysis is a truly great game (would have been better if they had just stayed in the jungle), it is an outlier in terms of perf requirements.
It's no bug; the GTX295 is physically unable to run games properly at 2560x1600 with high AA because it simply doesn't have enough memory.
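A rough back-of-the-envelope sum supports the memory argument. The GTX295 has 896MB per GPU, and at 2560x1600 with 8x MSAA the render targets alone eat a big chunk of that before a single texture is loaded. This sketch uses a simplified buffer layout (RGBA8 colour, 32-bit depth/stencil, one resolve target); real drivers allocate extra surfaces and padding, so the true figure is higher:

```python
# Approximate render-target memory at 2560x1600 with 8x MSAA.
# Simplified layout: multisampled colour + multisampled depth/stencil + resolve.
width, height = 2560, 1600
bytes_per_pixel = 4          # RGBA8 colour, and 24-bit depth + 8-bit stencil
msaa = 8                     # samples per pixel

color   = width * height * bytes_per_pixel * msaa   # multisampled colour buffer
depth   = width * height * bytes_per_pixel * msaa   # multisampled depth/stencil
resolve = width * height * bytes_per_pixel          # resolved back buffer

total_mb = (color + depth + resolve) / (1024 ** 2)
print(f"~{total_mb:.0f} MB of render targets")      # ~266 MB
```

Roughly 266MB of an 896MB frame buffer gone on render targets alone, which is why a card with more memory per GPU pulls ahead at that resolution even if its raw shader throughput is lower.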
Sadly, I'm starting to disagree more and more about Crysis being an outlier for performance, it has plenty of companions now.
I thought I was clear on this point. We have no plans to enable PhysX running on an NV GPU with rendering on an AMD GPU. I accept that some users were burned by this decision and some are probably going to resent NV for some long period of time, if not forever. And I do think we could have done a better job communicating this restriction when Win7 betas became available. We should have been able to predict that folks would want to take advantage of a dedicated GPU for PhysX with an ATI card. We duffed that one.
ajones (17-11-2009)
Not so sure about that. I'll check with the boys here about the CW perf at 25x16. BTW, I love games with high GPU loads that look good. Big monitors, cool effects...that is why I come into work in the morning! And I hope you are right that this type of game becomes the norm. We push the horsepower of our GPUs up the curve every year and always have to hope that the content shows up to take advantage of it.
BTW, that is why we have the TWIMTBP program. It is not some strange, obscure marketing-only deal. It is the way we can systematically engage with new content designers and enable studios to quickly leverage new GPU technologies. We can't just build GPUs and hope content shows up - we help make it happen.
I loved the 8800 GTX as well. When it came out, it was just so substantially better than anything that came before it. And the 8800 GTX's success enabled us to reinvest in the next cycle of innovation. Today we see the product of that virtuous cycle:
- Build something compelling
- Bring it to market
- Make profit
- Reinvest to make the next big thing.
The reason we got to build PhysX, Stereo and even our next GPU is our prior successes and our focus on our core market - gaming - while simultaneously controlling costs and running an efficient business.
The decision to enable or disable PhysX is only relevant because we have again created something our users value. NVIDIA could have done a better job communicating our policy on this one - no doubt. But accept that we need to run a business that generates profit to continue our cycle of innovation. We need to both control expenses and generate margin, and sometimes we need to make an unpopular decision to help the cycle; this is just one of those examples.
BTW, with a little work on your part I suspect you can find many examples of this. Have you ever noticed that many Toyota parts don't fit on a Lexus? No good reason for that, is there? How about the fact that Intel sells their Atom CPU with a chipset for less than the price of the Atom alone? What about MS just turning off the hacked Xboxes? All of these are examples of companies making decisions to protect their businesses or IP.
Bottom line: there is nothing unethical or unusual about our decision on PhysX. And to put an even finer point on it, there is nothing unusual or unethical about our decision on Lucid. We have no plans to QA it, and I don't expect many people do.
The Atom argument - yeah, OK, I can agree on that one; Intel are being a bit crap there, but their reputation is being tarnished because of it (to a certain degree). The rest of your analogies are lacking somewhat in relative substance, though.
I still don't actually see how Nvidia's IP is damaged by using the card in a system with an ATI card also present if, as mentioned before, the user clearly goes through the motions to prove that he understands the risks of using a non-officially-supported configuration.
As for Lucid, if you are not going to QA systems with it, then how can you possibly allow it to go ahead? If ATI users don't get the privilege of working in harmony with Nvidia without going through testing, why is it OK for Lucid?
We believe that when a user buys a board that has the Lucid chip on it, there is no doubt that Lucid or the motherboard manufacturer is responsible for the quality of that solution. You are right, though, that by allowing it we do expose ourselves to potential quality problems, as some users won't be able to affix responsibility where it belongs. In this case we decided the technology will sink or swim on its own merits, and we will just try to work through the issues as they come.
PhysX is not the same. Since we own the PhysX software and GPUs, any issues with PhysX and AMD will inevitably be a job for NVIDIA to fix.
I don't know what bleating is...doesn't sound good, though. But I try to be straightforward with my comments and views. Sometimes I speak a bit too quickly, inserting foot in mouth occasionally - maybe even tweaking out a few people. But usually I stay out of trouble and focus on the issues.
NVIDIA in general does not ignore issues...hence I am here, now. Can't do this all the time :> but it seemed worth doing on this issue.