Desktop (Cy): Intel Core i7 920 D0 @ 3.6GHz, Prolimatech Megahalems, Gigabyte X58-UD5, Patriot Viper DDR3 6GiB @ 1440MHz 7-7-7-20 2T, EVGA NVIDIA GTX 295 Co-Op, Asus Xonar D2X, Hauppauge WinTV Nova TD-500, 2x WD Caviar Black 1TB in RAID 0, 4x Samsung EcoDrive 1.5TB F2s in RAID 5, Corsair HX 750W PSU, Coolermaster RC-1100 Cosmos Sport (Custom), 4x Noctua P12s, 6x Noctua S12Bs, Sony Optiarc DVD+/-RW, Windows 7 Professional Edition, Dell 2408WFP, Mirai 22" HDTV
MacBook Pro (Voyager): Intel Core 2 Duo @ 2.6GHz, 4GiB DDR2 RAM, 200GB 7200RPM HDD, NVIDIA 8600GTM 512MB, SuperDrive, Mac OS X Snow Leopard, 15.4" Matte Display
HTPC (Delta-Flyer): Intel Core 2 Q8200 @ 2.33GHz, Zotec GeForce 9300-ITX, 2GiB of DDR2 Corsair XMS2 RAM, KWorld PE355-2T, Samsung EcoDrive F2 1.5TB, In-Win BP655, Noctua NF-R8, LiteOn BluRay ROM Drive, Windows 7 Home Premium, 42" Sony 1080p Television
i7 (Bloomfield) Overclocking Guide
Will certainly be buying an ATI 5850 as soon as I get the funds together. My past three GPUs have been nVidia, but that's not to say I'm a fanboy. I don't like their attitude now, so I'll be glad to switch, plus they really are lagging behind ATI at the moment. Seems to me they want to use proprietary technology/software to create closed-system revenues like Apple. But they should take a long hard look at Sony and see how difficult, pointless and unpopular it is to create proprietary standards.
AMD sure knows how to throw a punch at other companies, and they've shown it on other occasions too.
I just don't see the logic... no PhysX when an ATI card is present? You should be thankful they bought an nVidia product, not tell them what they can/can't have in their systems. Freedom of choice. I do wonder what the EU thinks of it, especially considering the IE/Win7 deal.
It was funny when they claimed to have the fastest gaming GPU out, when the GTX 295 still beats the HD 5870 most of the time. Although I cannot wait for the HD 5970 to come out.
Hi Richard,
you're trying to tell the world that NVIDIA neglects gamers. As mentioned before, no, we do not. 3D Vision and PhysX are key technologies - focused on gaming. We'll also fully support any GPU-related standard out there. We love anything that moves the GPU forward. Plus we'll innovate with our own technologies where important.
Let me respond to your points in detail.
Originally Posted by Richard Huddy
(1) The positive mention of DX11 is a rarity in recent communications from NVIDIA - except perhaps in their messaging that 'DirectX 11 doesn't matter'. For example, I don't remember Jensen or others mentioning tessellation (the biggest of the new hardware features) from the stage at GTC. In fact, if reports are to be trusted, only one game was shown on stage during the whole conference - hardly what I would call treating gaming as a priority!

I highly recommend to anyone interested in GPUs to watch the keynotes and sessions from GTC. It was a developer event focused on GPU computing, not an event on gaming, which is why the sessions that touched on gaming dealt with the gaming aspects of GPU computing. Fermi is an awesome graphics processor and we're confident that it will let us keep the performance crown. We will talk about this side of Fermi very soon. Fermi is an entirely new architecture with many new features specifically for the compute space, so it was important to us that we talk about them first, especially at a conference with a focus on GPU computing. We are on record saying we support DirectX 11. If you look at the agenda, you'll also find that a DX11 workshop was held at GTC, since DX11 also involves GPU computing.
Originally Posted by Richard Huddy
(2) The tech of PhysX has still yet to gain any significant traction. I note from the most recent NPD sales figures that "Batman AA" figures at 96th place in the PC charts, and yet that seems to be NVIDIA's 'showcase' for PhysX. I suspect gaming physics will be better adopted when, as an industry, we move away from the divisive proprietary standards that Lars advocates so heavily. [I note that you mentioned CUDA no fewer than five times - more than any other technology that you chose to mention!]

Regarding your comment about "little traction", PhysX has certainly managed to get the attention of AMD and their customers. Physics in games stagnated when it was left to the CPU vendors like AMD, and has seen a resurgence since GPU vendors got involved. As long as in-game physics takes a step forward, we are happy, regardless of the path the developers choose to get there. We support open standards, plus standards that allow NVIDIA to offer new innovations to customers well in advance of industry standards, such as CUDA C. Our goal is to lead the industry in new, amazing directions and create value for our customers, which is exactly what PhysX has done. We believe that innovation is good. Whether innovation comes through DirectX, OpenCL, CUDA C, Bullet or PhysX does not matter to NVIDIA. PhysX is not competing with other standards.
Batman AA received superb reviews, as you can see on Metacritic. Game reviewers and gamers agree that the PhysX effects add a lot of fun to the game. PhysX comes as a free feature for GeForce users - from the GeForce 8 series onwards. It does not cost a penny extra, and you can turn it off if you do not like it for whatever reason. What is not to like about that? Besides, you should not cast a vote on PhysX based on one title, any more than you should cast a vote on DX11 based on AMD's showcase titles.
I don't quite understand the proprietary argument. AMD was working with the physics engine Havok, which would be a "proprietary" engine by that definition as well, since it is owned by Intel. What's the status there, by the way? Is GPU support for AMD GPUs coming to Havok in the near future? Or was it perhaps only a single hard-coded demo without the involvement of Havok? Last but not least, CAL was also proprietary as a language by that definition.
Sometimes it is necessary to innovate and invent things that do not yet exist. New technologies are always "proprietary" by nature. It seems to me that, for AMD, proprietary equals "Unfair, I don't have it".
Originally Posted by Richard Huddy
(3) There's every reason to believe that NVIDIA is moving its focus away from gaming. I'll list just a few:
- Not making it a priority at GTC is the obvious one.

As mentioned, GTC was a GPU computing conference - fully booked out and a great success. More misinformation. Does AMD announcing a new Opteron architecture at IDF (an Intel conference in a hotel) make you believe AMD would no longer do consumer CPUs? Or, for that matter, does AMD talking about CPUs mean they neglect GPUs? Does AMD even do developer conferences?

Originally Posted by Richard Huddy
- Arguing against the relevance of DX11 is another.

More misinformation, and taken out of context. DX11 is a very great thing and we are 100% behind it. Anything that makes the PC gaming experience better is a great thing. This is also why we focus on adding things like PhysX and 3D Vision to PC games. We have already stated that our next-generation Fermi-based GeForce GPU will support DirectX 11, along with PhysX and 3D Vision.

Originally Posted by Richard Huddy
- Arguing, as NVIDIA did, that AMD working with Codemasters to add DX11 to DiRT 2 is harming gamers is another.

Sorry, but that is just spreading misinformation. DX11 is a very great thing and we are 100% behind it.

Originally Posted by Richard Huddy
- NVIDIA's behaviour in locking something as trivial as antialiasing to its own hardware (in Batman: Arkham Asylum) shows that NVIDIA cares much more about money than gamers.

Batman AA is not our property. It is owned by Eidos. It is up to Eidos to decide the fate of a feature that AMD refused to contribute to and QA for their customers, not NVIDIA. If it is relatively trivial, Mr. Huddy should have done it himself. The Unreal Engine does not support in-game AA, so we added it and QAed it for our customers. As Eidos confirmed (not allowed to post links here, but check PCper for Eidos' statement), AMD refused the same opportunity to support gamers with AA on AMD GPUs. I'm sure Mr. Huddy knows how important QA is for game developers. I recommend AMD start working with developers to make their hardware work in a proper way; that's not our job. We added functionality for NVIDIA GPUs into the game. We did not lock anything out. AMD just did not do their work. This has happened with previous UE3 engine titles, where ATI owners had to rename the executable to make AA work (BioShock, for example). It's not NVIDIA who is to blame here.

Originally Posted by Richard Huddy
- AMD is already working with games developers on over 20 forthcoming games which feature DX11 tech. NVIDIA has been nowhere to be seen! And we're doing that while offering the world's best support for DirectX 9, 10 and 10.1 games too!

NVIDIA is actively engaged with every major developer in the world, plus we're also working with many smaller, innovative game studios. We support game developers wherever we can, to ensure the best gaming experience on GeForce. When DirectX 11 titles hit, Fermi-based GPUs will be here too.

Originally Posted by Richard Huddy
- NVIDIA is late to deliver DirectX 11 hardware to market.

I agree with Mr. Huddy on this one. We launch later than AMD does in this case. Fermi is the world's first computational GPU architecture, with several world firsts on the GPU. These take time to design and perfect. Do I wish we had Fermi today? Yes. Is Fermi worth the wait? Absolutely!

Originally Posted by Richard Huddy
- If you don't agree with my fourth bullet point above then I'd guess you'd probably argue that AMD should lock DX11 functionality to its own hardware, yes? Something we haven't done!

With your comment about locking DX11, are you trying to suggest that AMD invented DX11 and that it could have been an AMD-only feature? DirectX 11 is a new version of DirectX that will be fully supported by Fermi, as we announced at GTC. It seems that AMD is trying to create the perception that DX11 is an AMD-only feature. It is not.
Long story short - and no argument coming from AMD will change this: NVIDIA loves games. We play games ourselves, and gaming always was and always will be a key area for NVIDIA.
Lars Weinand, NVIDIA
Thank you Lars for your reply. But as a gamer I have to disagree on a few points.
3D Vision is a gimmick which overuses the GPU in an inefficient manner and is only supported by a small fraction of games. Until you make it more accessible you will find very few gamers actually taking up the technology. I prefer to have better colour definition from my monitor, considering I can't get a half-decent IPS panel that supports 120Hz without paying more than I do for the entire rig - and if you look at my signature you will note I pay a considerable amount for said rig.
PhysX, on the other hand, is an interesting and useful technology, but its closed nature - and CUDA's, I might add - means that very few games support it, because developers don't want to limit their user base. That is an important consideration for you too, as it gives gamers less incentive to pay for your technology, which quite frankly is expensive to produce, and thus to sell, compared to AMD's.
The result is that, in order to increase sales, you engage in what I can quite simply call anti-competitive behaviour via vendor lock-in: preventing users from running the brunt of the GPU rendering on a competitor's card, even when they are still happy to pay you for a dedicated PhysX card.
Good luck, but given the recent trend for AMD to release cheaper and faster cards, you will be hard pressed to keep that crown for long. AMD is gaining considerable momentum now, and their R&D department has proved time and time again how innovative they can be. It's only a matter of time before they strip you of this crown for a long period. Consider that, for the moment, the fastest GPUs on the market are AMD's, and that a competing product from you is another quarter away, and it quickly becomes apparent that you do not have this crown you refer to. If you want to keep that crown, release your GTX 300 series next week.
No, you're right, PhysX is not competing, and that is in fact the problem. It's a closed standard, which means that game developers have to take on considerable overheads to actually support it. OpenCL, by definition, is open, and will thus allow developers to support a greater variety of gamers. As I said in a previous post, stop development of PhysX and focus on incorporating PhysX's, and CUDA's, unique features into OpenCL, and open the market up a bit. If your R&D department focuses on actually developing a competitive product, instead of locked-in technology, you may find you can continue to perform well in the market.
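To make the suggestion concrete, here is a minimal sketch of the kind of vendor-neutral GPU physics that OpenCL allows: a simple particle integration kernel that runs on any vendor's OpenCL-capable GPU. The kernel, the Euler update and all names are illustrative assumptions of mine, not code from PhysX, Bullet or any shipping engine, and error checking is omitted for brevity.

[CODE]
// A minimal, vendor-neutral GPU physics step in OpenCL.
// Illustrative sketch only - not code from PhysX, Bullet or any real engine.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

static const char* kSource = R"(
__kernel void integrate(__global float4* pos, __global float4* vel, float dt)
{
    size_t i = get_global_id(0);
    vel[i].y -= 9.81f * dt;      /* constant gravity */
    pos[i]   += vel[i] * dt;     /* explicit Euler step */
}
)";

int main()
{
    const size_t n = 4096;                                    // particle count
    std::vector<float> pos(4 * n, 0.0f), vel(4 * n, 0.0f);
    for (size_t i = 0; i < n; ++i) pos[4 * i + 1] = 10.0f;    // start 10 units up

    // Any vendor's OpenCL GPU implementation will do - that is the point.
    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, nullptr, nullptr);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kern = clCreateKernel(prog, "integrate", nullptr);

    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 pos.size() * sizeof(float), pos.data(), nullptr);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 vel.size() * sizeof(float), vel.data(), nullptr);
    cl_float dt = 1.0f / 60.0f;

    clSetKernelArg(kern, 0, sizeof(cl_mem), &dPos);
    clSetKernelArg(kern, 1, sizeof(cl_mem), &dVel);
    clSetKernelArg(kern, 2, sizeof(cl_float), &dt);
    clEnqueueNDRangeKernel(q, kern, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, dPos, CL_TRUE, 0, pos.size() * sizeof(float), pos.data(),
                        0, nullptr, nullptr);

    printf("particle 0 height after one step: %f\n", pos[1]);
    return 0;
}
[/CODE]

Nothing in that sketch cares which logo is on the card, which is exactly the property a physics middleware aimed at the whole market would want.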
And yet the gamers on this forum have been saying the complete opposite of what your reviews have. Furthermore, Batman AA might be a decent game, but arguing the quality of the authorship of a game has nothing to do with you. What we, as gamers, have a problem with is the locking of anti-aliasing functionality to NVIDIA GPUs. This is wrong, and only goes to show that you are no longer a company that cares about gamers and delivering a quality product; you are starting to turn into a green-eyed monster only interested in money. I for one hope it doesn't go that far, but I'm sure some would argue it already has.
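For anyone wondering what "locking" a feature like AA to one vendor would even look like in code, a check of roughly the following shape is all it takes. The adapter query is a real Direct3D 9 API; the gating policy shown is purely a hypothetical illustration of the practice being criticised, not Eidos' or NVIDIA's actual implementation.

[CODE]
// Hypothetical illustration of vendor-gated in-game AA - NOT actual game code.
// IDirect3D9::GetAdapterIdentifier is a real Direct3D 9 call; the decision to
// expose MSAA only for one VendorId is the policy being criticised above.
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DADAPTER_IDENTIFIER9 id = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

    const DWORD kNvidiaVendorId = 0x10DE;                  // PCI vendor ID for NVIDIA
    bool exposeInGameMsaa = (id.VendorId == kNvidiaVendorId);

    printf("Adapter: %s (vendor 0x%04lX) -> in-game MSAA option %s\n",
           id.Description, id.VendorId,
           exposeInGameMsaa ? "shown" : "hidden");

    d3d->Release();
    return 0;
}
[/CODE]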
It's curious that you neglected to mention that Intel is AMD's biggest competitor in another market sector, CPUs. Proprietary, I agree, is the wrong word; I prefer "closed". Proprietary just means the rights are owned by someone; closed, on the other hand - which is what CUDA and PhysX are - means that only one platform supports the functionality. Havok would become, if developed, a proprietary platform. I'm sure you could buy the rights off Intel as well if you wished to develop it. However, I don't see NVIDIA offering AMD the rights to CUDA or PhysX now, do I?
So basically what you're saying is that you are paying game studios to sport your logo at the start of their games with "nVidia - the way it's meant to be played", and putting cash in their back pocket to add technology like PhysX and CUDA support to their games instead of OpenCL? As a GPU manufacturer you should not need to be actively involved in the software process. Your goal is to provide a platform, and to make that platform as flexible as possible.
What new innovations are you referring to? The only noteworthy changes I see are developments to CUDA, which has very little to do with gaming. If anything, you could have released Fermi and then developed the extra CUDA functionality and released it in your Quadro series, since that is where most of the developments seem targeted.
Until NVIDIA releases a DX11-capable GPU, DX11 is an AMD-only feature.
Lars, you are on the defensive with this posting. You have not posted anything of relevance, just a point-by-point debunking attempting to discredit Richard, not his points.
AMD are trying to stir action from you with their comments, and if all they get back is an advertiser talking NVIDIA up, they have kind of proved their point, haven't they?
The other points have been adequately covered, but let me address your claim that PhysX is 'not competing with other standards'.
You, as a company, are perfectly entitled to create your own technologies and lock them into your own products as a means of maintaining market share. Where Nvidia crosses the line is in disabling PhysX when ATI cards are installed in the system - an action almost without precedent.
If someone wishes to run both DX11 games and PhysX games they would (if it worked) require both an ATI card and an Nvidia card. Given that Nvidia's drivers are artificially disabling PhysX in that configuration, this is a case of PhysX competing with DX11.
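To make that scenario concrete: a machine set up that way exposes two adapters from two different vendors, which is exactly the sort of thing a driver-side check can look for. The sketch below uses the real DXGI adapter enumeration; the idea that "both vendors present" is the trigger is only my assumption about how such a lockout might be keyed, not NVIDIA's actual driver code.

[CODE]
// Sketch: detect a mixed ATI + NVIDIA system by enumerating adapters with DXGI.
// The enumeration API is real; treating "both vendors present" as the trigger is
// only an assumption about how a driver-side PhysX lockout might be implemented.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    bool hasNvidia = false, hasAti = false;
    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        if (desc.VendorId == 0x10DE) hasNvidia = true;   // NVIDIA PCI vendor ID
        if (desc.VendorId == 0x1002) hasAti    = true;   // ATI/AMD PCI vendor ID
        wprintf(L"Adapter %u: %s (vendor 0x%04X)\n", i, desc.Description, desc.VendorId);
        adapter->Release();
    }
    factory->Release();

    if (hasNvidia && hasAti)
        printf("Mixed-vendor system detected - the configuration the PhysX lockout targets.\n");
    return 0;
}
[/CODE]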
Would you care to correct your position?
PK
To call the above "a bit fishy" would be an understatement.
Firstly - I find it hard to believe that, while programming one of the hottest titles of the year, Eidos threw their hands in the air and exclaimed "we cannot add AA to our own game... we had better get the video card manufacturers to do it for us!" It's obviously an everyday occurrence...
Secondly, I find it hard to believe that even IF the above happened, the code you wrote could not have been used for AMD cards as well.
It just screams of a "you scratch my back...." back-hander.
As for the PhysX situation... despicable practice, disabling PhysX when an AMD card is present. Anti-competitive BS that I hope you get dragged through the courts over.
@ Richard Huddy
May I remind you that your former boss Hector Ruiz is under investigation in relation to insider trading (re: the Galleon Group) concerning the GlobalFoundries spin-off. Your company's share price reflects these revelations. So I think it is a bit rich of you to take the moral high ground over Nvidia on gaming issues. Nvidia and Intel have kept PC gaming alive with their developer relations. AMD has only now started to spend money on dev rel, with Dirt 2.
You also need to remember that Nvidia and Intel have a much larger market share than AMD in the GPU and CPU segments, and this is unlikely to shift dramatically in the near future.
I have owned both AMD and Nvidia GPUs, and due to the superiority of TWIMTBP I only use Nvidia GPUs - on an AMD Phenom II X4 955 system with an AMD 790FX motherboard.
Every top game I play, Crysis, Crysis Warhead, Fallout 3, The Witcher, Gears of War, Resident Evil 5, Far Cry 2, Dead Space, Mirror's Edge... All are Nvidia games.
How come AMD could not get even one of those??? Sounds like laziness or sour grapes...
I like AMD CPUs and motherboard logic, and I use them in both my rigs. However, AMD GPUs have no compelling features at all for me.
First off, welcome to HEXUS!
AMD didn't need to do developer relations for Dirt 2, as DX11 is a Microsoft technology. If you read my post above, I was actually arguing that developer relations programmes are bad because they result in anti-competitive behaviour by companies. Hector Ruiz is an unrelated issue, and using him to attempt to discredit Richard or AMD as a whole is below the belt - the same behaviour I noted NVIDIA using. Please stick to the relevant facts.
Irrelevant. Market share does not mean they provide better products, only that they do, or have historically, shifted more volume.
So you are a fan of AMD's CPU technology! Good for you. NVIDIA did make good GPUs in the past, but that is again irrelevant. We are talking about current products and business practices here, not historical ones. If we look at historical products, NVIDIA have done well and produced high-quality products, but recently they have dropped the ball.
NVIDIA does not make games. They make GPUs. The reason the NVIDIA logo is on these games is because NVIDIA paid those companies to display it. This does not in any way mean that the games will play any better on NVIDIA GPUs.
AMD does not need to push their products with every game. They are trying to build a company based upon the technology alone, and in that they are doing very well. They do not need pushy marketing tactics, or to push developers to use closed technologies like NVIDIA does with PhysX.
AMD's GPUs do not need extra "compelling features". They provide all the basics, and they provide them well: DX11, OpenCL, DXVA - all the things a user actually wants in a GPU. 3D Vision, as I explained before, is a gimmick at the moment, with the 3D hardware costing too much; PhysX is a closed standard used in a small proportion of games; and CUDA is used mostly by productivity apps and CAD work, with most of its functionality replicated within OpenCL.
If you actually used AMD's GPU products with the games you mentioned, rather than NVIDIA's, I doubt you would notice much difference, with the exception of Mirror's Edge, which heavily uses PhysX and CUDA. In fact, with most of them I would go so far as to say that with AMD you would have had a better gaming experience.
Hmmmmm, ok. My post was actually @ Richard Huddy, as stated, but what the hey. Can you give conclusive proof of where Nvidia "dropped the ball", as you say? The information about Dr Ruiz is totally relevant, as he brokered the AMD/ATI merger, led AMD into deep debt and was an integral part of the AMD fab spin-off. Any outcome will have an effect on AMD and/or GF whether you like it or not. Nvidia is still a richer company with good cash reserves from their successes during the 8800 GTX era, so I think your hope for their early demise is rather premature...
AMD have severe shortages of the Cypress GPU in the UK and the USA at this time. Well, let me see. ATI Stream is certainly not open for Intel and Nvidia to use, and CAL is proprietary. Heck, even DirectX is proprietary! AMD's GPU business complains a little too much. If they support all games in future like they have supported Dirt 2, then maybe I will change my mind about AMD GPU support.
For the AMD CPU and chipset logic business I have nothing but admiration and respect, especially for my semiconductor hero and inventor of the Athlon 64, AMD CEO Dirk Meyer.