
Thread: News - AMD exec says NVIDIA neglecting gamers

  1. #49
    Overclocking Since 1988 nightkhaos's Avatar
    Join Date
    Apr 2009
    Location
    Sydney, AU
    Posts
    1,415
    Thanks
    93
    Thanked
    127 times in 106 posts

    Re: News - AMD exec says NVIDIA neglecting gamers

    Quote Originally Posted by Perfectionist View Post
That, and persuading the Batman: Arkham Asylum developers to make it impossible to turn on anti-aliasing unless the graphics card is nvidia, is enough to make me never want to buy from them unless they seriously change their policy and start acting like a responsible company, instead of trying to force competitors out of the market with dirty tricks and treating gamers like ...
I agree, it almost makes me feel annoyed that I got a 295. Then again, if I had waited for the 5850s I would have ended up spending more overall on my system: not only the extra £40 on the GPUs (2x 5850s), but also the increase in RAM costs, etc.
    Desktop (Cy): Intel Core i7 920 D0 @ 3.6GHz, Prolimatech Megahalems, Gigabyte X58-UD5, Patriot Viper DDR3 6GiB @ 1440MHz 7-7-7-20 2T, EVGA NVIDIA GTX 295 Co-Op, Asus Xonar D2X, Hauppauge WinTV Nova TD-500, 2x WD Caviar Black 1TB in RAID 0, 4x Samsung EcoDrive 1.5TB F2s in RAID 5, Corsair HX 750W PSU, Coolermaster RC-1100 Cosmos Sport (Custom), 4x Noctua P12s, 6x Noctua S12Bs, Sony Optiarc DVD+/-RW, Windows 7 Professional Edition, Dell 2408WFP, Mirai 22" HDTV

    MacBook Pro (Voyager): Intel Core 2 Duo @ 2.6GHz, 4GiB DDR2 RAM, 200GB 7200RPM HDD, NVIDIA 8600GTM 512MB, SuperDrive, Mac OS X Snow Leopard, 15.4" Matte Display

HTPC (Delta-Flyer): Intel Core 2 Q8200 @ 2.33GHz, Zotac GeForce 9300-ITX, 2GiB of DDR2 Corsair XMS2 RAM, KWorld PE355-2T, Samsung EcoDrive F2 1.5TB, In-Win BP655, Noctua NF-R8, LiteOn Blu-ray ROM Drive, Windows 7 Home Premium, 42" Sony 1080p Television

    i7 (Bloomfield) Overclocking Guide

    Quote Originally Posted by Spock
    I am not our father.

  2. Received thanks from:

    Perfectionist (02-11-2009)

  3. #50
    Senior Member
    Join Date
    Nov 2005
    Posts
    434
    Thanks
    32
    Thanked
    15 times in 14 posts

    Re: News - AMD exec says NVIDIA neglecting gamers

Will certainly be buying an ATI 5850 as soon as I get the funds together. My past three GPUs have been nVidia, but that's not to say I'm a fanboy. I don't like their attitude now, so I'll be glad to switch; plus, they really are lagging behind ATI at the moment. It seems to me they want to use proprietary technology/software to create closed-system revenues like Apple. But they should take a long, hard look at Sony and see how difficult, pointless and unpopular it is to create proprietary standards.

  4. Received thanks from:

    Perfectionist (02-11-2009)

  5. #51
    Senior Member
    Join Date
    Sep 2009
    Posts
    219
    Thanks
    1
    Thanked
    3 times in 3 posts

    Re: News - AMD exec says NVIDIA neglecting gamers

    AMD sure knows how to throw a punch at other companies, and they've shown it on other occasions too.

  6. #52
    Registered+
    Join Date
    Jul 2009
    Posts
    75
    Thanks
    3
    Thanked
    1 time in 1 post

    Re: News - AMD exec says NVIDIA neglecting gamers

I just don't see the logic... no PhysX when an ATI card is present? You should be thankful they bought an nvidia product, not tell them what they can/can't have in their systems. Freedom of choice. I do wonder what the EU thinks of it, especially considering the IE/Win7 deal.

  7. Received thanks from:

    Perfectionist (02-11-2009)

  8. #53
    Senior Member
    Join Date
    Mar 2009
    Location
    Birmingham
    Posts
    273
    Thanks
    0
    Thanked
    8 times in 8 posts
    • Badbonji's system
      • Motherboard:
      • GIGABYTE G1.Sniper M3
      • CPU:
      • Core i7 3770k 4.5GHz
      • Memory:
      • 16GB TeamElite 1600MHz
      • Storage:
      • 256GB M4 SSD + 150GB Raptor
      • Graphics card(s):
      • GTX 980
      • PSU:
      • Corsair HX850+ Rev. 2
      • Case:
      • Antec 1200
      • Operating System:
      • Windows 8
      • Monitor(s):
      • Samsung 24" 1920x1200 + LG 32" 1080P TV
      • Internet:
      • BT Infinity 2 76/19Mbps

    Re: News - AMD exec says NVIDIA neglecting gamers

It was funny when they asked who has the fastest gaming GPU out, when the GTX295 still beats the HD5870 most of the time. Although I cannot wait for the HD5970 to come out.

  9. #54
    Gentoo Ricer
    Join Date
    Jan 2005
    Location
    Galway
    Posts
    11,048
    Thanks
    1,016
    Thanked
    944 times in 704 posts
    • aidanjt's system
      • Motherboard:
      • Asus Strix Z370-G
      • CPU:
      • Intel i7-8700K
      • Memory:
• 2x8GB Corsair LPX 3000C15
      • Storage:
      • 500GB Samsung 960 EVO
      • Graphics card(s):
      • EVGA GTX 970 SC ACX 2.0
      • PSU:
      • EVGA G3 750W
      • Case:
      • Fractal Design Define C Mini
      • Operating System:
      • Windows 10 Pro
      • Monitor(s):
      • Asus MG279Q
      • Internet:
      • 240mbps Virgin Cable

    Re: News - AMD exec says NVIDIA neglecting gamers

    Quote Originally Posted by Badbonji View Post
It was funny when they asked who has the fastest gaming GPU out, when the GTX295 still beats the HD5870 most of the time. Although I cannot wait for the HD5970 to come out.
Huh? That's a funny comparison, especially when the GTX295 is a twin-GPU card, with a price tag to show for it.
    Quote Originally Posted by Agent View Post
    ...every time Creative bring out a new card range their advertising makes it sound like they have discovered a way to insert a thousand Chuck Norris super dwarfs in your ears...

  10. #55
    Oh Crumbs.... Biscuit's Avatar
    Join Date
    Feb 2007
    Location
    N. Yorkshire
    Posts
    11,193
    Thanks
    1,394
    Thanked
    1,091 times in 833 posts
    • Biscuit's system
      • Motherboard:
      • MSI B450M Mortar
      • CPU:
      • AMD 2700X (Be Quiet! Dark Rock 3)
      • Memory:
      • 16GB Patriot Viper 2 @ 3466MHz
      • Storage:
      • 500GB WD Black
      • Graphics card(s):
      • Sapphire R9 290X Vapor-X
      • PSU:
      • Seasonic Focus Gold 750W
      • Case:
      • Lian Li PC-V359
      • Operating System:
      • Windows 10 x64
      • Internet:
      • BT Infinity 80/20

    Re: News - AMD exec says NVIDIA neglecting gamers

    Quote Originally Posted by aidanjt View Post
Huh? That's a funny comparison, especially when the GTX295 is a twin-GPU card, with a price tag to show for it.
And power consumption.

GPU stands for graphics processing unit, and it refers to the chip inside... the GTX295 is the fastest card, but not the fastest chip, as each of its GPUs is actually more like a GTX260/275.

  11. #56
    Gentoo Ricer
    Join Date
    Jan 2005
    Location
    Galway
    Posts
    11,048
    Thanks
    1,016
    Thanked
    944 times in 704 posts
    • aidanjt's system
      • Motherboard:
      • Asus Strix Z370-G
      • CPU:
      • Intel i7-8700K
      • Memory:
• 2x8GB Corsair LPX 3000C15
      • Storage:
      • 500GB Samsung 960 EVO
      • Graphics card(s):
      • EVGA GTX 970 SC ACX 2.0
      • PSU:
      • EVGA G3 750W
      • Case:
      • Fractal Design Define C Mini
      • Operating System:
      • Windows 10 Pro
      • Monitor(s):
      • Asus MG279Q
      • Internet:
      • 240mbps Virgin Cable

    Re: News - AMD exec says NVIDIA neglecting gamers

Quote Originally Posted by Biscuit View Post
And power consumption.

GPU stands for graphics processing unit, and it refers to the chip inside... the GTX295 is the fastest card, but not the fastest chip, as each of its GPUs is actually more like a GTX260/275.
Indeedy, and saying the HD5870 gets you almost GTX295-level performance for less money and energy consumption is definitely a reason to buy it.
    Quote Originally Posted by Agent View Post
    ...every time Creative bring out a new card range their advertising makes it sound like they have discovered a way to insert a thousand Chuck Norris super dwarfs in your ears...

  12. #57
    Registered User Lars Weinand's Avatar
    Join Date
    Oct 2009
    Posts
    2
    Thanks
    0
    Thanked
    2 times in 2 posts

    Re: News - AMD exec says NVIDIA neglecting gamers

    Hi Richard,

You're trying to tell the world that NVIDIA neglects gamers. As mentioned before: no, we do not. 3D Vision and PhysX are key technologies, focused on gaming. We'll also fully support any GPU-related standard out there. We love anything that brings the GPU forward. Plus, we'll innovate with our own technologies where important.

    Let me respond to your points in detail.

(1) The positive mention of DX11 is a rarity in recent communications from NVIDIA - except perhaps in their messaging that 'DirectX 11 doesn't matter'. For example, I don't remember Jensen or others mentioning tessellation (the biggest of the new hardware features) from the stage at GTC. In fact, if reports are to be trusted, only one game was shown on stage during the whole conference - hardly what I would call treating gaming as a priority!
I highly recommend that anyone interested in GPUs watch the keynotes and sessions from GTC. It was a developer event focused on GPU computing, not an event on gaming, so only some sessions at GTC were about the gaming aspects of GPU computing. Fermi is an awesome graphics processor and we’re confident that it will let us keep the performance crown. We will talk about this side of Fermi very soon. Fermi is an entirely new architecture with many new features specifically for the compute space, so it was important to us that we talk about them first, especially at a conference with a focus on GPU computing. We are on record saying we support DirectX 11. If you look at the agenda, you’ll also find a DX11 workshop was held at GTC, btw, since DX11 also involves GPU computing.

(2) The tech of PhysX has still yet to gain any significant traction. I note from the most recent NPD sales figures that "Batman AA" sits at 96th place in the PC charts, and yet that seems to be NVIDIA's 'showcase' for PhysX. I suspect gaming physics will be better adopted when, as an industry, we move away from the divisive proprietary standards that Lars advocates so heavily. [I note that you mentioned CUDA no fewer than five times - more than any other technology that you chose to mention!]
Regarding your comment about "little traction": PhysX sure has managed to get the attention of AMD and their customers. Physics in games stagnated when it fell to CPU vendors like AMD, and has seen a resurgence since GPU vendors got involved. As long as in-game physics takes a step forward, we are happy, regardless of the path developers choose to get there. We support open standards, plus standards that allow NVIDIA to offer new innovations to customers well in advance of industry standards, such as CUDA C. Our goal is to lead the industry in amazing new directions and create value for our customers, which is exactly what PhysX has done. We believe that innovation is good. Whether innovation comes through DirectX, OpenCL, CUDA C, Bullet or PhysX does not matter to NVIDIA. PhysX is not competing with other standards.

Batman AA received superb reviews, as you can see on Metacritic. Game reviewers and gamers agree that the PhysX effects add a lot of fun to the game. PhysX comes as a free feature for GeForce users, from GeForce 8 onwards. It does not cost a penny extra, and you can turn it off if you do not like it for whatever reason. What is not to like about that? Besides, you should not cast a vote on PhysX based on one title, any more than you should cast a vote on DX11 based on AMD's showcase titles.

I don't quite understand the proprietary argument. AMD was working with the physics engine Havok, which would be a "proprietary" engine by that definition as well, since it is owned by Intel. What's the status there, btw? Is GPU support for AMD GPUs coming to Havok in the near future? Or was it maybe only a single hardcoded demo without the involvement of Havok? Last but not least, CAL was also proprietary as a language by that definition.
Sometimes it is necessary to innovate and invent things that do not yet exist. New technologies are always "proprietary" by nature. It seems to me that, for AMD, proprietary equals “unfair, I don’t have it”.

    (3) There's every reason to believe that NVIDIA is moving its focus away from gaming. I'll list just a few:

    Not making it a priority at GTC is the obvious one.
As mentioned, GTC was a GPU computing conference - fully booked and a great success. More misinformation. Does AMD announcing a new Opteron architecture at IDF (an Intel conference, in a hotel) make you believe AMD would no longer do consumer CPUs? Or, likewise, does AMD talking about CPUs mean they neglect GPUs? Does AMD even hold developer conferences?

    Arguing against the relevance of DX11 is another.
More misinformation, taken out of context. DX11 is a great thing and we are 100% behind it. Anything that makes the PC gaming experience better is a great thing. This is also why we focus on adding things like PhysX and 3D Vision to PC games. We have already stated that our next-generation, Fermi-based GeForce GPU will support DirectX 11, along with PhysX and 3D Vision.

    Arguing, as NVIDIA did, that AMD working with Codemasters to add DX11 to DiRT2 is harming gamers is another.
Sorry, but that is just spreading misinformation. DX11 is a great thing and we are 100% behind it.

NVIDIA's behaviour in locking something as trivial as anti-aliasing to its own hardware (in Batman: Arkham Asylum) shows that NVIDIA cares much more about money than gamers.
Batman AA is not our property; it is owned by Eidos. It is up to Eidos to decide the fate of a feature that AMD refused to contribute to and QA for their customers, not NVIDIA.

If it is relatively trivial, Mr. Huddy should have done it himself. The Unreal engine does not support in-game AA, so we added it and QAed it for our customers. As Eidos confirmed (I'm not allowed to post links here, but check PCper for Eidos' statement), AMD refused the same opportunity to support gamers with AA on AMD GPUs. I'm sure Mr. Huddy knows how important QA is for game developers. I recommend AMD start working with developers to make their hardware work properly; that's not our job. We added functionality for NVIDIA GPUs into the game. We did not lock anything out; AMD just did not do their work. This has happened with previous UE3-engine titles, where ATI owners had to rename the executable to make AA work (Bioshock, for example). NVIDIA is not to blame here.

    AMD is already working with games developers on over 20 forthcoming games which feature DX11 tech. NVIDIA has been nowhere to be seen! And we're doing that while offering the world's best support for DirectX 9, 10 and 10.1 games too!
NVIDIA is actively engaged with every major developer in the world, and we're also working with many smaller, innovative game studios. We support game developers wherever we can, to ensure the best gaming experience on GeForce. When DirectX 11 titles hit, Fermi-based GPUs will be here, too.

    NVIDIA is late to deliver DirectX 11 hardware to market.
I agree with Mr. Huddy on this one: we launch later than AMD in this case. Fermi is the world’s first computational GPU architecture, with several world firsts on the GPU. These take time to design and perfect. Do I wish we had Fermi today? Yes. Is Fermi worth the wait? Absolutely!

If you don't agree with my fourth bullet point above, then I'd guess you'd probably argue that AMD should lock DX11 functionality to its own hardware, yes? Something we haven't done!
With your comment regarding locking DX11, are you trying to suggest that AMD invented DX11 and that it could have been an AMD-only feature? DirectX 11 is a new version of DirectX that will be fully supported by Fermi, as we announced at GTC. It seems that AMD is trying to create the perception that DX11 is an AMD-only feature. It is not.

Long story short, and no argument coming from AMD will change this: NVIDIA loves games. We play games ourselves, and gaming always was, and always will be, a key area for NVIDIA.

    Lars Weinand, NVIDIA
    Last edited by Lars Weinand; 03-11-2009 at 09:03 AM. Reason: typo

  13. Received thanks from:

    PD HEXUS (03-11-2009)

  14. #58
    Overclocking Since 1988 nightkhaos's Avatar
    Join Date
    Apr 2009
    Location
    Sydney, AU
    Posts
    1,415
    Thanks
    93
    Thanked
    127 times in 106 posts

    Re: News - AMD exec says NVIDIA neglecting gamers

Thank you, Lars, for your reply. But as a gamer I have to disagree on a few points.

    Quote Originally Posted by Lars Weinand View Post
    Hi Richard,

You're trying to tell the world that NVIDIA neglects gamers. As mentioned before: no, we do not. 3D Vision and PhysX are key technologies, focused on gaming. We'll also fully support any GPU-related standard out there. We love anything that brings the GPU forward. Plus, we'll innovate with our own technologies where important.
3D Vision is a gimmick which overuses the GPU in an inefficient manner and is only supported by a small fraction of games. Until you make it more accessible, you will find very few gamers actually taking up the technology. I prefer to have better colour definition from my monitor, considering I can't get a half-decent IPS panel that supports 120Hz without paying more than I do for the entire rig; and if you look at my signature you will note that I do pay a considerable amount for said rig.

PhysX, on the other hand, is an interesting and useful technology, but its closed nature (and CUDA's, I might add) means that very few games support it, because doing so limits their user base. That is an important consideration for you too, as it gives gamers less incentive to pay for your technology, which quite frankly is expensive to produce, and thus to sell, compared to AMD's.

The result is that, in order to increase sales, you engage in what I can quite simply call anti-competitive behaviour via vendor lock-in: preventing users from placing the brunt of the GPU rendering on a competitor's card, even when they are still happy to pay you for a dedicated PhysX card.

    Quote Originally Posted by Lars Weinand View Post
I highly recommend that anyone interested in GPUs watch the keynotes and sessions from GTC. It was a developer event focused on GPU computing, not an event on gaming, so only some sessions at GTC were about the gaming aspects of GPU computing. Fermi is an awesome graphics processor and we’re confident that it will let us keep the performance crown. We will talk about this side of Fermi very soon. Fermi is an entirely new architecture with many new features specifically for the compute space, so it was important to us that we talk about them first, especially at a conference with a focus on GPU computing. We are on record saying we support DirectX 11. If you look at the agenda, you’ll also find a DX11 workshop was held at GTC, btw, since DX11 also involves GPU computing.
Good luck, but given the recent trend of AMD releasing cheaper and faster cards, you will be hard pressed to keep that crown for long. AMD is gaining considerable momentum now, and their R&D department has proved time and time again how innovative they can be. It's only a matter of time before they strip you of this crown for a long period. Consider that, for the moment, the fastest GPUs on the market are AMD's, and that a competing product from you is another quarter away, and it's quickly apparent that you do not have this crown you refer to. If you want to keep that crown, release your GTX 300 series next week.

    Quote Originally Posted by Lars Weinand View Post
Regarding your comment about "little traction": PhysX sure has managed to get the attention of AMD and their customers. Physics in games stagnated when it fell to CPU vendors like AMD, and has seen a resurgence since GPU vendors got involved. As long as in-game physics takes a step forward, we are happy, regardless of the path developers choose to get there. We support open standards, plus standards that allow NVIDIA to offer new innovations to customers well in advance of industry standards, such as CUDA C. Our goal is to lead the industry in amazing new directions and create value for our customers, which is exactly what PhysX has done. We believe that innovation is good. Whether innovation comes through DirectX, OpenCL, CUDA C, Bullet or PhysX does not matter to NVIDIA. PhysX is not competing with other standards.
No, you're right, PhysX is not competing, and that is in fact the problem. It's a closed standard, which means that game developers have to take on considerable overhead to actually support it. OpenCL, by definition, is open, and will thus allow developers to support a greater variety of gamers. As I said in a previous post, stop development of PhysX, focus on incorporating PhysX's (and CUDA's) unique features into OpenCL, and open the market up a bit. If your R&D department focuses on actually developing a competitive product, instead of locked-in technology, you may find you can continue to perform well in the market.

    Quote Originally Posted by Lars Weinand View Post
Batman AA received superb reviews, as you can see on Metacritic. Game reviewers and gamers agree that the PhysX effects add a lot of fun to the game. PhysX comes as a free feature for GeForce users, from GeForce 8 onwards. It does not cost a penny extra, and you can turn it off if you do not like it for whatever reason. What is not to like about that? Besides, you should not cast a vote on PhysX based on one title, any more than you should cast a vote on DX11 based on AMD's showcase titles.
And yet the gamers on this forum have been saying the complete opposite of what your reviews have. Furthermore, Batman AA might be a decent game, but the quality of a game's authorship has nothing to do with you. What we, as gamers, have a problem with is the locking of anti-aliasing functionality to NVIDIA GPUs. This is wrong, and only goes to show that you are no longer a company that cares about gamers and delivering a quality product; you are starting to turn into a green-eyed monster interested only in money. I for one hope it doesn't go that far, but I'm sure some would argue it already has.

    Quote Originally Posted by Lars Weinand View Post
I don't quite understand the proprietary argument. AMD was working with the physics engine Havok, which would be a "proprietary" engine by that definition as well, since it is owned by Intel. What's the status there, btw? Is GPU support for AMD GPUs coming to Havok in the near future? Or was it maybe only a single hardcoded demo without the involvement of Havok? Last but not least, CAL was also proprietary as a language by that definition. Sometimes it is necessary to innovate and invent things that do not yet exist. New technologies are always "proprietary" by nature. It seems to me that, for AMD, proprietary equals “unfair, I don’t have it”.
It's curious that you neglected to mention that Intel is AMD's biggest competitor in another market sector: CPUs. "Proprietary", I agree, is the wrong word; I prefer "closed". Proprietary just means the rights are owned by someone; closed, on the other hand (which is what CUDA and PhysX are), means that only one platform supports the functionality. Havok would become, if developed, a proprietary platform, and I'm sure you could buy the rights from Intel as well if you wished to develop it. However, I don't see NVIDIA offering AMD the rights to CUDA or PhysX, now do I?

    Quote Originally Posted by Lars Weinand View Post
NVIDIA is actively engaged with every major developer in the world, and we're also working with many smaller, innovative game studios. We support game developers wherever we can, to ensure the best gaming experience on GeForce. When DirectX 11 titles hit, Fermi-based GPUs will be here, too.
So basically, what you're saying is that you are paying game studios to sport your logo at the start of their games with "nVidia - the way it's meant to be played", and putting cash in their back pockets to add technology like PhysX and CUDA support to their games instead of OpenCL? As a GPU manufacturer you should not need to be actively involved in the software process. Your goal is to provide a platform, and to make that platform as flexible as possible.

    Quote Originally Posted by Lars Weinand View Post
I agree with Mr. Huddy on this one: we launch later than AMD in this case. Fermi is the world’s first computational GPU architecture, with several world firsts on the GPU. These take time to design and perfect. Do I wish we had Fermi today? Yes. Is Fermi worth the wait? Absolutely!
What new innovations are you referring to? The only noteworthy changes I see are developments to CUDA, which have very little to do with gaming. If anything, you could have released Fermi and then developed the extra CUDA functionality and released it in your Quadro series, since that is where most of the developments seem targeted.

    Quote Originally Posted by Lars Weinand View Post
With your comment regarding locking DX11, are you trying to suggest that AMD invented DX11 and that it could have been an AMD-only feature? DirectX 11 is a new version of DirectX that will be fully supported by Fermi, as we announced at GTC. It seems that AMD is trying to create the perception that DX11 is an AMD-only feature. It is not.
Until NVIDIA releases a DX11-capable GPU, DX11 is an AMD-only feature.

Lars, you are on the defensive with this post. You have not posted anything of relevance, just a point-by-point rebuttal that attempts to discredit Richard, not his points.

AMD are trying to stir action from you with their comments, and if all they get is an advertiser talking NVIDIA up, they have kinda proved their point, haven't they?

  15. Received thanks from:

    aidanjt (03-11-2009),Biscuit (03-11-2009),Perfectionist (03-11-2009),Syllopsium (03-11-2009)

  16. #59
    Senior Member
    Join Date
    Sep 2009
    Location
    Bolton
    Posts
    324
    Thanks
    6
    Thanked
    27 times in 23 posts
    • Syllopsium's system
      • Motherboard:
      • D975XBX2
      • CPU:
      • Q6700
      • Memory:
      • 8GB ECC DDR2 667
      • Storage:
      • 500GB
      • Graphics card(s):
      • 8800GTX and 7600GT - four monitors
      • PSU:
      • 600W Seasonic S12
      • Case:
      • Coolermaster Stacker
      • Operating System:
      • Vista x64, OpenBSD
      • Monitor(s):
      • 2 IBM C220p 22" CRT, one 17" VP730 TFT, one Zalman Trimon 19" 3D monitor
      • Internet:
      • 12Mb Be Internet

    Re: News - AMD exec says NVIDIA neglecting gamers

    Quote Originally Posted by Lars Weinand View Post
Regarding your comment about "little traction": PhysX sure has managed to get the attention of AMD and their customers. Physics in games stagnated when it fell to CPU vendors like AMD, and has seen a resurgence since GPU vendors got involved. As long as in-game physics takes a step forward, we are happy, regardless of the path developers choose to get there. We support open standards, plus standards that allow NVIDIA to offer new innovations to customers well in advance of industry standards, such as CUDA C. Our goal is to lead the industry in amazing new directions and create value for our customers, which is exactly what PhysX has done. We believe that innovation is good. Whether innovation comes through DirectX, OpenCL, CUDA C, Bullet or PhysX does not matter to NVIDIA. PhysX is not competing with other standards.
    The other points have been adequately covered, but let me address your claim that PhysX is 'not competing with other standards'.

You, as a company, are perfectly entitled to create your own technologies and lock them into your own products as a means of maintaining market share. Where Nvidia cross the line is in disabling PhysX when ATI cards are installed in the system - an action almost without precedent.

If someone wishes to run both DX11 games and PhysX games, they would (if it worked) require both an ATI and an Nvidia card. Given that Nvidia's drivers artificially disable PhysX in that configuration, this is a case of PhysX competing with DX11.
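For readers wondering how such a lock-out works mechanically: a driver only has to enumerate the display adapters in the system and gate the feature on the vendor IDs it finds. The sketch below is a hypothetical illustration of that logic in Python, not NVIDIA's actual driver code (which lives in the kernel driver); the function name and list are made up for the example, though 0x10DE and 0x1002 are the real PCI vendor IDs for NVIDIA and ATI/AMD.

```python
# Illustrative sketch of a vendor-ID gate like the one described above.
# PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI/AMD (real values);
# the rest of this example is hypothetical.
NVIDIA, ATI = 0x10DE, 0x1002

def physx_allowed(installed_gpu_vendor_ids):
    """Allow GPU PhysX only if an NVIDIA GPU is present AND no ATI GPU is."""
    has_nvidia = NVIDIA in installed_gpu_vendor_ids
    has_ati = ATI in installed_gpu_vendor_ids
    return has_nvidia and not has_ati

# An NVIDIA card on its own: PhysX stays enabled.
print(physx_allowed([NVIDIA]))       # True
# An NVIDIA card dedicated to PhysX alongside an ATI card for rendering:
# the check disables it, which is the behaviour being criticised.
print(physx_allowed([NVIDIA, ATI]))  # False
```

The second case is exactly the mixed configuration described above: the NVIDIA hardware is present and perfectly capable of running PhysX, but the mere presence of a competitor's card flips the check.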

    Would you care to correct your position?

    PK

  17. Received thanks from:

    aidanjt (03-11-2009),nightkhaos (03-11-2009),Perfectionist (03-11-2009)

  18. #60
    Oh Crumbs.... Biscuit's Avatar
    Join Date
    Feb 2007
    Location
    N. Yorkshire
    Posts
    11,193
    Thanks
    1,394
    Thanked
    1,091 times in 833 posts
    • Biscuit's system
      • Motherboard:
      • MSI B450M Mortar
      • CPU:
      • AMD 2700X (Be Quiet! Dark Rock 3)
      • Memory:
      • 16GB Patriot Viper 2 @ 3466MHz
      • Storage:
      • 500GB WD Black
      • Graphics card(s):
      • Sapphire R9 290X Vapor-X
      • PSU:
      • Seasonic Focus Gold 750W
      • Case:
      • Lian Li PC-V359
      • Operating System:
      • Windows 10 x64
      • Internet:
      • BT Infinity 80/20

    Re: News - AMD exec says NVIDIA neglecting gamers

I totally agree with nightkhaos here, good post.

  19. Received thanks from:

    Perfectionist (03-11-2009)

  20. #61
    Anthropomorphic Personification shaithis's Avatar
    Join Date
    Apr 2004
    Location
    The Last Aerie
    Posts
    10,857
    Thanks
    645
    Thanked
    872 times in 736 posts
    • shaithis's system
      • Motherboard:
      • Asus P8Z77 WS
      • CPU:
      • i7 3770k @ 4.5GHz
      • Memory:
      • 32GB HyperX 1866
      • Storage:
      • Lots!
      • Graphics card(s):
      • Sapphire Fury X
      • PSU:
      • Corsair HX850
      • Case:
      • Corsair 600T (White)
      • Operating System:
      • Windows 10 x64
      • Monitor(s):
      • 2 x Dell 3007
      • Internet:
      • Zen 80Mb Fibre

    Re: News - AMD exec says NVIDIA neglecting gamers

    Quote Originally Posted by Lars Weinand View Post
Batman AA is not our property; it is owned by Eidos. It is up to Eidos to decide the fate of a feature that AMD refused to contribute to and QA for their customers, not NVIDIA.

If it is relatively trivial, Mr. Huddy should have done it himself. The Unreal engine does not support in-game AA, so we added it and QAed it for our customers. As Eidos confirmed (I'm not allowed to post links here, but check PCper for Eidos' statement), AMD refused the same opportunity to support gamers with AA on AMD GPUs. I'm sure Mr. Huddy knows how important QA is for game developers. I recommend AMD start working with developers to make their hardware work properly; that's not our job. We added functionality for NVIDIA GPUs into the game. We did not lock anything out; AMD just did not do their work. This has happened with previous UE3-engine titles, where ATI owners had to rename the executable to make AA work (Bioshock, for example). NVIDIA is not to blame here.
    To call the above "a bit fishy" would be a understatement.

    Firstly, I find it hard to believe that while programming one of the hottest titles of the year, Eidos threw their hands in the air and exclaimed "we cannot add AA to our own game... we had better get the video card manufacturers to do it for us!" It's obviously an everyday occurrence...

    Secondly, I find it hard to believe that even if the above happened, the code you wrote could not have been used for AMD cards as well.

    It just screams of a "you scratch my back...." back-hander.


    As for the PhysX situation... despicable practice, disabling PhysX when an AMD card is present. Anti-competitive BS that I hope you get dragged through the courts over.
    Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
    HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
    HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
    Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
    NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
    Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive

  21. Received thanks from:

    nightkhaos (03-11-2009), Perfectionist (03-11-2009)

  22. #62
    Registered User
    Join Date
    Nov 2009
    Posts
    5
    Thanks
    0
    Thanked
    0 times in 0 posts
    • tejas84's system
      • Motherboard:
      • MSI 790FX-GD70
      • CPU:
      • AMD Phenom II X4 955
      • Memory:
      • OCZ DDR3 1333MHz
      • Storage:
      • Samsung 1TB F1
      • Graphics card(s):
      • Nvidia GTX 295
      • PSU:
      • Corsair HX1000W
      • Case:
      • Antec 902
      • Operating System:
      • Windows 7
      • Monitor(s):
      • Dell 2408WFP

    Re: News - AMD exec says NVIDIA neglecting gamers

    @ Richard Huddy

    May I remind you that your former boss Hector Ruiz is under investigation in relation to insider trading (re: the Galleon Group) concerning the GlobalFoundries spin-off. Your company's share price reflects these revelations. So I think it is a bit rich of you to take the moral high ground over Nvidia on gaming issues. Nvidia and Intel have kept PC gaming alive with their developer relations. AMD has only now started to spend money on dev rel, with Dirt 2.

    You also need to remember that Nvidia and Intel have a much larger market share than AMD in the GPU and CPU segments, and this is unlikely to shift dramatically in the near future.

    I have owned both AMD and Nvidia GPUs and, due to the superiority of TWIMTBP, I only use Nvidia GPUs on an AMD Phenom II X4 955 system with an AMD 790FX motherboard.

    Every top game I play: Crysis, Crysis Warhead, Fallout 3, The Witcher, Gears of War, Resident Evil 5, Far Cry 2, Dead Space, Mirror's Edge... all are Nvidia games.

    How come AMD could not get even one of those??? Sounds like laziness or sour grapes...

    I like AMD CPUs and motherboard logic, and I use them in both my rigs. However, AMD GPUs have no compelling features at all for me.

  23. #63
    Overclocking Since 1988 nightkhaos's Avatar
    Join Date
    Apr 2009
    Location
    Sydney, AU
    Posts
    1,415
    Thanks
    93
    Thanked
    127 times in 106 posts

    Re: News - AMD exec says NVIDIA neglecting gamers

    First off, welcome to HEXUS!

    Quote Originally Posted by tejas84 View Post
    @ Richard Huddy

    May I remind you that your former boss Hector Ruiz is under investigation in relation to insider trading (re: the Galleon Group) concerning the GlobalFoundries spin-off. Your company's share price reflects these revelations. So I think it is a bit rich of you to take the moral high ground over Nvidia on gaming issues. Nvidia and Intel have kept PC gaming alive with their developer relations. AMD has only now started to spend money on dev rel, with Dirt 2.
    AMD didn't need to do developer relations for Dirt 2, as DX11 is a Microsoft technology. If you read my post above, I was actually arguing that developer relations are bad because they result in anti-competitive behaviour by companies. Hector Ruiz is an unrelated issue, and using him to attempt to discredit Richard or AMD as a whole is below the belt, the same kind of behaviour I noted NVIDIA using. Please stick to the relevant facts.

    Quote Originally Posted by tejas84 View Post
    You also need to remember that Nvidia and Intel have a much larger market share than AMD in the GPU and CPU segments, and this is unlikely to shift dramatically in the near future.
    Irrelevant. Market share does not mean they provide better products, only that they do, or have historically, shifted more volume.

    Quote Originally Posted by tejas84 View Post
    I have owned both AMD and Nvidia GPUs and, due to the superiority of TWIMTBP, I only use Nvidia GPUs on an AMD Phenom II X4 955 system with an AMD 790FX motherboard.
    So you are a fan of AMD's CPU technology! Good for you. That NVIDIA made good GPUs in the past is again irrelevant. We are talking about current products and business practices here, not historical products. If we look at historical products, NVIDIA have done well and produced high-quality products, but recently they have dropped the ball.

    Quote Originally Posted by tejas84 View Post
    Every top game I play: Crysis, Crysis Warhead, Fallout 3, The Witcher, Gears of War, Resident Evil 5, Far Cry 2, Dead Space, Mirror's Edge... all are Nvidia games.
    NVIDIA does not make games; they make GPUs. The reason the NVIDIA logo appears on these games is that NVIDIA paid those companies to display it. This does not in any way mean that the games will play better on NVIDIA GPUs.

    Quote Originally Posted by tejas84 View Post
    How come AMD could not get even one of those??? Sounds like laziness or sour grapes...

    I like AMD CPUs and motherboard logic, and I use them in both my rigs. However, AMD GPUs have no compelling features at all for me.
    AMD does not need to push their products with every game. They are trying to build a company based on the technology alone, and in that they are doing very well. They do not need pushy marketing tactics, or to push developers to use closed technologies the way NVIDIA does with PhysX.

    AMD does have compelling features. They provide all the basics, and they provide them well: DX11, OpenCL, DXVA, all the things a user actually wants in a GPU. 3D Vision, as I explained before, is a gimmick at the moment, with the 3D technology costing too much; PhysX is a closed standard used in a small proportion of games; and CUDA is used mostly by productivity apps and CAD design, with most of its functionality replicated in OpenCL.

    If you actually used AMD's GPU products with the games you mentioned, rather than NVIDIA's, I doubt you would notice much difference, with the exception of Mirror's Edge, which heavily uses PhysX and CUDA. In fact, with most of them I would go so far as to say that with AMD you would have had a better gaming experience.
    Desktop (Cy): Intel Core i7 920 D0 @ 3.6GHz, Prolimatech Megahalems, Gigabyte X58-UD5, Patriot Viper DDR3 6GiB @ 1440MHz 7-7-7-20 2T, EVGA NVIDIA GTX 295 Co-Op, Asus Xonar D2X, Hauppauge WinTV Nova TD-500, 2x WD Caviar Black 1TB in RAID 0, 4x Samsung EcoDrive 1.5TB F2s in RAID 5, Corsair HX 750W PSU, Coolermaster RC-1100 Cosmos Sport (Custom), 4x Noctua P12s, 6x Noctua S12Bs, Sony Optiarc DVD+/-RW, Windows 7 Professional Edition, Dell 2408WFP, Mirai 22" HDTV

    MacBook Pro (Voyager): Intel Core 2 Duo @ 2.6GHz, 4GiB DDR2 RAM, 200GB 7200RPM HDD, NVIDIA 8600GTM 512MB, SuperDrive, Mac OS X Snow Leopard, 15.4" Matte Display

    HTPC (Delta-Flyer): Intel Core 2 Q8200 @ 2.33GHz, Zotec GeForce 9300-ITX, 2GiB of DDR2 Corsair XMS2 RAM, KWorld PE355-2T, Samsung EcoDrive F2 1.5TB, In-Win BP655, Noctua NF-R8, LiteOn BluRay ROM Drive, Windows 7 Home Premium, 42" Sony 1080p Television

    i7 (Bloomfield) Overclocking Guide

    Quote Originally Posted by Spock
    I am not our father.

  24. #64
    Registered User
    Join Date
    Nov 2009
    Posts
    5
    Thanks
    0
    Thanked
    0 times in 0 posts
    • tejas84's system
      • Motherboard:
      • MSI 790FX-GD70
      • CPU:
      • AMD Phenom II X4 955
      • Memory:
      • OCZ DDR3 1333MHz
      • Storage:
      • Samsung 1TB F1
      • Graphics card(s):
      • Nvidia GTX 295
      • PSU:
      • Corsair HX1000W
      • Case:
      • Antec 902
      • Operating System:
      • Windows 7
      • Monitor(s):
      • Dell 2408WFP

    Re: News - AMD exec says NVIDIA neglecting gamers

    Quote Originally Posted by nightkhaos View Post
    First off, welcome to HEXUS! [snip]

    Hmmmmm, OK. My post was actually @ Richard Huddy, as stated, but what the hey. Can you give conclusive proof of where Nvidia "dropped the ball", as you say? The information about Dr Ruiz is totally relevant, as he brokered the AMD/ATI merger, led AMD into deep debt, and was an integral part of the AMD fab spin-off. Any outcome will have an effect on AMD and/or GF whether you like it or not. Nvidia is still a richer company, with good cash reserves from their successes during the 8800GTX era, so I think your hope of their early demise is rather premature...

    AMD have severe shortages of the Cypress GPU in the UK and the USA at this time. Well, let me see: ATI Stream is certainly not open for Intel and Nvidia to use, and CAL is proprietary. Heck, even DirectX is proprietary! AMD's GPU business complains a little too much. If they support all games in future like they have supported Dirt 2, then maybe I will change my mind about AMD GPU support.

    For the AMD CPU and chipset logic business I have nothing but admiration and respect, especially for my semiconductor hero and inventor of the Athlon 64, AMD CEO Dirk Meyer.
