
Thread: Opinions - A look back at NVIDIA's GTC

  #49
    Arthran
    Look Ma, a Title!
    Join Date: Jun 2009
    Location: Milton Keynes
    Posts: 451
    Thanks: 115
    Thanked: 15 times in 15 posts
    • Arthran's system
      • CPU: i5 3570K
      • Memory: 8GB Corsair Black
      • Storage: 120GB Kingston SSDNow, 4+TB storage
      • Graphics card(s): AMD 7870 2GB GHz
      • Case: NZXT Phantom
      • Operating System: Win 7 Enterprise
      • Monitor(s): HP 24" 1920x1080 + ViewSonic projector
      • Internet: Plusnet Fibre

    Re: Opinions - A look back at NVIDIA's GTC

    Quote Originally Posted by Agent
    Hardly seems that way to me - rather, it's two people putting their points of view across. Certainly nothing wrong with that, and we actively encourage both to participate.

    If you have an issue with the way a thread is heading, give us a shout and we'll look into it. Cheers
    I was merely trying to remind them of civility before it got out of hand. Adult, constructive discussion FTW - I'll agree.
    WoW (Shadowsong): Arthran, Arthra, Arthrun, Amyle (I know, I'm inventive with names)

  #50
    Agent
    HEXUS.social member
    Join Date: Jul 2003
    Location: Internet
    Posts: 19,185
    Thanks: 738
    Thanked: 1,609 times in 1,048 posts

    Re: Opinions - A look back at NVIDIA's GTC

    Quote Originally Posted by Arthran
    merely trying to remind them of civility before it got out of hand,
    That's our job - not because we're trying to stamp our authority over users, but because what happens to be one man's discussion is another man's 'out of hand'. For that reason, if you think things are getting out of hand, give us a shout and we'll decide whether that's the case. Here I don't see an issue, and I really don't want either the nVidia or the ATI side to think they have to 'tone anything down' - their posts are fine.

    Cheers
    Quote Originally Posted by Saracen
    And by trying to force me to like small pants, they've alienated me.

  Received thanks from:

    Arthran (08-10-2009)

  #51
    Arthran
    Look Ma, a Title!
    Join Date: Jun 2009
    Location: Milton Keynes
    Posts: 451
    Thanks: 115
    Thanked: 15 times in 15 posts
    • Arthran's system
      • CPU: i5 3570K
      • Memory: 8GB Corsair Black
      • Storage: 120GB Kingston SSDNow, 4+TB storage
      • Graphics card(s): AMD 7870 2GB GHz
      • Case: NZXT Phantom
      • Operating System: Win 7 Enterprise
      • Monitor(s): HP 24" 1920x1080 + ViewSonic projector
      • Internet: Plusnet Fibre

    Re: Opinions - A look back at NVIDIA's GTC

    Much love, Agent - I'll stop being a mother hen!
    WoW (Shadowsong): Arthran, Arthra, Arthrun, Amyle (I know, I'm inventive with names)

  Received thanks from:

    Agent (08-10-2009)

  #52
    Biscuit
    Oh Crumbs....
    Join Date: Feb 2007
    Location: N. Yorkshire
    Posts: 11,193
    Thanks: 1,394
    Thanked: 1,091 times in 833 posts
    • Biscuit's system
      • Motherboard: MSI B450M Mortar
      • CPU: AMD 2700X (Be Quiet! Dark Rock 3)
      • Memory: 16GB Patriot Viper 2 @ 3466MHz
      • Storage: 500GB WD Black
      • Graphics card(s): Sapphire R9 290X Vapor-X
      • PSU: Seasonic Focus Gold 750W
      • Case: Lian Li PC-V359
      • Operating System: Windows 10 x64
      • Internet: BT Infinity 80/20

    Re: Opinions - A look back at NVIDIA's GTC

    Guess it's a bit late to add...
  #53
    Registered User
    Join Date: Oct 2009
    Posts: 2
    Thanks: 0
    Thanked: 0 times in 0 posts

    Re: Opinions - A look back at NVIDIA's GTC

    Disclaimer: I am an NVIDIA employee.

    Responses to Mercyground's questions in post #45 of this thread (sorry for the delayed response).

    Where is NVIDIA's next generation technology for the gamer?

    ANSWER: We announced our Fermi DirectX 11-capable graphics architecture at our GPU Technology Conference last week. Frankly, Fermi is later than we'd like because it took more time to develop some very significant new features for both the graphics AND compute "personalities" of the chip. We have disclosed the compute features, which you can see in our Fermi Compute Whitepaper (I would post the link, but I'm not up to five posts here yet!). I'd love to be the first to tell you about the cool graphics enhancements beyond just DX11 support, but I can't talk about that right now. Our GeForce announcements are imminent.

    What is NVIDIA's answer to ATI Eyefinity technology?

    ANSWER: With GeForce you would need to use more than one graphics board to support more than two active monitors. If this is a feature our customers want, we will look into adding it to GeForce. Our focus has been on other display technologies, like stereo 3D and our 3D Vision products, but we understand that gaming across many monitors could be a cool addition.

    Why does NVIDIA detect AMD GPUs in Batman: AA and turn off anti-aliasing?

    ANSWER: There was an article at PC Perspective on Oct 5 that addressed this, called "The State of NVIDIA: For Better or Worse" (sorry, I still can't post links). Ryan interviewed Eidos (Batman: Arkham Asylum's publisher), and they said:

    “In the case of Batman's AA support, NVIDIA essentially built the AA engine explicitly for Eidos - AA didn't exist in the game engine before that. NVIDIA knew that this title was going to be a big seller on the PC and spent the money / time to get it working on their hardware. Eidos told us in an email conversation that the offer was made to AMD for them to send engineers to their studios and do the same work NVIDIA did for its own hardware, but AMD declined.”


    Why do new NVIDIA drivers punish AMD GPU owners who want to leverage an NVIDIA card to compute PhysX?

    ANSWER: We're not trying to punish gamers. Our GPU and PhysX drivers are interconnected to optimize performance, and in the future we expect this interdependence to deepen. This alone makes it difficult to support a third-party GPU.

    In order to make sure our customers have a great experience, we QA every release of our PhysX and graphics drivers by testing approximately 14 NVIDIA GPUs for graphics processing with 8 GPUs for PhysX processing, on 6 common platforms, with 6 OSes and 6 combinations of CPU and memory. That is over 24,000 possible configurations. While we don't test every possible combination, the work and cost are substantial. Adding AMD GPUs would significantly increase the necessary work and cost for NVIDIA, so we decided not to support this.
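    For anyone checking that figure, it is simply the product of the test-matrix factors listed above. A minimal sketch in Python, using exactly the numbers quoted (nothing else assumed):

        # Configuration count behind the "over 24,000" QA figure quoted above.
        # All factors are taken directly from the post.
        graphics_gpus  = 14  # NVIDIA GPUs tested for graphics processing
        physx_gpus     = 8   # GPUs tested as dedicated PhysX processors
        platforms      = 6   # common platforms
        oses           = 6   # operating systems
        cpu_mem_combos = 6   # CPU/memory combinations

        # 14 * 8 * 6 * 6 * 6 = 24,192, hence "over 24,000 possible configurations"
        total = graphics_gpus * physx_gpus * platforms * oses * cpu_mem_combos
        print(total)  # 24192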

    AMD does not support PhysX for its customers, and we don't QA this configuration. With no QA, running it is risky, so we removed the capability in a recent driver release.

    The bottom line is that we only support configurations we know work, and it's time for AMD to stop complaining about the work we've done, roll up its sleeves, and start working on its own stuff.

    And my personal questions.

    Why did NVIDIA cover up the laptop GPU issues? (To the point that Apple has a scathing support FAQ stating that NVIDIA lied to them.)

    ANSWER: We've been through this in detail in the past, and it's an old issue now. We quickly worked with our system partners and set aside a whole lot of money to help remedy any problems reported by end users. We have fixed the problem.

    Why kill off your chipset division? NVIDIA has had some amazing chipsets. (SoundStorm rocked - killing that turned me off the NF3. You had the lead in onboard sound and threw it away.)

    ANSWER: Thanks - I love our chipsets too!

    Here is our formal reply on this situation:

    We will continue to innovate integrated solutions for Intel's FSB architecture. We firmly believe that this market has a long, healthy life ahead. But because of Intel's improper claims to customers and the market that we aren't licensed to the new DMI bus, and because of its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs. So, until we resolve this matter in court next year, we'll postpone further chipset investments for Intel DMI CPUs.

    Despite Intel's actions, we have innovative products that we are excited to introduce to the market in the months ahead. We know these products will bring some amazing breakthroughs that will surprise the industry, just as the GeForce 9400M and ION have shaken up the industry this year.

    We expect our MCP business for both Intel and AMD to be strong well into the future.

    Why is SLI artificially restricted? When it was discovered that certain motherboards could run SLI with just dual slots, NVIDIA patched its drivers to disable this. To the consumer, this looks very much like playing the bully.

    ANSWER: Allowing SLI to run on unlicensed systems would be a support nightmare: we would have to test SLI with every motherboard that could conceivably be made to run it. I'm sure you can imagine how adding every possible motherboard combination to our test process would hurt our ability to deliver driver updates in a timely fashion.

    You will notice a recurring theme in many of these answers: NVIDIA innovates in areas like PhysX, SLI, and even adding AA to games that don't natively support it. We can't throw technology over the wall and hope that it works. We want our customers to have a great experience and, in turn, to protect our brand. That requires QA, testing, and support.

    Care to comment on NVIDIA's emasculation of DX10 and the resulting DX10.1? Was it due to NVIDIA not having hardware ready? And in addition to that... the HORRENDOUS mess you guys made of the Vista drivers?

    ANSWER: If you are asking why we now have DX10.1 in certain new GT200-class mainstream chips when we didn't in the past, it's because it's a checkbox feature that was easy to add when we taped out our follow-on GPUs; it was a natural progression for our products. But we maintain that it really doesn't add much value over DirectX 10: there are few games using it, and we can already do much of what DX10.1 provides in our DX10 hardware.

    DX11 is great and we are 100% behind it. Anything that makes PC gaming better is a good thing. We have already stated that our next generation Fermi-based GeForce GPU will support DirectX 11, along with PhysX and 3D Vision.

    ANSWER: Regarding the Vista drivers: we agree we messed up at first. We spent a very long time working on them with MS and still had issues at launch, like many other companies. We worked diligently, and with much end-user support, to find and squash the majority of the bugs. We appreciate our user base helping us here by reporting problems; we had a special site set up just for Vista bug reporting and solutions. Things became more stable after a few months. We have learned from this, and we feel our Win 7 drivers will provide a much better experience for our users.

  #54
    Biscuit
    Oh Crumbs....
    Join Date: Feb 2007
    Location: N. Yorkshire
    Posts: 11,193
    Thanks: 1,394
    Thanked: 1,091 times in 833 posts
    • Biscuit's system
      • Motherboard: MSI B450M Mortar
      • CPU: AMD 2700X (Be Quiet! Dark Rock 3)
      • Memory: 16GB Patriot Viper 2 @ 3466MHz
      • Storage: 500GB WD Black
      • Graphics card(s): Sapphire R9 290X Vapor-X
      • PSU: Seasonic Focus Gold 750W
      • Case: Lian Li PC-V359
      • Operating System: Windows 10 x64
      • Internet: BT Infinity 80/20

    Re: Opinions - A look back at NVIDIA's GTC

    The whole "we removed the capability because we couldn't assure support" line doesn't fly with me... all you have to say is "NVIDIA doesn't OFFICIALLY SUPPORT THIS CONFIGURATION" and goodnight.

    I have no issue with NVIDIA removing the support, but I have massive doubts about the reasons posted; it just sounds like you're covering money-making business tactics with marketing BS. Just admit you don't want users without an all-NVIDIA setup to have that support, and at least you can claim to be honest.

  Received thanks from:

    shaithis (11-10-2009)
