That's our job - not because we are trying to stamp our authority over users, but because what happens to be one man's discussion is another man's 'out of hand'. For that reason, if you think things are getting out of hand, give us a shout and we'll decide if that's the case or not. Here I don't see an issue, and I really don't want either nVidia or ATI to think they have to 'tone anything down' - their posts are fine.
Cheers
Arthran (08-10-2009)
Much love Agent, I'll stop being a mother hen!
WoW (Shadowsong): Arthran, Arthra, Arthrun, Amyle (I know, I'm inventive with names)
Agent (08-10-2009)
Disclaimer - I am an NVIDIA employee --
Response to Mercyground's questions in thread post #45 (sorry for delayed response).
Where is NVIDIA's next generation technology for the gamer?
ANSWER: We announced our Fermi DirectX 11-capable graphics architecture at our GPU Tech Conference last week. Frankly, Fermi is later than we’d like because it took more time to develop some very significant new features for both the graphics AND compute “personalities” of the chip. We have disclosed the compute features, which you can see in our Fermi Compute Whitepaper. I would post the link, but I’m not up to 5 posts here yet! I would love to be the first to tell you about the cool graphics enhancements beyond just DX11 support, but I can’t talk about that right now. Our GeForce announcements are imminent.
What is NVIDIA's answer to ATI Eyefinity technology?
ANSWER: With GeForce you would need to use more than one graphics board to support more than two active monitors. If this is a feature our customers want, we will look into adding it for GeForce. Our focus has been on other display technologies like 3D stereo and our 3D Vision products, but we understand multi-monitor gaming with many monitors could be a cool addition.
Why does NVIDIA detect AMD GPUs in Batman: AA and turn off AntiAliasing?
Answer: There was an article at PC Perspective on Oct 5 that addressed this. It was called “The State of NVIDIA: For Better or Worse” (sorry, can’t post links here still). Ryan interviewed Eidos (Batman: Arkham Asylum publisher) and they said…
“In the case of Batman's AA support, NVIDIA essentially built the AA engine explicitly for Eidos - AA didn't exist in the game engine before that. NVIDIA knew that this title was going to be a big seller on the PC and spent the money / time to get it working on their hardware. Eidos told us in an email conversation that the offer was made to AMD for them to send engineers to their studios and do the same work NVIDIA did for its own hardware, but AMD declined.”
Why do new NVIDIA drivers punish AMD GPU owners who want to leverage an NVIDIA card to compute PhysX?
Answer: We’re not trying to punish gamers. Our GPU and PhysX drivers are interconnected to optimize performance. In the future we expect this interdependence to deepen. This alone makes it difficult to support a third party GPU.
In order to make sure our customers have a great experience, we QA every release of our PhysX and graphics drivers by testing approximately 14 NVIDIA GPUs for graphics processing with 8 GPUs for PhysX processing on 6 common platforms with 6 OSes, using 6 combinations of CPU and memory. That is over 24,000 possible configurations. While we don’t test every possible combination, the work and cost is substantial. Adding AMD GPUs would significantly increase the necessary work and cost for NVIDIA, so we decided not to support this.
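For anyone curious where the "over 24,000" figure comes from, here is a quick back-of-the-envelope check, simply multiplying the numbers quoted above (illustrative only; the real QA matrix obviously isn't public):

    # Rough combination count from the figures quoted in the post above.
    # Variable names are mine, not NVIDIA's; this just multiplies the stated numbers.
    graphics_gpus = 14       # NVIDIA GPUs tested for graphics processing
    physx_gpus = 8           # NVIDIA GPUs tested for PhysX processing
    platforms = 6            # common platforms
    operating_systems = 6    # OSes
    cpu_memory_combos = 6    # combinations of CPU and memory

    total = graphics_gpus * physx_gpus * platforms * operating_systems * cpu_memory_combos
    print(total)             # 24192 - i.e. "over 24,000 possible configurations"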
AMD does not support PhysX for their customers, and we don’t QA this configuration. With no QA, it is risky to run this configuration so we removed this capability in a recent driver release.
Bottom line is that we only support configurations that we know work, and it’s time for AMD to stop complaining about the work we’ve done and roll up their sleeves and start working on their own stuff.
And my personal questions.
Why did Nvidia cover up the laptop GPU issues? (To the point that Apple has a scathing support FAQ which states that NV lied to them.)
Answer: We’ve been through this in detail in the past. It is an old issue now. We quickly worked with our system partners and set aside a whole lot of money to help remedy any problems reported by end users. We have fixed the problem.
Why kill off your chipset division? Nvidia has had some amazing chipsets. (SoundStorm rocked. Killing that turned me off the NF3. You had the lead in onboard sound and threw it away.)
Answer: Thanks, I love our chipsets too!
Here is our formal reply on this situation:
We will continue to innovate integrated solutions for Intel’s FSB architecture. We firmly believe that this market has a long, healthy life ahead. But because of Intel’s improper claims to customers and the market that we aren’t licensed to the new DMI bus, and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs. So, until we resolve this matter in court next year, we’ll postpone further chipset investments for Intel DMI CPUs.
Despite Intel's actions, we have innovative products that we are excited to introduce to the market in the months ahead. We know these products will bring with them some amazing breakthroughs that will surprise the industry, just as GeForce 9400M and ION have shaken up the industry this year.
We expect our MCP business for both Intel and AMD to be strong well into the future.
Why is SLI artificially restricted? When it was discovered that certain motherboards could run SLI with just dual slots, NV patched drivers to disable this. To the consumer this looks very much like playing the bully.
Answer: Allowing SLI to run on unlicensed systems would be a support nightmare and would mean that we would have to test SLI with every motherboard solution that could conceivably be made to run with SLI. I’m sure you can imagine how this would negatively impact our ability to deliver driver updates in a timely fashion if testing SLI on every possible motherboard combination was added to our test process.
You will notice a recurring theme in many of these questions. NVIDIA innovates in areas like PhysX, SLI and even adding AA to games that don’t natively support it. We can’t throw technology over the wall and hope that it works. We want our customers to have a great experience, and in turn protect our brand. This requires QA, testing and support.
Care to comment on NVidia's emasculating of DX10 and the resulting DX10.1? Was it due to NV not having hardware ready? And in addition to that... the HORRENDOUS mess you guys made of the Vista drivers?
Answer: If you are asking why we now have DX 10.1 in certain new GT200-class mainstream chips and we didn’t in the past, it’s because it’s a checkbox feature that was easy to add when we taped out our follow-on GPUs. It was a natural progression for our products. But we maintain it really doesn't add much value over DirectX 10: there are few games using it, and we can already do much of what DX10.1 provides in our DX10 hardware.
DX11 is great and we are 100% behind it. Anything that makes PC gaming better is a good thing. We have already stated that our next generation Fermi-based GeForce GPU will support DirectX 11, along with PhysX and 3D Vision.
Answer: Regarding Vista drivers. We agree we messed up at first. We spent a very long time with MS working on them, and still had issues at launch, like many other companies. We worked diligently, and with much end-user support, to find the majority of bugs and squash them. We appreciate our user base for helping us here by reporting problems. We had a special site set up just for Vista bug reporting and solutions. Things became more stable after a few months. We have learned from this and feel our Win 7 drivers will provide a much better experience for our users.
The whole 'we removed the capability because we couldn't assure support' doesn't fly with me... all you have to say is Nvidia doesn't OFFICIALLY SUPPORT THIS CONFIGURATION and goodnight.
I have no issues with Nvidia removing the support, but I have massive doubts about the reasons posted; it just sounds like you are covering money-making business tactics with marketing BS. Just admit you don't want non-full-Nvidia users to have that support, and at least you can claim to be honest.
shaithis (11-10-2009)