And it's become a bit of a moot point
http://www.ngohq.com/news/16560-patc...s-present.html
Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive
Just wanted to pop in and say: great article, Sylvie
So many in the thread pissed, moaned, and whined about the lack of "technical" speak in the writing, and how it appears "childish" - I would assume these are the same folks who have never picked up a newspaper and read the editorial section, or for that matter, read any form of writing except technical white papers and scientific journals.
I don't think it was intended to be a tech info overload, but rather a nice, light and enjoyable overview of what went on at GTC 2009. You covered the main points of the conference and in the end I felt like I understood exactly what went on at GTC without having to be part of the science club to understand it - it was perfectly summed up in an amusing and entertaining nutshell.
Also - loved the game references... some people lack a sense of humour; maybe it's time they dismounted the horse (or pony, take your pick).
Cheers again
Last edited by FiXT; 07-10-2009 at 06:40 PM.
Disclaimer - I am an NVIDIA employee...
In response to AMD Gary's comments / questions see below....
- Where is this Fermi of which everyone speaks so glowingly?
ANSWER – At GTC we launched the architecture and provided tons of detail, and we said products based on Fermi would be coming out over the next few months.
The GeForce version of Fermi is in bring-up mode right now, and it’s going to be an awesome product that supports PhysX, 3D Vision, CUDA and DX11.
- Given the absence of hardware, how does GPGPU take over scientific computing, much less graphics?
ANSWER - What absence of hardware? Over 150M CUDA-C/DirectCompute/OpenCL-capable NVIDIA GPU solutions are in the market today, with thousands of developers building world-changing applications TODAY. Fermi will accelerate the pace of innovation for GPU computing and provide tremendous speedups in double-precision floating point power, ECC memory protection for improving the reliability of GPU clusters and the accuracy of medical imaging and financial calculations, and C++ support for standards-based software development, to name a few. Check out CUDA Zone to find 600 applications, demos, papers, studies, etc., all around CUDA. For comparison, AMD's Stream Developer Showcase offers 20 postings.
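(An editorial aside on the double-precision point above: none of this code is from the thread. It's a generic, CPU-side Python sketch that simulates 32-bit float rounding with the standard `struct` module, just to illustrate why long accumulations - the kind common in scientific computing - benefit from double precision.)

```python
import struct

def f32(x: float) -> float:
    """Round a native Python float (64-bit) to the nearest 32-bit float."""
    return struct.unpack('f', struct.pack('f', x))[0]

n = 1_000_000
single = 0.0
double = 0.0
for _ in range(n):
    # Round after every operation, mimicking a float32 accumulator.
    single = f32(single + f32(0.1))
    # Native 64-bit accumulation.
    double += 0.1

exact = n * 0.1
print(abs(single - exact))  # float32 drift: large
print(abs(double - exact))  # float64 drift: tiny
```

With a million additions of 0.1, the simulated float32 accumulator drifts off by a visibly large amount, while the float64 sum stays within a small fraction of a unit - the same kind of gap that makes double-precision throughput matter for GPU compute workloads.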
- What is the yield on 40nm, 3-billion-transistor Nvidia products YTD?
ANSWER - Sorry, we do not comment on fab process yields, as AMD knows this is confidential information. Note that AMD also doesn’t comment on their initial 40nm yields for their 2.15B transistor parts.
- What is the TAM for the applications where Fermi will sell its 3 billion transistors?
ANSWER – Very large, but AMD Gary would likely not know these things, given AMD is not seriously engaged in GPU computing.
- Why does the PhysX license specifically exclude it from being used on ATI cards if they so generously offered it to us?
ANSWER – AMD is on record saying they do not support Physx, so that question should be directed at them. If AMD wants to officially support PhysX we’d be happy to work with them to make sure that AMD customers get the same high-quality, trouble-free experience GeForce customers enjoy today. PhysX is our IP available under license, and if AMD is interested in licensing it then they should come talk to us.
The fact is, while AMD has been talking about supporting GPU-accelerated physics for the last 1.5 years, NVIDIA actually went out and did the work for our GeForce customers. It’s time for AMD to stop talking about physics and actually roll up their sleeves and start doing the work or take a license from us.
- Finally, Nvidia can mock the mainstream consumer, enthusiast PC crowd all they want, but I'm not sure it is in their best interests to abandon it.
ANSWER – I don't know what Gary is talking about. We aren't abandoning anyone. We love games -- it's our bread and butter. Also, it's obvious Gary didn't attend GTC or he would have seen our "highly scientific" demo where Jensen launched a bunch of little dudes into a wall to see if we could break it down. Watch the videos on YouTube by searching for "NVIDIA Physx Demo Running in 3D" (sorry, can't post URLs yet in this forum), "NVIDIA GTC 2009 Fermi (GT300) Real Time Physics Demo", or our new fluids demo, "NVIDIA PhysX Fluid Dynamics Demo in 3D". All of these are applicable to games.
And BTW, GTC demos were also shown in full 3D. NVIDIA is moving gaming to the next level. Offering just a few more FPS does not move PC gaming forward. Technologies like 3D Vision and PhysX give you never-before-seen game experiences and make games much more fun to play. The movie industry has already embraced 3D technology. An increasing number of movies are in 3D today, and the latest CG effects in action movies all use massive physics animations. Just look at Transformers 2 or 2012. NVIDIA enables these technologies for PC gamers today. These are not just PowerPoint bullets, but things you can play with today. If I were an AMD customer, I would ask Gary why AMD is not making my gaming experience better with GPU-accelerated physics, stereoscopic 3D technology, and an effective developer relations team that works hand-in-hand with developers to make games more engaging and fun to play.
That is the fundamental difference between NVIDIA and AMD. NVIDIA believes in innovation and believes innovation is good for gamers. We think open standards are great. We also believe in creating standards that allow developers to innovate immediately, such as C for CUDA and PhysX. We want great features to come to games as quickly as possible. If this happens with DirectX, OpenCL, Bullet, or PhysX, it does not matter to NVIDIA. We just want it to happen as fast as possible.
PhysX and CUDA are here TODAY. Today GeForce customers can play Batman: Arkham Asylum with great physics effects, killer stereoscopic 3D, and gorgeous anti-aliasing as a result. And GeForce customers who have tried it think those things are very cool. AMD customers are left listening to AMD create misperceptions about our developer relations team, imply that game developers are unethical or irrelevant, and rant about standards that allow NVIDIA to innovate for our customers.
When Gary says we push proprietary standards over open standards, I'm not sure what he's referring to. Check out the khronos org site's news section, and NVIDIA's developer site, for much info on OpenCL. Also, NVIDIA's own Neil Trevett is the chairman of the Khronos Group that oversees OpenCL.
We support open standards, and standards that allow us to innovate immediately. This allows NVIDIA to give our customers great graphics and innovative features like 3D Vision, PhysX, and CUDA.
Nick
Come now, gentlemen, let's not turn this into a game of "my employer's better than your employer".
Wow (shadowsong): Arthran, Arthra, Arthrun, Amyle (I know, I'm inventive with names)
Agreed. This HD58X0 vs Fermi boards is getting silly with many tech sites apparently taking "sides". Competition between two rival companies is a good and healthy thing, especially for the consumer, but mud slinging (from both sides I see) is really deplorable. Let your products speak for themselves.
Neither Nvidia nor ATI really has a gleaming record:
ATI: No PhysX. PhysX isn't supported by many games, but at least it works; ATI hasn't really implemented an equivalent.
Particularly weak on 3D stereoscopy support (there are solutions that work with ATI, but they're not ATI-native)
Repeated reports of multi-monitor and Vista Aero crashes. Is this fixed?
Crap Linux support
OpenCL is in theory a more open standard than CUDA. In practice, takeup appears to have been limited outside OS X. Anyone know of other supported apps?
Nvidia : No DX11. Not even DX10.1.
Disabling Physx when an ATI card is in the system - utterly unforgivable.
No timetable for new graphics cards.
Has currently lost the performance crown.
Both: SLI and Crossfire restrictions are rubbish. Both could work on other chipsets; vendor lock-in sucks.
Cheating on benchmarks (nothing for a while, fortunately)
Both manufacturers' support is sub-par, IME. ATI didn't want to know in the past when I was trying to run multi-monitor - fixed by Eyefinity now, hopefully. Nvidia don't want to know that CRT monitors still exist and haven't fixed a 100% replicable BSOD on resume from suspend - a bug that has existed since prior to Vista SP1 and persisted through every single driver release since. You report the bug after each driver release and it drops into a black hole.
Currently I think ATI have the lead. 98% of consumers don't care about 3D stereoscopy and the Physx difference is insufficiently dramatic for it to be a huge factor. CUDA/OpenCL shows a lot of promise but there's no real killer app for most people. Eyefinity doesn't matter for the vast majority of people who don't have room for more than two monitors. DX11 isn't well supported, yet, but offers decent future proofing and the games are trickling in.
PK
Last edited by Syllopsium; 08-10-2009 at 11:14 AM.
Almost perfectly put imo
The above comment wasn't from my PoV, but from what I've seen of general public opinion.
ATI have been playing catch-up for the last few generations; now ATI have the lead and Nvidia are playing catch-up. Combined with how badly Nvidia USED to steamroll ATI on performance, this has left a lot of people wondering wth Nvidia are doing when they have a) lost their lead in release dates, b) lost their lead in performance, c) been completely trounced on performance vs $, and d) couldn't even get a working model of their new architecture to their own company conference.
At the moment, after the state of the last 4 Nvidia cards I've owned, I won't buy Nvidia (reliability fail), but I'm really looking forward to their new range to see what happens; competition in the market can only be a good thing in my opinion.
Yes, it's a moot point now that it's been patched by the community.
No, it's NOT a moot point that NV deliberately did it. It is yet another mark against a company that is making me wonder if I can trust them.
The sweep-it-under-the-rug affair that was the laptop GPU failures (and desktop GPU failures) soured my opinion of NV. Their cavalier "it's out of warranty... go buy another" attitude, when they KNEW it was their faulty chips, has pissed a lot of customers off. If they are in the black so much, why didn't they just admit it, suck up the cost and replace the laptops? That's the sign of a reputable company.
The PhysX issue is just another dominance play. (And to my mind? Exactly the same sort of behaviour that got MS into monopoly hot water.)
Oh, I completely agree that nVidia's tactics are underhanded and have been for years, but there is no point getting angry over it when you can't make them change and there is a generic patch available for it.
I would love to know how much nVidia expected from ATI/AMD for licensing CUDA... I can only imagine it was too much for ATI to stomach.
Nick. Welcome to Hexus.
Kyle from HardOCP posed these queries on HardOCP.com. Care to comment?
And my personal questions. Where is NVIDIA's next generation technology for the gamer?
What is NVIDIA's answer to ATI Eyefinity technology?
Why does NVIDIA detect AMD GPUs in Batman: AA and turn off AntiAliasing?
Why do new NVIDIA drivers punish AMD GPU owners who want to leverage an NVIDIA card to compute PhysX?
Why did Nvidia cover up the laptop GPU issues? (To the point that Apple has a scathing support FAQ which states that NV lied to them)
Why kill off your chipset division? Nvidia has had some amazing chipsets. (SoundStorm rocked. Killing that turned me off the NF3. You had the lead in onboard sound and threw it away.)
Why is SLI artificially restricted? When it was discovered that certain motherboards could run SLI with just dual slots, NV patched its drivers to disable this. To the consumer this looks very much like playing the bully.
Care to comment on Nvidia's emasculation of DX10 and the resulting DX10.1? Was it due to NV not having hardware ready? And in addition to that, the HORRENDOUS mess you guys made of the Vista drivers?
To those who may perceive I'm "bashing" NV:
I'm not wanting NV drummed out of the market. A monopoly of any kind is no good for the market. (Intel's gfx don't count. Gaming on GMA? Please don't make me laugh.)
What I am objecting to is NV's poor handling of its community base and the screw-ups that have caused a previously good execution to fail. NV appears to be copying Intel with their P4 fiasco.
And please... 600 quid for a card? That is definitely excessive.
It's not a solution, because Nvidia might block the workaround. Way back in the 84.xx driver days, that other artificially limited technology (SLI) was re-enabled in hacked drivers. Nvidia changed their drivers to prohibit that, and it's only recently that people have found workarounds again.
Now, the key difference is that SLI never worked on Crossfire motherboards to start, even when they were fully capable of doing so. It wasn't a case of it working and then being stopped from working.
Of course ATI isn't innocent in the SLI game, seeing as Crossfire seems to be similarly artificially limited. I find it immensely annoying, but at least I knew of the situation when I chose my crossfire motherboard and bought two Nvidia cards to go with it.
PK