I believe matty was pointing out that if NVidia enabled PhysX support by default with ATI cards in the system, there wouldn't be a registry hack to blame.
But it wouldn't even take a registry hack.
All it needs is a single boolean option in the graphics card properties: "Enable hardware PhysX support". Its default is enabled when the card detects it's in an all-NVidia system, and disabled when it detects the presence of another vendor's graphics card. Users can simply go into the card's properties dialogue (or whatever NVidia's equivalent of Catalyst Control Centre is - and would you believe I own 2 NVidia graphics cards?) and tick the checkbox if they want PhysX support alongside their HD 5850.

A warning box pops up saying "We have detected that you've made a pact with the devil and bought an ATI graphics card - if you use PhysX your computer is likely to explode and we won't take any responsibility for it! Are you sure you want to enable PhysX?"* and if the user clicks OK then it's reasonable to assume they know what they're getting themselves into. NVidia are covered from a QA point of view, and I can buy a cheap DDR2 GT220 to process all those pretty PhysX effects.

It'd be pretty simple to implement - there's a rough sketch of the detection logic below - and it would make a *lot* of people who currently dislike NVidia feel somewhat more inclined to buy their hardware. I don't get how they can think their current approach is good business practice...
*This is a suggested wording only, and may be changed in the final implementation.
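For what it's worth, here's a minimal C++ sketch of that default-state logic. Every name in it (GpuVendor, GpuInfo, default_physx_enabled and so on) is invented for the example - none of this is a real NVidia driver API:

[code]
#include <cstdio>
#include <vector>

// Hypothetical vendor enum and adapter record - illustrative only.
enum class GpuVendor { Nvidia, Ati, Other };

struct GpuInfo {
    GpuVendor vendor;
};

// Default state for the "Enable hardware PhysX support" checkbox:
// ticked in an all-NVidia system, unticked (but still available
// behind the warning dialogue) when another vendor's card is found.
bool default_physx_enabled(const std::vector<GpuInfo>& gpus) {
    for (const auto& gpu : gpus) {
        if (gpu.vendor != GpuVendor::Nvidia) {
            return false;  // mixed-vendor system: off by default
        }
    }
    return true;  // all-NVidia system: on by default
}

int main() {
    // e.g. an HD 5850 for rendering plus a GT220 for PhysX
    std::vector<GpuInfo> gpus = { {GpuVendor::Ati}, {GpuVendor::Nvidia} };
    std::printf("PhysX default: %s\n",
                default_physx_enabled(gpus) ? "on" : "off (tick the box)");
}
[/code]

The point being: one loop over the installed adapters decides the checkbox default, and the user can still override it after clicking through the warning.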
It's all moot anyway. There is absolutely no reason for nVidia to disable it for QA reasons.
If there was ever an issue, it would be a core driver issue causing clashes, which would have to be rectified anyway, PhysX or no PhysX.
Funny how they ONLY disable CUDA/PhysX. They still let you use the nVidia card alongside your ATI card for other tasks... why isn't that also disabled for "QA reasons"?
Answer: CUDA is not disabled for QA reasons... what a shocker!
Yeah, that's what I was implying.
nVidia could have used a registry setting instead - one that was easy to hack, so people could use it while it remained evidently unsupported. Or equally they could've had an "unsupported" tickbox as scaryjim suggested.
Instead they've been trying to prevent people from using ATI cards with the vigour of SecuROM preventing piracy, which implies to me that "unsupported" really isn't the crux of the issue here.
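To illustrate the registry idea: on Windows the driver could check a single DWORD at startup, along these lines. The key path and value name below are invented for the example - they're not the real locations nVidia's driver uses:

[code]
#include <windows.h>
#include <cstdio>

// Reads a hypothetical "enable PhysX alongside a third-party GPU"
// override. The key path and value name are made up for this
// example - they are NOT the real driver's registry locations.
bool physx_override_enabled() {
    DWORD value = 0;
    DWORD size = sizeof(value);
    LSTATUS rc = RegGetValueA(HKEY_LOCAL_MACHINE,
                              "SOFTWARE\\ExampleVendor\\PhysX",
                              "EnableWithThirdPartyGpu",
                              RRF_RT_REG_DWORD, nullptr,
                              &value, &size);
    return rc == ERROR_SUCCESS && value != 0;
}

int main() {
    std::printf("Hardware PhysX override: %s\n",
                physx_override_enabled() ? "enabled" : "disabled");
}
[/code]

Something that trivial to read would cost nothing to leave in as an unsupported escape hatch, which rather undermines the QA excuse.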