Well, I can't see why ATI would want to screw with them, because it's not like them, and in the end it should let them sell more ATI cards.
I'm sure nVidia might try something dodgy though, since they don't want gamers to use any other cards with their own, and I also think nVidia get an SLI licensing fee from motherboard makers, which I'm sure they won't want to lose.
Your right to freedom of speech is not the issue; it's what you wrote that's at issue. Per-vendor CPU optimisations? Come on, you don't seriously think programmers have nothing better to do than spend all day adding Intel and AMD assembly optimisations to their code, do you? You'll be very lucky to find any who will even bother adding hand-assembled SIMD optimisations to their code, much less *that*.
Welcome to 1969, the compiler is born.
To me it reads that nVidia wrote the code to enable/implement AA in Batman: AA (since AA isn't natively supported by the Unreal Engine). In turn, AMD wanted Eidos to alter that code to work on their cards as well. Eidos pointed out they couldn't do so without infringing nVidia's IP.
Thus, surely AMD should be offering to write their own version of the AA code, rather than trying to 'borrow' the work already done by nVidia?
Prior to this explanation I'd assumed nVidia was in the wrong, unfairly trying to keep the feature exclusive, but I'm now inclined to believe AMD is just trying to confuse matters to avoid the truth (citing an IP legal issue which simply wouldn't apply if they wrote their own code rather than piggybacking on someone else's).
[Obviously, I could be reading this wrong, but from the evidence shown it doesn't appear that way to me.]
You don't think Eidos should have written their own manufacturer-neutral anti-aliasing code then?
I still say it's VERY fishy they couldn't do it on their own... they could write the whole Batman: AA game but couldn't write some anti-aliasing code without getting the video card manufacturers involved?
It does seem that the game developers should be the ones taking the blame here. Why did they not write AA into the game? Why did they ask nVidia to do it? Why did they allow it to be hardware-locked?
nVidia have acted in their own interests. I fail to see why they should be writing code for a competitor.
This is completely separate from the PhysX lockout, which is clearly anti-competitive.
Society's to blame,
Or possibly Atari.
As I said before, we simply do not have enough information. Did AMD attempt to send some AA code to Eidos? Was that code rejected by Eidos? Did the AA code they sent conflict with nVidia's AA code? Was the legal team using the IP as an excuse to let AMD down gently from an AA exclusivity clause that nVidia agreed with Eidos (if such a clause exists)?
We do not have enough information to place blame.
I agree that it should be Eidos who takes the flak for it; they agreed to the licence terms nVidia provided. But they were nVidia's terms, after all. It was either that, or no AA.
Because AA is a DirectX feature, and therefore will run on any DirectX compatible card. It's not as if the AA code bypasses DirectX and talks directly to nVidia's kernel driver, is it?
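For what it's worth, here's a rough sketch (purely illustrative, not the game's actual code) of how 4x MSAA gets requested through plain Direct3D 9. Nothing in it is vendor-specific; the runtime just asks whichever adapter is present whether it supports the sample count:

Code:
#include <windows.h>
#include <d3d9.h>

// Ask the default adapter for 4x multisampling and, if supported,
// fill in the present parameters accordingly. Works the same way on
// any D3D9-capable card that reports support for it.
bool SetupMsaa(IDirect3D9* d3d, D3DPRESENT_PARAMETERS& pp)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8, TRUE /*windowed*/,
        D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);

    if (FAILED(hr))
        return false;                       // no 4x MSAA here: fall back to no AA

    pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
    pp.MultiSampleQuality = qualityLevels ? qualityLevels - 1 : 0;
    pp.SwapEffect         = D3DSWAPEFFECT_DISCARD; // required when multisampling
    return true;
}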
Personally I never use AA, so I wouldn't miss it even if I bought Batman: AA. It's the principle of the thing, more than anything else.
Here's how it's done.
a) Epic Games write the Unreal Engine.
b) They license it to the developer.
c) Developer makes the game. In the meantime, Epic keeps developing the engine, adding features, bugfixes, etc. The whole point of licensing engines is to cut the development time and bring games faster to market than developing a 3D engine of your own.
Could the developer have rewritten the portion of the engine that handles AA, something Epic hasn't done yet? Sure. Would that add month(s) to Batman's development time (i.e. make the game late, with the financial implications you can all imagine)? Sure reloaded. Would nVidia's offer to write a "patch" to enable AA on their cards sound lovely to the developer's ears? Sure - revolutions.
That's the thing: any code written is 85% likely to work on non-NVIDIA hardware, as it would use the DirectX libraries and functions to achieve the AA. If NVIDIA had used some non-standard part of their technology (which they wouldn't need to, since the easiest way to achieve AA is through DirectX), then yes, they would have due cause to lock other vendors out.
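Just to illustrate what "locking other vendors out" could look like in practice, here's a purely hypothetical sketch (not the actual Batman code). The AA path itself would still be standard DirectX; the only vendor-specific bit is a check bolted on top:

Code:
#include <windows.h>
#include <d3d9.h>

// Hypothetical gate: only expose the in-game AA option when the default
// adapter reports NVIDIA's PCI vendor ID (0x10DE). Any other card gets the
// option greyed out, even though the hardware could do the AA just fine.
bool AllowInGameAA(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;

    return id.VendorId == 0x10DE;
}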
This comes down to the little pickle we get nowadays with non-physical media: what can, and can't, be intellectual property? I see this quite often: patents over things that are well established and, in computer terms, truly ancient features, like thumbnails.
How is code, a generic representation of a set of instructions presented in a human-readable format, allowed to be copyrighted? The same code, in a different context, executed in a different environment, can achieve completely different results. You can't copyright lines of code. If, for some reason, the AA code NVIDIA wrote used some new, unique method of achieving AA, then yes, they have a right to protect that IP.
But programmers are lazy; they take the path of least resistance most of the time, so it won't use anything new and exciting to achieve AA. Furthermore, this code was sold to Eidos, including the rights to it. I have seen no contract that states NVIDIA retains the IP rights over the AA code they wrote!
I'm grasping at straws here; the concept of "free information" is a new one to everybody anyway. The world isn't going to change overnight to embrace it either. Our economy is based on the notion of limited resources and placing value on a resource. People are not ready for a world where a piece of information can be replicated millions and millions of times at no cost, and the sad thing is we are getting closer and closer to the point where this becomes a reality. The cost of replicating information is already tiny.
That's the thing, I reckon. Sure, ideally Epic (Unreal Engine) should have written the AA code and made it available to all licensees, but as they haven't (yet), and it's probably uneconomical for Eidos/Rocksteady Studios to do so, it's come down to the GPU makers.
As it appears from the evidence so far (as I read it), nVidia offered code to do this (hence it's been used*) and AMD didn't; they asked for the nVidia code to be adapted, which legally is shaky ground. To me, surely AMD should be offering their own code?
(* If nVidia's anti-aliasing (AA) code artificially locks AA to their own cards, and it's not a technical issue, then it's not nice of them, but financially and business-wise it's understandable and justifiable. It may not be nice, but do AMD have cause to complain? Haven't they just avoided doing some expensive and time-consuming work themselves, and instead made a song and dance about the fact that nVidia bothered to?)
Would be good, but likely uneconomical. Similar has happened before though, so it's not completely discountable (developer extensions to the Unreal or Quake/Doom engines), although the PC market seems a lot less profitable now, which makes it harder to warrant such expense.
True, it's just guesswork and speculation (and we need to bear that in mind).
IMO we need disclosure on a couple of points:
- When did AMD get the chance to offer an AA solution?
- When was it turned down, relative to nVidia's code being adopted?
Additionally, a statement on whether AMD are willing to validate the game as-is and state that it performs acceptably on their hardware (with the proviso that any further updates are ratified by them too).