@Novo: Absolutely agreed here.
And as I said, they should have pushed Epic for AA code. It's Epic who they licensed the engine from, after all.
In case anyone missed it behind the page jump on the news post, the infamous emails:
http://www.hexus.net/content/item.php?item=20991&page=2
Also nice to see Hexus got people titled and logo'd up etc. (people usually can't do that until some number of posts, I think; never bothered checking). Some people elsewhere mentioned they weren't sure whether the posters were impostors, so I think we can take that as Hexus having verified their identities.
Personally I can't see why nvidia would abandon the gaming market; why stop making products for your biggest market during an economic downturn? Kudos, however, goes to AMD for getting the DX11 cards out in time for Windows 7. They do have a monopoly on it at the minute, though, which will keep prices high until the GTX 3xx cards come out next year. At the end of the day most people aren't fussed about which brand of card they get; all I want is the best card on the market at the best possible price, and I'm not fussed who makes it. Bring on the competition!
I thought this too.
However the developers may have already tried this... only for Epic to say, "No". The evidence for this is lacking so I skirted round it.
I also thought that developers may have considered a different game engine, but there's a whole world of reasons why they wouldn't, e.g. cost, skills, development time, etc.
It's not just AMD having a monopoly on DX11 cards that is keeping the price high; there are other factors involved as well. Demand is higher than supply right now (with the big OEMs gobbling them all up), and frankly, nvidia don't have a comparable product which performs as well.
When supply and demand balance out, and nVidia gets a comparable product (on standard features and performance) out the door, the price will start coming down. That's just how the market works, and why I rarely do the 'early adopter' thing.
What does NVIDIA have to apologize for?
Working with Eidos before the game launched to ensure their customers got to enjoy the game with anti-aliasing? Adding amazing PhysX effects to the game that make it look 100% better? It looks to me like the only people NVIDIA owe an apology to are ATi, for making their developer relations and gaming features look so utterly lackluster.
I agree odds are slim that ATi didn't know Batman AA would be a UE3 engine game they'd have trouble with, but the alternative is that they knew and ignored the situation. Is the latter more flattering? I think not; either way, their lack of concern for their customers' gaming experience is troubling at best.
Didn't they care enough to spend the time and money working with Eidos to give their customers a modern gaming experience?
Well, I guess they have some concern. Catalystmaker tweets about the location of some instructions to hack the game for antialiasing on Twitter:
http://twitter.com/CatalystMaker
Not quite the same as working with the developers, but ATi customers have a lot of experience re-naming executables and the like.
It doesn't cost extra money to be hardware agnostic. On the contrary, it costs significantly less money. See Linux, for example.
Oh, and yes, things cost money, so expecting consumers to buy both AMD and nVidia cards to play any games they may buy is absurd. And it's even more absurd to think that they should have to swap cards out for each game they start in order for things like PhysX to work.
Wow. So NVidia make AMD drivers for Linux? Or AMD make NVidia drivers?
NVidia have written some standards-compliant AA code. That's great. However, NVidia have neither the resources nor the inclination to test code that they have written for their hardware against a competitor's product. They have written their code, tested it on their products, optimised it for those products, and included those optimisations in the code itself.
The code probably looks something like:
Code:
switch (CardID) {
    case Fermi:
    case GT2*0:   AA = TRSSAA; Max_AA = 16x; break;
    case 9*00:    AA = TRMSAA; Max_AA = 16x; break;
    case 8*00:    AA = MSAA;   Max_AA = 4x;  break;
    default:      AA = None;   Max_AA = 0x;  break;
}
RockSteady/Eidos are unwilling to make changes to the code, because it was given to them by NVidia; doing so would be shaky, at best, from a legal standpoint.
I guess the main question here is "Why do AMD/ATI feel entitled to take advantage of work that NVidia has put in?" RockSteady/Eidos aren't saying "No AA for AMD/ATI in Batman", they're saying "Give us the code to make it work, without infringing on the IP/Copyright of the code NVidia has given us (In other words, no Modifying NVidia's code), and we'll add it in so Batman has Anti-Aliasing on all platforms".
I believe the appropriate Linux term, as you seem so enamoured of that field of code development, is "CLOSED: Invalid Ticket (Feature request without patch)".
What? Where the hell did I suggest a vendor should be writing drivers for hardware that doesn't belong to them? The whole point of DX/OpenGL/etc. is that application developers *don't* need to add per-hardware-vendor special cases in their software. The rest of your comments are equally meaningless drivel once you fail to understand this ridiculously simple concept.
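Just to illustrate what I mean, here's a rough sketch of my own (not anything from the Batman code; the function name and the top-down loop are made up for illustration): with D3D10 you ask the device what it can do instead of switching on the vendor's card ID, and the same code path works on any DX10-class part.

Code:
#include <d3d10.h>

// Ask the D3D10 device how many MSAA quality levels it supports for a given
// render-target format, and return the highest usable sample count.
// No vendor IDs anywhere -- the same check works on any DX10-class card.
UINT PickMaxMsaaSamples(ID3D10Device* pDevice, DXGI_FORMAT fmt)
{
    for (UINT samples = 16; samples >= 2; samples /= 2)
    {
        UINT qualityLevels = 0;
        if (SUCCEEDED(pDevice->CheckMultisampleQualityLevels(fmt, samples, &qualityLevels))
            && qualityLevels > 0)
        {
            return samples;  // highest sample count the hardware reports as usable
        }
    }
    return 1;  // no MSAA support for this format
}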
Epic strawman.
The point that you seem to be missing is that the AA code in Batman was provided to them as an enhancement to UE3, and it was provided by NVidia. NVidia have written the code (it appears) in a standards-compliant way, but just because their implementation is standards-compliant doesn't mean the hardware it's going to run on implements those same standards correctly. Therefore, NVidia have locked the code to their cards, as that is all they have available to code for. If NVidia's code were unlocked for AMD/ATI or anyone else to use, and there was a problem with it, Rocksteady would be complaining to NVidia and requiring them to fix their code for another vendor's product, which would mean NVidia shelling out money from their own pockets to patch, test, QA and release code for a competitor.
It's like IBM having to patch their Lotus Notes client to overcome a quirk in Microsoft Exchange's POP3 implementation. It doesn't happen. Thus IBM don't support Exchange, and NVidia should not have to support AMD/ATI.
That's a load of nonsense. If it's standards compliant code, then it will work. If it doesn't work, then it's not standards compliant code, is it?
And again, it should have been Epic who did the AA work in the first place. It's their engine, they know it well, Eidos licensed it from them, and both of them would also benefit from fixing the AA fiasco.
The way I see it, MSAA in DX10 mode (as Huddy clearly states in his e-mail subject line) should work, because AFAIK it's part of the specs of Unreal Engine 3.5 (aka UE3 DX10), and it is definitely part of the DX10 spec. UE3 DX9 does not support AA with deferred shadows (DPR+AA, or essentially deferred pixel/shadow HDR+AA) because of DX9 limitations (no DPR+AA in the API), but DX10 does, and UE3.5 supports DX10. Nvidia can tout their DX9 workaround that ATi doesn't have but could support in hardware (ATi since the X1000 series, nvidia since the 8000 series); that's fine and lovely. Good for them. In DX9 mode.
Remember the X1000 series pushing HDR+AA that nVIDIA couldn't do on the 7000 series? This is essentially that, as the deferred shadows in UE3 (which the DX9 API can't support simultaneously with AA) are HDR. That's why this only works on 8000-series and newer nvidia hardware. Huddy knows this. ATi should be able to force AA through Catalyst, just like nvidia does with the essentially built-in forcing workaround they've used in the past (which Huddy also mentions). Granted, if it's to be supported in-game, ATi should provide or help write that code... for DX9.
This game runs in DX10 mode. ATi supports DX10. DX10 will do DPR+AA. ATi should be supported. End of conversation.
Like Huddy says in his e-mail, they are likely using the DX10 codepath for AA...Why are they locked out? Because nvidia enabled AA in DX9 mode through forcing hax and built it into the game GUI? This should not transfer over to DX10. That's just BS on all fronts, any way you cut it.
This is clearly what can only be described as TWIMTBP douchebaggery of EPIC proportions, or some very strange misunderstanding on EIDOS' and Rocksteady's part. You'd think devs would know better?
Huddy is confused as to what nVIDIA actually did, and so am I. I am willing to bet nVIDIA did jack shat to enable AA in DX10 mode, because DX10 supports deferred rendering + AA in the effing API spec, whereas DX9 (and nvidia's DX9 cards) does not. Nvidia did not create the DX10 spec, nor the engine's ability to support the DX10 API. Further, why should they care if someone forces it through Catalyst in DX9 mode? Obviously it wouldn't be supported, but locking it out completely in both modes can only be described as a dev super fail.
Got it, Eidos/Rocksteady? DX10: yes. DX9: if forced through Catalyst, or if ATi helps with code for in-game support (which could be supported back to the X1000 series... although it would probably run very badly).
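For anyone curious, here's a rough C++ sketch of my own (not from UE3 or the game; the function name and format choice are made up) of the DX10 capability I'm on about: D3D10 lets you create a multisampled render target that can also be bound as a shader resource and read per sample in the shader, which is exactly what deferred rendering + AA needs and exactly what the DX9 API won't let you do.

Code:
#include <d3d10.h>

// Create a multisampled, HDR-capable render target that can also be bound as a
// shader resource (readable per-sample via Texture2DMS in HLSL). DX9 never
// allowed sampling a multisampled render target like this, hence no DPR+AA there.
ID3D10Texture2D* CreateMsaaGBufferTarget(ID3D10Device* pDevice,
                                         UINT width, UINT height, UINT samples)
{
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width              = width;
    desc.Height             = height;
    desc.MipLevels          = 1;                               // MSAA textures have a single mip
    desc.ArraySize          = 1;
    desc.Format             = DXGI_FORMAT_R16G16B16A16_FLOAT;  // HDR-friendly format
    desc.SampleDesc.Count   = samples;                         // e.g. 4 for 4x MSAA
    desc.SampleDesc.Quality = 0;
    desc.Usage              = D3D10_USAGE_DEFAULT;
    desc.BindFlags          = D3D10_BIND_RENDER_TARGET | D3D10_BIND_SHADER_RESOURCE;

    ID3D10Texture2D* pTex = NULL;
    if (FAILED(pDevice->CreateTexture2D(&desc, NULL, &pTex)))
        return NULL;
    return pTex;
}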
Yippee, bug-for-bug compatibility. Everyone likes compatibility problems, don't they? Especially two years and two hardware generations later. Bug-for-bug compatibility is one of the many nightmares for developers coding software for various similar platforms, and this case is an instance of it, with both buggy, incomplete software (see MS, AMD, and nVidia) and buggy hardware (which nVidia puts out a ton of; software patches in drivers before and after release are a reasonably easy fix and can always fix everything, right? Well, except for a poor choice of underfill. Software did, to some degree, fix hardware bugs in the case of the R600 and AA for AMD). Especially considering that the GT200 and G92 series are almost certainly bug-filled monstrosities whose worst bugs are worked around by an ugly black box known as the drivers. The bugs probably only affect performance; the non-working parts of the hardware are probably "optional" for full DirectX compliance.

The works of a certain Alan Turing would seem to indicate that one could emulate a Core i7 with three 5870 cards on a single, standard 8080. With that in mind, however, the frame rate of Batman: AA on one of the first 8080 processors might be best measured in frames per decade. So nearly any hardware problem can be worked around.
There is probably a reason there is no freely available documentation for GeForce cards: the errata list would be embarrassing to read, and properly working software would be nearly impossible to create independently. Having said that, I think most software developers would rather have consistent, well-documented software APIs than spend ten times as much time and money coding directly for the bare hardware, especially with all the warts already in the OS.