a few games use 3.0 and now we have got 4.0, great
http://www.theinquirer.net/?article=25937
(\__/)
(='.'=)
(")_(")
Only Nvidia use 3.0.
6014 3DMk 05

Originally Posted by Errr...me
and it was their main feature over ATI lol, now it's out of date and barely used
(\__/)
(='.'=)
(")_(")
AFAIK there is no SM4; it's just the Inquirer making stuff up. MS said at their GDC talk on the future of DirectX that there wouldn't be any new shader models before Longhorn.
Chances are the new DirectX just has a few bug fixes and maybe some extra stuff for the new cards, but a new revision of shaders seems highly unlikely.
Isn't Longhorn now Vista? And since the beta is out already, the Inquirer may have some *grain* of truth
6014 3DMk 05

Originally Posted by Errr...me
maybe, it wouldn't be a first lol
(\__/)
(='.'=)
(")_(")
I don't know why everyone is acting so surprised. You didn't think, with the pace that gfx card tech improves, that the APIs would freeze as they are? Granted, SM3.0 is only in Nvidia cards, but then it was the same with SM1.0/1.1/1.2 etc - no card had the same bloody version!
DirectX 10 has already been said to use a unified shader model, like the one in the Xbox 360 GPU, so maybe SM4.0 will be based on that premise too.
From my point of view, SM3 is nothing but a stepping stone to the much more powerful shader model being developed for the next generation game/OS
Xenos gives us a glimpse of some of the features that might be available - specifically the idea that shader units are not specifically pixel or vertex...
..but, instead, can switch instantly between the two - with no latency!
The advantage of this 'integration' is that you can allocate your ALUs (Arithmetic Logic Units - the bits of a GPU that do the maths) to whatever job is in the most demand
Tons of vertices to process?... No problem - here is a massive bank of vertex shaders
Masses of pixels to shade?... Also no problem - we're all pixel shaders in here, mate!
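To make that concrete, here's a toy C++ sketch of demand-based allocation - entirely illustrative (the function names and the proportional-split policy are made up, not the real Xenos scheduler, though 48 is the size of Xenos's unified shader array):

#include <algorithm>
#include <cstdio>

// Toy model of a unified shader pool: every ALU can run vertex or pixel
// work, so each pass we re-point the pool at whichever queue is deeper.
// (Illustrative only - not how the real hardware schedules.)
struct FrameLoad {
    long vertices;  // vertex work queued this pass
    long pixels;    // pixel work queued this pass
};

void allocateAlus(FrameLoad load, int totalAlus, int& vertexAlus, int& pixelAlus)
{
    long total = load.vertices + load.pixels;
    if (total == 0) { vertexAlus = pixelAlus = 0; return; }

    // Proportional split, clamped so neither stage starves outright.
    int v = static_cast<int>(totalAlus * load.vertices / total);
    v = std::max(1, std::min(v, totalAlus - 1));
    vertexAlus = v;
    pixelAlus  = totalAlus - v;
}

int main()
{
    int v, p;
    allocateAlus({900000, 100000}, 48, v, p);   // geometry-heavy pass
    std::printf("vertex-heavy: %d vertex / %d pixel ALUs\n", v, p);
    allocateAlus({10000, 2000000}, 48, v, p);   // fill-rate-heavy pass
    std::printf("pixel-heavy:  %d vertex / %d pixel ALUs\n", v, p);
    return 0;
}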
ATI is working very closely with Microsoft on the development of next generation shader technology and - traditionally - ATI has always done very well during industry inflection points (times when the technology changes tremendously in a short space of time)
Good examples of this include DX9 and PCI-Express where we were first by a country mile
What is also interesting this time around is that some of our competitor's top scientists seem to be making public/vocal arguments against integrating shader model hardware...
...2007 will be interesting if we have this 'right' and they don't
.
"X800GT... snap it up while you still can"
HEXUS
......................................August 2005
Originally Posted by Mr Fujisawa

There was no 1.2 (publicly at least), and all cards have always supported the lower shader versions (e.g. any shader 2.0 card also supports 1.1, 1.3, 1.4), so it wasn't a huge deal. You just pick the minimum level you want to support and write for that, or switch on the fly based on the detected level.
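That on-the-fly switching is only a few lines against the D3D9 caps - something like this (a minimal sketch; assumes you already have a created IDirect3DDevice9 and skips error handling):

#include <d3d9.h>

// Pick the highest shader path the card reports (minimal sketch; device
// creation and error handling omitted).
enum ShaderPath { PATH_SM1_1, PATH_SM2_0, PATH_SM3_0 };

ShaderPath pickShaderPath(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    // Cards expose every lower version too, so just walk down from the
    // top until we hit the minimum level we chose to write for.
    if (caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0) &&
        caps.VertexShaderVersion >= D3DVS_VERSION(3, 0))
        return PATH_SM3_0;
    if (caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0))
        return PATH_SM2_0;
    return PATH_SM1_1;
}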
I think Nvidia are just making noise in the wrong direction so people will buy their latest-gen cards.
There's no way they're gonna let ATI get out in the open with the new unified-architecture parts if they benefit performance.
I mean, on the one hand Nvidia's head scientist said this:

Originally Posted by Nvidia

Then he goes on to say this:

Originally Posted by Nvidia

So he's hedging his bets all ways, I think. ATI are working with MS on this one, but you can be sure Nvidia will be playing catch-up real quick if it proves a good move.
Having said that, I do hope ATI comes back at Nvidia with some big hitters, so it bucks big green's ideas up after them strolling around loving their latest gen
I doubt you'll see SM4.0 for a while, if at all. MS said DX9 was its last outing under the DirectX name; they're moving to XDA, which is basically the next step on from DirectX
Steam: (Grey_Mata) || Hexus Trust
As far as I know, the only game with shader model 3.0 is Pacific Fighters...
Good luck thinking of another game that uses shader model 3.0
Far Cry..
Oh yeah... but you see my point, we're hardly being spoiled for choice. That said, I couldn't care less, as I only have a feeble 9700 Pro...
Until next year, that is, when I'll have my brand spanking new PC...
The concept of SM3 is that it allows 'really cool stuff' like texture fetches in the vertex engine and branching in the pixel shader...
...wouldn't it be funny if developers were steered away from using those features because existing implementations were just too slow?
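For anyone who hasn't seen them, here's roughly what those two features look like - a couple of hypothetical HLSL shaders (made up for illustration, not from any game) held as strings and compiled through D3DX:

#include <d3dx9.h>
#include <cstring>

// vs_3_0: a texture fetch inside the vertex shader (not allowed in vs_2_0).
static const char kVs30[] =
    "sampler2D heightMap : register(s0);\n"
    "float4x4  worldViewProj : register(c0);\n"
    "void main(float4 pos : POSITION, float2 uv : TEXCOORD0,\n"
    "          out float4 oPos : POSITION, out float2 oUv : TEXCOORD0)\n"
    "{\n"
    "    // Displace the vertex using a height map - vertex texture fetch.\n"
    "    pos.y += tex2Dlod(heightMap, float4(uv, 0, 0)).r;\n"
    "    oPos = mul(pos, worldViewProj);\n"
    "    oUv  = uv;\n"
    "}\n";

// ps_3_0: a genuine dynamic branch - unlit texels skip the shading work.
static const char kPs30[] =
    "sampler2D diffuse : register(s0);\n"
    "float4 main(float2 uv : TEXCOORD0, float3 n : TEXCOORD1,\n"
    "            float3 l : TEXCOORD2) : COLOR\n"
    "{\n"
    "    float nDotL = saturate(dot(normalize(n), normalize(l)));\n"
    "    if (nDotL <= 0.0f)        // real branch, not a both-sides lerp\n"
    "        return float4(0, 0, 0, 1);\n"
    "    return tex2D(diffuse, uv) * nDotL;\n"
    "}\n";

// Compile one of them; returns NULL (with errors logged) on failure.
ID3DXBuffer* compile(const char* src, const char* profile)
{
    ID3DXBuffer *code = 0, *errors = 0;
    HRESULT hr = D3DXCompileShader(src, (UINT)std::strlen(src), 0, 0,
                                   "main", profile, 0, &code, &errors, 0);
    if (FAILED(hr) && errors)
        OutputDebugStringA((const char*)errors->GetBufferPointer());
    return SUCCEEDED(hr) ? code : 0;
}

The SM2 fallback would be the same pixel shader with the branch flattened to run both sides - which is exactly the cost/benefit trade-off the reviews gloss over when they force SM2 mode.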
Also, is it me, or is it weird that some companies get 'cookie points' in reviews for having SM3...
...but then those same sites test with SM3 = off in order to benchmark 'more fairly'
I am thinking about things like The Chronicles of Riddick
The 'benefits' of SM3 are included when marking up features...
"...and you have to bear in mind that the product supports SM3 !"
But when the testing is done - SM2 mode is forced - instead of letting each card run the shader model that it was designed for
Test everything with its 'natural' shader model + give marks for having features
or
Test everything at SM2 + ignore SM3 altogether
Mix 'n' match just seems daft - like the worst of both worlds
Extra marks without having to show the feature running live
.
"X800GT... snap it up while you still can"
HEXUS
......................................August 2005
Far Cry runs very fast with SM3 enabled and looks very nice. Love the HDR lighting - looks so spangly