
Thread: shader model 4.0

  1. #1
    lazy student nvening

    shader model 4.0

    a few games use 3.0 and now we have got 4.0, great

    http://www.theinquirer.net/?article=25937
    (\__/)
    (='.'=)
    (")_(")

  2. #2
    Hexus.net Troll Dougal
    Only Nvidia use 3.0.
    Quote Originally Posted by Errr...me
    I MSN offline people
    6014 3DMk 05

  3. #3
    lazy student nvening
    and it was their main feature over ATI lol, now it's out of date and barely used
    (\__/)
    (='.'=)
    (")_(")

  4. #4
    Senior Member Butcher
    AFAIK there is no SM4, it's just the Inquirer making stuff up. MS said there wouldn't be any new shader models before Longhorn in their GDC talk on the future of DirectX.
    Chances are the new DirectX just has a few bug fixes and maybe some extra stuff for the new cards, but a new revision of shaders seems highly unlikely.

  5. #5
    Hexus.net Troll Dougal
    Isn't Longhorn now Vista? And since the beta is out already, the Inquirer may have some *grain* of truth
    Quote Originally Posted by Errr...me
    I MSN offline people
    6014 3DMk 05

  6. #6
    lazy student nvening
    maybe, wouldn't be a first lol
    (\__/)
    (='.'=)
    (")_(")

  7. #7
    Registered User
    I don't know why everyone is acting so surprised - you didn't think, with the pace that gfx card tech improves, that the APIs would freeze as they are? Granted, SM3.0 is only in Nvidia cards, but then it was the same with SM1.0/1.1/1.2 etc - no card had the same bloody version!

    DirectX 10 has already been said to use a universal shader model, like we have in the Xbox 360 GPU, so maybe SM4.0 will be based on that premise too.

  8. #8
    ATI Technologies exAndrzej
    From my point of view, SM3 is nothing but a stepping stone to the much more powerful shader model being developed for the next generation game/OS

    Xenos gives us a glimpse of some of the features that might be available - specifically the idea that shader units are not specifically pixel or vertex...

    ..but, instead, can switch instantly between the two - with no latency !

    The advantage of this 'integration' is that you can allocate your ALUs (Arithmetic Logic Units - the bits of a GPU that do the maths) to do the job that is in the most demand

    Tons of vertices to process ?... No problem - here is a massive bank of vertex shaders

    Masses of pixels to shade ?... Also no problem - we're all pixel shaders in here mate !
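
    (Purely to illustrate the idea - this is a made-up, CPU-side toy, not how Xenos or any real scheduler actually works - the allocation amounts to something like this:)

    Code:
    // Toy model of a unified shader pool: each frame, split a fixed number of
    // ALUs between vertex and pixel work in proportion to the pending workload.
    // All names and numbers here are invented purely for illustration.
    #include <cstdio>

    struct Workload {
        int pendingVertices;
        int pendingPixels;
    };

    void allocateAlus(const Workload& w, int totalAlus, int& vertexAlus, int& pixelAlus) {
        int total = w.pendingVertices + w.pendingPixels;
        if (total == 0) { vertexAlus = 0; pixelAlus = 0; return; }
        vertexAlus = totalAlus * w.pendingVertices / total; // proportional split
        pixelAlus  = totalAlus - vertexAlus;                // the rest shade pixels
    }

    int main() {
        const int totalAlus = 48; // pool size picked arbitrarily
        Workload geometryHeavy = { 900000, 100000 };
        Workload fillHeavy     = {  50000, 950000 };

        int v = 0, p = 0;
        allocateAlus(geometryHeavy, totalAlus, v, p);
        std::printf("geometry-heavy frame: %d vertex ALUs, %d pixel ALUs\n", v, p);
        allocateAlus(fillHeavy, totalAlus, v, p);
        std::printf("fill-heavy frame:     %d vertex ALUs, %d pixel ALUs\n", v, p);
        return 0;
    }

    A fixed vertex/pixel split, by contrast, has to be chosen up front and just hopes the frame cooperates.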

    ATI is working very closely with Microsoft on the development of next generation shader technology and - traditionally - ATI has always done very well during industry inflection points (times when the technology changes tremendously in a short space of time)

    Good examples of this include DX9 and PCI-Express where we were first by a country mile

    What is also interesting this time around is that some of our competitor's top scientists seem to be making public/vocal arguments against integrating shader model hardware...

    ...2007 will be interesting if we have this 'right' and they don't
    .
    "X800GT... snap it up while you still can"
    HEXUS
    ......................................August 2005

  9. #9
    Senior Member Butcher
    Quote Originally Posted by Mr Fujisawa
    I don't know why everyone is acting so surprised - you didn't think, with the pace that gfx card tech improves, that the APIs would freeze as they are? Granted, SM3.0 is only in Nvidia cards, but then it was the same with SM1.0/1.1/1.2 etc - no card had the same bloody version!
    There was no 1.2 (publicly at least), and all cards have always supported lower versions of shaders (e.g. any shader 2.0 card also supports 1.1, 1.3 and 1.4), so it wasn't a huge deal. You just pick the minimum level you want to support and write for that, or switch on the fly between detected levels.
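
    Roughly how that detection goes in Direct3D 9 - an untested sketch (assumes the DX9 SDK headers and d3d9.lib; the path names are made up), not anyone's actual engine code:

    Code:
    // Query the card's shader caps once at startup, then pick the highest
    // shader path it supports and fall back from there.
    #include <windows.h>
    #include <d3d9.h>
    #include <cstdio>

    int main() {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        D3DCAPS9 caps;
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
            d3d->Release();
            return 1;
        }

        // Hypothetical renderer paths - use the best one the card reports.
        const char* path = "fixed-function path";
        if      (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0)) path = "SM3.0 path";
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) path = "SM2.0 path";
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1)) path = "SM1.x path";

        std::printf("Using %s\n", path);
        d3d->Release();
        return 0;
    }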

  10. #10
    not posting kempez
    I think Nvidia are just making noise in the wrong direction so people will buy their latest-gen cards.

    There's no way they're gonna let ATI get out in the open with the new unified architecture parts if they benefit performance.

    I mean, on the one hand the head scientist at Nvidia said this:

    Quote Originally Posted by Nvidia
    It's far harder to design a unified processor - it has to do, by design, twice as much. Another word for 'unified' is 'shared', and another word for 'shared' is 'competing'. It's a challenge to create a chip that does load balancing and performance prediction. It's extremely important, especially in a console architecture, for the performance to be predictable. With all that balancing, it's difficult to make the performance predictable. I've even heard that some developers dislike the unified pipe, and will be handling vertex pipeline calculations on the Xbox 360's triple-core CPU.
    Then he goes on to say this:

    Quote Originally Posted by Nvidia
    We will do a unified architecture in hardware when it makes sense. When it's possible to make the hardware work faster unified, then of course we will. It will be easier to build in the future, but for the meantime, there's plenty of mileage left in this architecture
    So hedging his bets all ways, I think. ATI are working with MS on this one, but you can be sure Nvidia will be playing catch-up real quick if it proves a good move.

    Having said that, I do hope ATI comes back at Nvidia with some big hitters, so it bucks big green's ideas up after they've been strolling around loving their latest gen
    Check my project <<| Black3D |>>
    Quote Originally Posted by hexah
    Games are developed by teams of talented people and sometimes electronic arts

  11. #11
    Real Ultimate Power! Grey M@a
    I doubt you will see SM4.0 for a while, if at all. As MS said, DX9 was its last outing under the DirectX name; they are moving to XDA, which is basically the next step from DirectX

  12. #12
    I shall never tire... BEANFro Elite
    As far as I know, the only game with shader model 3.0 is Pacific Fighters...

    Good luck thinking of another game that uses shader model 3.0

  13. #13
    OMG!! PWND!!
    Far Cry..

  14. #14
    I shall never tire... BEANFro Elite
    Oh yeah... but you see my point, we're hardly being spoiled for choice. That said, I couldn't care less as I only have a feeble 9700 Pro...

    Until next year, that is, when I'll have my brand spanking new PC...

  15. #15
    ATI Technologies exAndrzej
    The concept of SM3 is that it allows 'really cool stuff' like texture fetches in the vertex engine and branching in the pixel shader...

    ...wouldn't it be funny if developers were steered away from using those features because existing implementations were just too slow?
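
    (To make that concrete - a made-up, CPU-side analogue of the dynamic branch, not real ps_3_0 code: the per-pixel early-out only saves anything if taking the branch is actually cheap, which is exactly the worry above:)

    Code:
    // CPU-side analogue of a ps_3_0 dynamic branch: skip the expensive lighting
    // term for pixels the light cannot reach. If the hardware makes the branch
    // itself costly, the saving largely disappears. Everything here is invented
    // purely for illustration.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct Pixel { float x, y; };

    float expensiveLighting(const Pixel& p) {
        float sum = 0.0f;
        for (int i = 0; i < 64; ++i)              // stand-in for costly shading maths
            sum += std::sin(p.x * i) * std::cos(p.y * i);
        return sum;
    }

    int main() {
        const float lightX = 0.0f, lightY = 0.0f, radius = 10.0f;
        std::vector<Pixel> pixels;
        for (int i = 0; i < 100; ++i)
            pixels.push_back({ float(i), float(i) });

        float total = 0.0f;
        int shaded = 0;
        for (const Pixel& p : pixels) {
            float dx = p.x - lightX, dy = p.y - lightY;
            if (dx * dx + dy * dy > radius * radius) // the "dynamic branch"
                continue;                            // outside the light: skip the work
            total += expensiveLighting(p);
            ++shaded;
        }
        std::printf("shaded %d of %d pixels, total = %f\n",
                    shaded, static_cast<int>(pixels.size()), total);
        return 0;
    }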


    Also, is it me, or is it weird that some companies get 'cookie points' in reviews for having SM3...

    ...but then those same sites test with SM3 = off in order to benchmark 'more fairly'

    I am thinking about things like The Chronicles of Riddick

    The 'benefits' of SM3 are included when marking up features...
    "...and you have to bear in mind that the product supports SM3 !"

    But when the testing is done - SM2 mode is forced - instead of letting each card run the shader model that it was designed for

    Test everything with its 'natural' shader model + give marks for having features

    or

    Test everything at SM2 + ignore SM3 altogether


    Mix 'n' match just seems daft - like the worst of both worlds

    Extra marks without having to show the feature running live
    .
    "X800GT... snap it up while you still can"
    HEXUS
    ......................................August 2005

  16. #16
    not posting kempez
    Far Cry runs very fast with SM3 enabled and looks very nice. Love the HDR lighting - looks so spangly
    Check my project <<| Black3D |>>
    Quote Originally Posted by hexah
    Games are developed by teams of talented people and sometimes electronic arts
