Thread: GeForce 8 series cards to get PhysX upgrade

  1. #1
    HEXUS.admin
    Join Date
    Apr 2005
    Posts
    31,709
    Thanks
    0
    Thanked
    2,073 times in 719 posts

    GeForce 8 series cards to get PhysX upgrade

    Good news for owners of NVIDIA's GeForce 8 series graphics cards. Following the recent acquisition of Ageia, NVIDIA's CEO has let it be known that PhysX technology will be added to existing GeForce cards via a simple software update.
    Read more.

  2. #2
    Mostly Me Lucio's Avatar
    Join Date
    Mar 2007
    Location
    Tring
    Posts
    5,163
    Thanks
    443
    Thanked
    445 times in 348 posts
    • Lucio's system
      • Motherboard:
      • Gigabyte GA-970A-UD3P
      • CPU:
      • AMD FX-6350 with Cooler Master Seldon 240
      • Memory:
      • 2x4GB Corsair DDR3 Vengeance
      • Storage:
• 128GB Toshiba, 2.5" SSD, 1TB WD Blue WD10EZEX, 500GB Seagate Barracuda 7200.11
      • Graphics card(s):
      • Sapphire R9 270X 4GB
      • PSU:
      • 600W Silverstone Strider SST-ST60F
      • Case:
      • Cooler Master HAF XB
      • Operating System:
      • Windows 8.1 64Bit
      • Monitor(s):
      • Samsung 2032BW, 1680 x 1050
      • Internet:
      • 16Mb Plusnet

    Re: GeForce 8 series cards to get PhysX upgrade

Could be handy, but won't it impact the frame rate if part of the card is busy doing PhysX calculations, and lead to worse, rather than better, performance on nVidia cards?

    Or is this just a ploy to tap into the unused potential of SLI to boost the sales of their motherboards and cards...

    (\___/) (\___/) (\___/) (\___/) (\___/) (\___/) (\___/)
    (='.'=) (='.'=) (='.'=) (='.'=) (='.'=) (='.'=) (='.'=)
    (")_(") (")_(") (")_(") (")_(") (")_(") (")_(") (")_(")


    This is bunny and friends. He is fed up waiting for everyone to help him out, and decided to help himself instead!

  3. #3
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,039
    Thanks
    1,881
    Thanked
    3,379 times in 2,716 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: GeForce 8 series cards to get PhysX upgrade

    Yes, to both of those I think Lucio.

We'll have to see which a game uses most - CPU cores or GPU power. If GPU (like most games at the moment) then doing physics on unused CPU cores seems a better way of doing things (and Intel will be pushing this with its acquisition of Havok).

  4. #4
    Goron goron Kumagoro's Avatar
    Join Date
    Mar 2004
    Posts
    3,154
    Thanks
    38
    Thanked
    172 times in 140 posts

    Re: GeForce 8 series cards to get PhysX upgrade

I have never liked the idea of giving up graphics power for physics. I was under the impression that extra physics puts your GPU under more pressure from all the extra bits and pieces it needs to render etc.

How much of a trade-off is there between running a dedicated PPU chip and having it emulated by the GPU? If there is a significant difference then I would prefer the PPU integrated onto the GPU die. That way the power of the PPU could be matched with that of the GPU.

In any case, physics will only be properly viable when it's a part of DirectX.

  5. #5
    Senior Member this_is_gav's Avatar
    Join Date
    Dec 2005
    Posts
    4,854
    Thanks
    175
    Thanked
    254 times in 216 posts

    Re: GeForce 8 series cards to get PhysX upgrade

    Quote Originally Posted by Kumagoro View Post
    How much of a trade off is there between running a dedicated PPU chip compared to it being emulated by the GPU. If there is a significant difference then I would prefer the PPU integrated onto the GPU die. That way the power of the PPU could be matched with that of the GPU.
When this was announced I was expecting that to be the case too. I certainly wasn't expecting it to just be a software thing. If that's the case, what little impression of PPUs I had has been lowered further - if a GPU can do it alongside its normal graphics processing then it can't be taking that much of a hit, so did the PPU do much at all? Was it just a marketing gimmick for what was primarily a software layer?

  6. #6
    HEXUS.timelord. Zak33's Avatar
    Join Date
    Jul 2003
    Location
    I'm a Jessie
    Posts
    35,185
    Thanks
    3,126
    Thanked
    3,179 times in 1,926 posts
    • Zak33's system
      • Storage:
      • Kingston HyperX SSD, Hitachi 1Tb
      • Graphics card(s):
      • Nvidia 1050
      • PSU:
      • Coolermaster 800w
      • Case:
      • Silverstone Fortress FT01
      • Operating System:
      • Win10
      • Internet:
      • Zen FTC uber speedy

    Re: GeForce 8 series cards to get PhysX upgrade

    doesn't this make a mockery of PhysX?

    I mean....they told us that it ADDED to the gaming experience by doing more...now it turns out the card does it anyway.

    I'm baffled.

    Quote Originally Posted by Advice Trinity by Knoxville
    "The second you aren't paying attention to the tool you're using, it will take your fingers from you. It does not know sympathy." |
    "If you don't gaffer it, it will gaffer you" | "Belt and braces"

  7. #7
    Splash
    Guest

    Re: GeForce 8 series cards to get PhysX upgrade

    Quote Originally Posted by Zak33 View Post
    now it turns out the card does it anyway.

    I'm baffled.
The card *will be able to* do it with what I'm guessing is a firmware upgrade. I'm guessing the plan is to use it as a way of selling SLI to those who haven't bothered (like me - but until I see some concrete figures I'm not biting).

  8. #8
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,039
    Thanks
    1,881
    Thanked
    3,379 times in 2,716 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: GeForce 8 series cards to get PhysX upgrade

    Quote Originally Posted by Zak33 View Post
    doesn't this make a mockery of PhysX?

    I mean....they told us that it ADDED to the gaming experience by doing more...now it turns out the card does it anyway.

    I'm baffled.
The key thing is they've ported PhysX to CUDA, which is nVidia's way of leveraging the GPU's power. Ageia said all along that graphics cards were somewhat suited to these calculations (more so than an x86 CPU, say), just not as suited as a dedicated unit. That'll still be the line, I'd guess.

If (haha) ATi can implement CUDA on their GPUs then it's win-win. But more likely we'll see it as part of DX12 or whatever, when Microsoft wakes up and decides to knock other companies' proprietary systems out of the way again.
