I've got an 8800GT 512MB, and I've seen a few 8xxx series cards quite cheap recently (84xx/85xx?). Wondered if they're worth it for dedicated PhysX, as I've seen others mention having one. Is this even worthwhile? Thoughts?
Possibly; depends whether you really play PhysX games.
Might be worth looking at 8xxx series cards that can take their power from the PCI-E bus...
Less cables.
Games supporting PhysX
TBH, after moving from a system with a dedicated PhysX card + ATI graphics to one with no PhysX card & ATI graphics, I've noticed no difference at all in gaming performance in any game. My old PCI Ageia card is now just gathering dust.
Like others say, if you don't even play a PhysX game then there's no point.
Personally, I've never actually seen the need for PhysX, nor wished for or wanted it either.
For a more complete list of PhysX games: http://en.wikipedia.org/wiki/PhysX
And it does depend on the game. Try playing Warmonger without a PhysX card and it's unplayable, although saying that, it was never a great game even with PhysX.
Most games that use PhysX can be played fine on an 8800GT 512MB; the impact of the additional PhysX load is rather minor in most.
It's only games that are heavily PhysX-loaded that will start to suffer.
Works great with 3DMark Vantage, but otherwise not much point ATM.
It's nice, but I wouldn't pay extra for it. As mentioned before, it depends whether the games you play take advantage of it.
TBH PhysX isn't even that good. I was playing Mirror's Edge with my GTX260 (PhysX enabled) and it added a few extra cool effects, but it lagged at like 2fps. I realise that a dedicated card would have reasonable performance, but TBH the effects weren't that good and deffo not worth the cost, for me at least.
^ That doesn't sound right. Whether you appreciate the extra effects is entirely up to you, but the game shouldn't be lagging at 2fps on a GTX260 assuming even a modest rig.
http://www.pcgameshardware.com/aid,6...eviews/?page=3
http://www.pcgameshardware.com/aid,6...-Physx/Reviews (the only time when performance is that badly hit is when you use the CPU for PhysX)
I just did an experiment with an 8600GTS as a PhysX card alongside my Radeon HD4850. The first difficulty I ran into was that the 8600GTS just didn't work if installed as a secondary GPU. It wasn't even detected by the BIOS.
Eventually got it working with my cards swapped round, running at 8x/8x PCI-E. Used version 1.02 of the ATI hack patch (without it you can't enable PhysX in the Nvidia control panel). It was kinda nice in Batman, with soft fabrics and paper getting thrown about fairly realistically.
Unfortunately my soundcard (X-Fi Prelude) didn't like having my main GPU installed next to it. After 20 minutes or so of gaming I would get sound corruption and then the PC would turn off, which I put down to the soundcard overheating due to being in such close proximity to a hot graphics card. I've taken out the Nvidia card now and moved the ATI card back to the first slot, and all is fine. Suppose it could be the PSU not liking the extra load, but I prefer the heat theory.
I may have another crack soon with a low-power, single-slot Nvidia card in the second PCI-E slot.
To be honest, I don't think PhysX is really doing much on the GPU in games that couldn't be done on the CPU (a quad, maybe) with better programming. Running PhysX on the CPU hardly increases CPU utilisation at all. The framerate crawls while the CPU sits there mostly idle. Just smacks of a lack of optimisation, really.
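That "framerate crawls while the CPU sits mostly idle" observation is exactly what a single-threaded physics path would look like: one saturated core on a quad only shows up as ~25% overall CPU utilisation. A toy calculation of my own to illustrate the point (not from any PhysX documentation):

```python
def overall_utilisation(busy_cores, total_cores):
    """Overall CPU usage (%) reported when busy_cores are fully loaded."""
    return 100.0 * busy_cores / total_cores

# Physics confined to one thread on a quad core: looks "mostly idle".
print(overall_utilisation(1, 4))  # -> 25.0

# A properly multithreaded physics engine could use all four cores.
print(overall_utilisation(4, 4))  # -> 100.0
```

So low utilisation in Task Manager doesn't mean the CPU couldn't handle the work, just that the code isn't spreading it across cores.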
I've been playing through Ghostbusters recently, and the CPU-based physics of the Infernal Engine is really not far off what GPU PhysX does in Batman. If anything I'd say the particle effects in Ghostbusters are better. It's just fabrics and paper etc. where I think Batman is better. Plus Ghostbusters' framerate is solid on my unlocked AMD quad core, while Batman's is all over the place unless you're running high-end Nvidia GPU(s). The sooner we lose proprietary physics and move towards more open initiatives, the better IMO.
PhysX is about as much use as a chocolate teapot...
ahhhh... the sound of silence since the resident nVidia fanboy left :)
OK, that's a harsh comment about PhysX, but I think the concept of the dedicated PPU has gone west. That said, it is all in the eye of the beholder. If you play PhysX-accelerated games, think the extra loading is acceptable and enjoy the additional eye candy, that's fine I guess.
My vote would be for MS to integrate physics modelling into DirectX (or similar) and standardise it. Much better all around, and it would stop all this proprietary hardware nonsense.
You still have the problem of processing requirements. Regardless of where or how it's being done, it has a cost; if that cost is greater than the "spare" CPU/GPU cycles that are available, you degrade something.
We will have to see the implementations, but at least for now I think the "2nd GPU" is almost a necessity, especially considering how much the latest PhysX games seem to hammer the PhysX card (both Batman and Dark Void need a dedicated 9800GTX+ to turn PhysX up to full).
I was hoping by now the integrated graphics on motherboards would be up to this.
I now own two motherboards with 128MB of sideport RAM on them, so that isn't a problem. The Nvidia "ION" has 16 shaders, half what you get on an 8600GT, so not quite enough. I was disappointed to see the latest AMD chipset didn't move forward in GPU power at all :(
Oh well, maybe next year...
Next year the GPU will be integrated into the CPU, surely! I think we've seen the highest spec AMD are willing to go to on their IGPs already, tbh, and NVidia are being shut out by Intel, so IGPs may be a thing of the past, with it all coming down to what happens with the on-die GPUs / APUs. Of course, with good implementation and well written, open libraries, future APUs should be brilliant for processing physics effects...
Hmm, using on-CPU graphics to compute physics would eat memory bandwidth that I want for the CPU.
On-chipset GPU with its own sideport RAM seems ideal for physics if it had just a bit more balls.
Does anyone here intend to buy a CPU with integrated graphics? Makes sense for laptops and possibly low-end desktops, but I personally don't want one. Waste of die area as far as I am concerned; I would rather have more cache/cores in that silicon. So for desktop users I am sure they still have conventional processors on the roadmap for some time.