I've got an 8800GT 512MB, and I've seen a few 8xxx series cards going quite cheap recently (84xx/85xx?). I wondered if they're worth it as a dedicated PhysX card, as I've seen others mention having one. Is this even worthwhile? Thoughts?
Possibly; it depends on whether you really play PhysX games.
Kalniel: "Nice review Tarinder - would it be possible to get a picture of the case when the components are installed (with the side off obviously)?"
CAT-THE-FIFTH: "The Antec 300 is a case which has an understated and clean appearance which many people like. Not everyone is into e-peen looking computers which look like a cross between the imagination of a hyperactive 10 year old and a Frog."
TKPeters: "Off to AVForum better Deal - £20+Vat for Free Shipping @ Scan"
for all intents and purposes it seems to be the same card minus some guy's name on it and a shielded cover, with OEM added to it - GoNz0.
Might be worth looking at 8xxx series cards that can take their power from the PCI-E bus...
Fewer cables.
Games supporting PhysX
My Blog => http://adriank.org
Like others say, if you don't even play a PhysX game then there's no point.
Personally, I've never actually seen the need for PhysX, nor have I wished for or wanted it.
For a more complete list of PhysX games: http://en.wikipedia.org/wiki/PhysX
And it does depend on the game. Try playing Warmonger without a PhysX card and it's unplayable, although, saying that, it was never a great game even with PhysX.
Most games that use PhysX can be played fine on an 8800GT 512MB; the impact of the additional PhysX load is rather minor in most.
It's only games that are heavily PhysX-loaded that will start to suffer.
Pob's new mod, Soviet Pob Propaganda style Laptop.
"Are you suggesting that I can't punch an entire dimension into submission?" - Flying squirrel - The Red Panda Adventures
Sorry, the Photobucket links are broken.
Works great with 3DMark Vantage, but otherwise there's not much point ATM.
It's nice, but I wouldn't pay extra for it. As mentioned before, it depends on whether the games you play take advantage of it.
Tbh PhysX isn't even that good. I was playing Mirror's Edge with my GTX260 (PhysX enabled) and it added a few extra cool effects, but it lagged at like 2fps. I realise that a dedicated card would have reasonable performance, but tbh the effects weren't that good and deffo not worth the cost, for me at least.
^ That doesn't sound right. Whether you appreciate the extra effects is entirely up to you, but the game shouldn't be lagging at 2fps on a GTX260 assuming even a modest rig.
http://www.pcgameshardware.com/aid,6...eviews/?page=3
http://www.pcgameshardware.com/aid,6...-Physx/Reviews (the only time when performance is that badly hit is when you use the CPU for PhysX)
I just did an experiment with an 8600GTS as a PhysX card alongside my Radeon HD4850. The first difficulty I ran into was that the 8600GTS just didn't work if installed as a secondary GPU. It wasn't even detected by the BIOS.
Eventually got it working with my cards swapped round, at 8x/8x PCI-E. I used version 1.02 of the ATI hack patch (without it you can't enable PhysX in the Nvidia control panel). It was kinda nice in Batman, with soft fabrics and paper getting thrown about fairly realistically.
Unfortunately my soundcard (X-Fi Prelude) didn't like having my main GPU installed next to it. After 20 minutes or so of gaming I would get sound corruption and then the PC would turn off, which I put down to the soundcard overheating due to being in such close proximity to a hot graphics card. I've taken the Nvidia card out now and moved the ATI card back to the first slot, and all is fine. I suppose it could be the PSU not liking the extra load, but I prefer the heat theory.
I may have another crack soon with a low power, single slot nvidia card in the 2nd pci-e slot.
To be honest, I think PhysX isn't really doing much on the GPU in games that couldn't be done on the CPU (a quad, maybe) with better programming. Running PhysX on the CPU hardly increases CPU utilisation at all: the framerate crawls while the CPU sits there mostly idle. It just smacks of a lack of optimisation really.
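To illustrate that point (a toy sketch of the general idea, not how PhysX is actually implemented internally): a basic particle update is embarrassingly parallel, so an engine that pins its physics step to one thread will leave most of a quad core idle while the framerate suffers. Splitting the work into chunks, one per worker, is the obvious fix:

```python
# Toy example: integrating particle positions is embarrassingly parallel,
# so a physics step can be split across all CPU cores instead of one.
# Sketch only - real engines would use native threads and SIMD, not Python.
from concurrent.futures import ThreadPoolExecutor

GRAVITY = -9.81
DT = 1.0 / 60.0  # one frame at 60 fps

def step_chunk(particles):
    """Advance one chunk of (pos, vel) particles by a single timestep."""
    out = []
    for pos, vel in particles:
        vel = vel + GRAVITY * DT  # apply gravity to velocity
        pos = pos + vel * DT      # integrate position
        out.append((pos, vel))
    return out

def step_parallel(particles, workers=4):
    """Split the particle list into chunks and update them concurrently."""
    n = max(1, len(particles) // workers)
    chunks = [particles[i:i + n] for i in range(0, len(particles), n)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(step_chunk, chunks)  # preserves chunk order
    return [p for chunk in results for p in chunk]
```

Because the chunks are independent, the parallel step produces exactly the same result as doing it all on one thread; the only difference is that all cores get used.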
I've been playing through Ghostbusters recently, and the CPU-based physics of the Infernal Engine is really not far off what GPU PhysX does in Batman. If anything I'd say the particle effects in Ghostbusters are better; it's just fabrics, paper etc. that I think Batman does better. Plus Ghostbusters' framerate is solid on my unlocked AMD quad core, while Batman's is all over the place unless you're running high-end Nvidia GPU(s). The sooner we lose proprietary physics and move towards more open initiatives the better, IMO.
PhysX is about as much use as a chocolate teapot...
ahhhh... the sound of silence since the resident nVidia fanboy left
OK, that's a harsh comment about PhysX, but I think the concept of the dedicated PPU has gone west. That said, it's all in the eye of the beholder. If you play PhysX-accelerated games, think the extra loading is acceptable and enjoy the additional eye candy, that's fine, I guess.
My vote would be for MS to integrate physics modelling into DirectX (or similar) and standardise it. Much better all round, and it would stop all this proprietary hardware nonsense.
Corsair Air 540, Asus Prime X570-Pro, Win 10 Pro, AMD R9 3900X, Corsair HX 750, EVGA 1080 Ti, 2x Corsair 2TB MP600, 2x 2TB WD20EZRX, 4x8GB Corsair Dominator, custom watercooled (single loop, 2 rads)
Corsair 550D, Asus X470-Prime Pro, Win 10 Pro, AMD R7 2700, Corsair RM750i, Asus GTX780 Poseidon, 2x Sammy 500GB 970 EVO, 2x 2TB Seagate Barracuda, 2x8GB Corsair Vengeance, custom watercooled (single loop, 2 rads)
Synology DS918+ w/ 2xWDC Green 3TB + 2x Seagate Barracuda 6TB, N2200 w/ 2xSammy 1.5TB
backup:
Corsair 500R, Gigabyte GA-Z97MX Gaming 5, Win 10 Pro, i5 4690, Corsair HX750i, Sapphire Fury X, 256GB Sammy SM951 M.2 (System), WD SE16 640GB, 2x8GB Corsair Vengeance, Corsair H100i
You still have the problem of processing requirements. Regardless of where or how it's being done it has a cost, and if that cost is greater than the "spare" CPU/GPU cycles that are available, you degrade something.
We will have to see the implementations, but at least for now I think the "2nd GPU" is almost a necessity, especially considering how much the latest PhysX games seem to be hammering the PhysX card (both Batman and Dark Void need a dedicated 9800GTX+ to turn PhysX up to full).
Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive
I was hoping by now the integrated graphics on motherboards would be up to this.
I now own two motherboards with 128MB sideport RAM chips on them, so that isn't a problem. The Nvidia "ION" is 16 shaders, half what you get on an 8600GT, so not quite enough. I was disappointed to see the latest AMD chipset didn't move forward in GPU power at all.
Oh well, maybe next year...
Next year the GPU will be integrated into the CPU, surely! I think we've seen the highest spec AMD are willing to go to on their IGPs already, tbh, and NVidia are being shut out by Intel, so IGPs may be a thing of the past, with it all coming down to what happens with the on-die GPUs / APUs. Of course, with good implementation and well written, open libraries, future APUs should be brilliant for processing physics effects...
Hmm, using on-cpu graphics to compute physics would eat memory bandwidth that I want for the CPU.
On-chipset GPU with its own sideport RAM seems ideal for physics if it had just a bit more balls.
Does anyone here intend buying a CPU with integrated graphics? Makes sense for laptops and possibly low end desktops, but I personally don't want one. Waste of die area as far as I am concerned, I would rather have more cache/cores in that silicon. So for desktop users I am sure they still have conventional processors on the roadmap for some time.