Have you guys seen this?
The Ghost recon explosion rocks..
And Cell Factor is sooooo promising..
Bet on Soldiers.. nahh..
Linky here: http://physx.ageia.com/footage.html
Looks very very sweet
Damnit I wasn't gonna buy any PC stuff :mad:
Whoa that looks too good, I really want one now :(
Doesn't look great, but hey, everybody loves a flamethrower!
Quote:
Originally Posted by sawyen
The explosion for Ghost Recon is legend...
And that phy-power thingy, plus the gigantic physics with barrels and other debris in CellFactor, is just too cool to be true...
And check out the alpha blending... top of its class..
That's done on the GPU though...
Quote:
Originally Posted by sawyen
Will be interesting to see how HavokFX compares (and whatever ATI will call theirs). I may go SLI after all....
Will be interesting if ATI works with Ageia.. cross computation across PCI Express.. works just like SLI, but this time you're not bound to a specific GPU when you upgrade..
I have heard about this for a while and not been that interested, but after seeing that Ghost Recon footage I want one!!!
What if Ageia comes up with a multi-PPU solution? A dual-PPU board with 512MB DDR3.. you could even simulate facial hair and ear goo with that amount of PPU power...
ROFLMAO!
Quote:
Originally Posted by sawyen
Yummy, BFG have one priced at $300, quite a fair bit of money to be honest.
What is the compatibility like?
Cellfactor looks sweet.
If take-up of the engine by game designers is good, could we see this as a way of enabling "SLI-like performance enhancements" on non-SLI systems?
I remember buying a Voodoo2 to pair with my Matrox Millennium many moons ago, when 2D/3D combo cards were pretty naff. Are we going to see the same with independent physics cards, if the big 2 start integrating PPU capabilities into their GPUs?
Seems Havok is the green-eyed monster here.. teaming up with Nvidia and such, but having SLI drive the physics is slightly impractical if you think it through..
Say you have a 7800GT now.. in order to run physics over SLI, you'll need another 7800GT..
After a year or so, graphics demands go up.. perhaps physics demands didn't, but you're stuck with 2x 7800GT.. while others are using an 8800 Ultra or N800 GTX etc etc with a good old PhysX.. In order to get the physics working again with better graphics IQ, you'd have to sell both 7800GTs.. then invest the money in TWO high-end cards again.. and I think a dedicated PPU should cost less than that...
I can't see the sense really.. unless PPUs get obsolete as fast as GPUs do..
A high-end GPU now would be mid-range in 6 months.. lowish-mid in a year.. perhaps even discontinued..
Actually, I still don't get why you MUST have SLI for this to work o.O
Have never really played with SLI so just thinking out loud - I've always bought a card with the intention of buying a second later, then just upgraded to a single next-generation GPU.
I just see the PPU option as being hellishly expensive for what you get, and can't believe they really expect future performance upgrades to come only through drivers, with minimal hardware options... I don't think I can cope with trying to maintain decent graphics performance and physics at the same time.
My problem is that I absolutely love Ghost, and anything that could improve an already awesome game is worth the money in my book.....
If I'm understanding correctly, using an SLI system with HavokFX (for example) means that you have the increased GPU power for games that don't support the physics engine. For games that do, you can sacrifice a GPU to run as a PPU.
Quote:
Originally Posted by TooNice
You kinda get the best of both worlds - but not at the same time!
Unless you have SLI and a separate PPU... hmmmm.......
But I suppose a dedicated PhysX will definitely be cheaper than a top-of-the-line GPU, say an 8900 GTX..
So say a GTX costs £400+, and a PhysX costs £300.. that's £700.
But running SLI will mean £800... then 6 months down the road, say a G90 comes out.. and you want one.. but you can't run a G90 with a G80, at least not in SLI.. so in the end you lose your PPU as well.. But if you had a PhysX, you could choose to upgrade your G80 to a G90 while still retaining a PPU.. it may not be top of the line by then, maybe Ageia will have come out with a PhysX GTX or something.. but at least your PPU will still work..
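For what it's worth, the two upgrade paths being compared can be totted up like this (all figures are the hypothetical prices from the posts above, not real MSRPs, and the "G90" price is a guess):

```python
# Totting up the two upgrade paths with the thread's hypothetical
# prices (the "8900 GTX" and "G90" figures are made-up guesses).
gtx_now = 400    # one high-end GPU today, in pounds
physx = 300      # dedicated Ageia PhysX card
g90 = 400        # assume the next-gen GPU costs about the same

# Route 1: single GPU + PPU now, then swap just the GPU next gen.
ppu_route = gtx_now + physx + g90        # resale value ignored

# Route 2: SLI now (one GPU doing physics); next gen you replace
# BOTH cards to keep GPU physics running in SLI.
sli_route = 2 * gtx_now + 2 * g90

print(ppu_route, sli_route)  # 1100 1600
```

So even ignoring resale, the dedicated-PPU route comes out cheaper over one GPU generation under these assumptions.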
MSRP I've seen would be $299, so hopefully not £300 :(
Yea, probably gets to the region of £270ish online.. but still, my point is that it makes more sense than SLI..
Check ATI's take on physics :D
The PhysX option only adds $249 to the price of a Dell box in the US, so hopefully they should retail at something like that figure too (so probably £199 in the UK... :mad: )
If it's around £200.. it will be a good investment.... well, as long as the lifespan isn't as short as a graphics card's..
I hope someone comes out with a solution based on dual-core CPUs.
The cost overhead isn't nearly as much, plus you wouldn't be tied into a specific gfx card makers implementation.
Multi-core CPUs are here and look to be the future, might as well take advantage of them......
Multi-core CPUs are no more the future of physics processing than they are of graphics rendering, lighting and geometry processing. I don't see anyone with a dual-core CPU giving up their GPU, and I don't see next year's quad cores changing that!
Quote:
Originally Posted by shaithis
My only question is how long it will take this technology to be incorporated into a northbridge as a low end "integrated physics". As Nvidia are saying you need a 6800 minimum for their technology, I guess their 6150 chipsets won't count (which seems like a lost opportunity).
Wow, didn't realise Intel/AMD were planning to release 50GHz parts to make them a viable alternative. ;)
Quote:
Originally Posted by shaithis
I managed to get hold of ATI for some further comment on their take on PPU's: Clicky here
I think you missed my point.....
Quote:
Originally Posted by DanceswithUnix
Why buy an additional processor or gfx card when a lot of people have (and a HUGE number more will in the next year) dual-core CPUs with the second core hardly doing anything?
Just how different would the effects look if you were limited to calculating them on a 2.5GHz CPU core instead of some dedicated physics card? I don't believe the difference would be that great TBH, and I personally think that if these things cost £100+ they will probably flop due to lack of demand.
Someone like Microsoft could almost certainly kill it overnight by adding physics into DirectX if you have dual-core/SMP.
I have a habit of throwing money at gaming, yet this is really not whetting my appetite at all. Sounds to me like someone is selling me something that could be done (for the most part) by my system already.
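The "use the idle second core" idea being argued here can be sketched as a toy (purely illustrative: Python threading stands in for a native physics thread pinned to core 2, and all names and numbers are made up):

```python
# Toy sketch of "run physics on the idle second core": a worker
# thread steps a tiny particle simulation while the main thread
# would normally be busy rendering. A real engine would use a
# native thread pinned to the second core.
import threading

GRAVITY = -9.81     # m/s^2
DT = 1.0 / 60.0     # one 60fps frame per step

class PhysicsWorker(threading.Thread):
    def __init__(self, particles, steps):
        super().__init__()
        self.particles = particles  # list of [height, velocity]
        self.steps = steps

    def run(self):
        for _ in range(self.steps):
            for p in self.particles:
                p[1] += GRAVITY * DT      # integrate velocity
                p[0] += p[1] * DT         # integrate position
                if p[0] < 0.0:            # crude ground bounce
                    p[0], p[1] = 0.0, -p[1] * 0.5

particles = [[10.0, 0.0] for _ in range(100)]  # 100 falling debris bits
worker = PhysicsWorker(particles, steps=60)    # one simulated second
worker.start()
# ... main thread would render the previous frame here ...
worker.join()
print(round(particles[0][0], 2))  # height after 1 simulated second
```

Of course this toy only shows the offloading pattern; the whole argument in the thread is about whether a general-purpose core can keep up with thousands of interacting objects, not whether it can integrate a few particles.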
Quote:
Originally Posted by Butcher
You really think a dust cloud or a box exploding is going to look much different to your eyes if the physics of it is generated by a dedicated add-in card, rather than a second core?
You must be the bionic man...
I really find it hard to believe that current CPU cores would struggle to calculate these things to a level of complexity that a human could differentiate.
Now, if these things were for real-world modelling applications I could understand.....but for gaming???
A simple box explosion, probably not. But when you have 100 boxes exploding, with all the shrapnel interacting, and at the same time fluid simulations and such, it starts to add up. CPUs really aren't that good at this sort of thing compared to a dedicated chip.
The same sorts of arguments were used about GPUs you know, and yet a modern GPU is about 50 times faster than a CPU at what it does, despite them being clocked much lower.
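The scaling point above is easy to see with a back-of-envelope count (pure illustration, nothing to do with PhysX's actual algorithms): naive collision testing checks every pair of objects against each other, so the cost grows quadratically with the number of interacting pieces.

```python
# Naive pairwise collision testing: every object against every other.
# One box's debris is cheap; 100 interacting pieces are not.
def pair_checks(n: int) -> int:
    """Unordered pairs among n objects: n*(n-1)/2."""
    return n * (n - 1) // 2

for n in (1, 10, 100, 1000):
    # at 60 frames per second, that many checks every frame
    print(n, pair_checks(n), 60 * pair_checks(n))
```

Real engines prune most of those pairs with broad-phase structures, but the underlying blow-up is why "100 boxes with shrapnel" is so much harder than one box.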
Aren't GPUs very wasteful though? Doing the calculations for lots of things that aren't visible?
Um, they calculate what you tell them to calculate.
You usually calculate some non-visible things for rendering, but that has no bearing on a discussion about physics. In terms of what's calculated it would be pretty much the same on either a PPU or GPU.
The problem if you make the 2nd core handle physics is that it will only be good for that alone, which may defeat the purpose for other people wanting dual core.
It's pretty clear by now that GPUs do graphics far, far better than CPUs: just look at 3DMark's CPU tests. Now if Intel/AMD swapped the 2nd core for one of the X1900XT/7900GTX and made some mass changes to the design, who knows. But that is not their business; their business is to make generic CPUs to do the number crunching used by most of your PC.
I suspect this will apply to physics engines too. It may be that early on, you can get the 2nd core to run some of the physics via some kind of software mode. But it won't be long before that isn't good enough [speculating].
As for GPUs being wasteful, from what I've heard things have improved significantly since the TNT days in that respect. I asked this question some time ago, and apparently the latest Hyper-Z (and nVidia's equivalent) are getting closer to what PowerVR provided with their tile-based rendering tech. No idea how true that statement is though (only got one reply on that thread).
GPUs aren't inherently wasteful, that's just a property of the way you draw overlapping objects. For physics processing there should be no overlap, so there would be no wastage.
CPU architecture just isn't geared up for physics processing. Even the dedicated PPU is only working on basic, visually noticeable physics effects.. Besides, with physics calculations offloaded from the CPU, the CPU can devote more resources to AI. We have the theory to do a lot of neat stuff in games, we're just waiting for the hardware to catch up.
Quote:
Originally Posted by shaithis
.. until someone comes up with an AI engine.
*Patents the idea* :D
Yea, like wouldn't it be amazing if you could reason and dynamically interact with NPCs, or even enemies?... better still would be doing it vocally.. I would think that would be really cool, in a kinda creepy way lol. AI is really back in the stone age compared to graphics, for reasons I can't quite understand.
IMHO eye-candy appeal can wear off after a while (from a few minutes to a day), but with a gripping storyline and dynamics you can easily get really hooked into a game. E.g. Baldur's Gate II is still one of my most favourite games, not the best looking graphically, but the storyline was awesome.
Imagine a storyline and characters that intelligently evolve!! *getting ahead of himself*
I would think it would have a very powerful effect on players.
I don't think I find that surprising.. The concept of AI is incredibly complex, because there is so much more open to interpretation.
Quote:
Originally Posted by aidanjt
Dynamically interacting with NPCs is not just AI, it's linguistics. Another tough area, considering how many ways there are to express the same things.. or how a subtle play on words can change the meaning. Plus, there are more languages than just English.
It would be made easier if human input were limited to a certain syntax, but then it would be clear that we are interacting with a machine.
It would certainly be amazing if we could overcome all these roadblocks.. It'll be a long time before it's made possible. And I can only imagine how crazy it would be to code them :surprised: