Watch the show. HEXUS first examined Ageia's PhysX proposition in April 2006. With WARMONGER and UNREAL TOURNAMENT 3 itching to go, Ageia asked HEXUS.tv to create a promo-feature that brings their complete story up to date...
They are stubborn and determined little devils, aren't they? I don't think this 'physx' is impossible to render on a state-of-the-art multi-core CPU, and you're only getting more hardware that's going to give you that many more headaches with drivers/support/failures etc. Useless imo.
State of play: people care even less about the idea than they did a year back. Developers of big-name titles continue to annoy Ageia by using multi-core CPUs in preference.
I think they're only still alive now because numb-nutts know-nothings buy them, because it's expensive and is an extra bit to add onto their ego.
I'm surprised they still think they're the mutt's danglies, even though a second GPU and/or a multi-core CPU will do the same job in a more practical and useful way, and at a fraction of the cost.
I wonder what game he was on about that stopped his tank, couldn't have been anything in the past ten years, surely? LOL!
When we were speaking with Dan, I have to say that his figure of "60% of future games will be developed with our software" seemed strong
If that is the case, and the only other serious contender in the middleware arena has been snapped up - then it will be interesting to see what happens with Ageia over the coming months
Ryszard's initial review in May 2006 struggled to see many bright spots with the Ageia offering
Now, 18 months on, they have kept plugging away on the DevRel and drivers - and picked up the deal with Dell for the XPS1730
Normally, people buy hardware and get a free game
With Warmonger being free, you could almost (OK - not quite - but within £40!) buy the hardware and get the game for nothing
On-going licensing for the UT3 engine would seem key here
If UT3 proves massively popular (and the PhysX aspect grabs the hardcore gamer's attention) - then 2007/8 could be a lot rosier for Dan & Co...
In the interview, Dan says himself that not driving much harder into the game developers early on was a mistake - let's see if it is one that they can rectify
Except they won't do the job.
GPU physics solutions are dead in the water, and have been for a year or so (and were never that good to start with). Multi-core is STILL not being used properly by developers (Valve's latest release is still single-core despite all the hoopla a while back).
Ageia PhysX is hooked right into UT3, a major middleware component, which is a huge win for them.
A year ago I wouldn't have given Ageia a gnat's chance of making it. But things are different now.
I can't speak intelligently about what might happen in the future - but there does seem to be a bizarre situation here for the GPU guys
They have ample wad-loads of power...
...but who is providing the middleware etc to make use of that power to do the physics?
Real world applications is a different thing - where places like Stanford will write specific programmes to maximise GPGPU performance...
...but I am not sure who will do this in the game environment
Would be interesting, if AMD had a few million spare, for them to buy Ageia - create a completely 'open platform solution consortium' - and actually drive the middleware themselves to take advantage of ATI's and nVidia's* multi-core technologies
Problem is that many of the guys at ATI who are high up the food chain would need some kind of categoric, handed-down-by-Moses-with-the-tablets proof that an investment of this kind would lead directly to more sales etc.
That proof is hard to come by - so maybe they won't take the plunge - but it would seem to make sense in a world where Intel have the biggest-known brand of physics and plenty of spare cores to do the processing
*Can't work on just one and expect any kind of support from anyone else. Would probably have to work (in some fashion) on Intel graphics as well
http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
I thought it was (HL2 EP2) tbh. But I agree multi-threading still has a way to go. However, devs are much more likely to do this now that dual core is the default choice and quad core has become so much cheaper - more so than PhysX, which is utterly nowhere in the installed-base stakes.
Mind you, when you see the results of Steam surveys, even dual core is 'rare'-ish.
I suspect that all they're seeing here is the performance gain that results from not having to swap out threads in order to service the background processes which will always be running in XP.
When Valve advertised their big multi-core push they did mention how hard it was to engineer a truly multi-threaded game engine while keeping everything in sync. I think it's clear they haven't cracked it yet, certainly not in any meaningful way.
I dug up this too - http://www.bit-tech.net/gaming/2006/...Source_Engin/4
which is interesting - particularly because it's physics-related. I guess it's safe to assume they're 'working on it', released or not. Certainly doesn't seem like PhysX is required by Valve...
It's definitely an interesting situation.
To spell it out - any processor, be it CPU, GPU or anything else, is always going to be designed with a specific task in mind. CPUs are designed to run several different types of calculations efficiently, to keep multiple processes and threads running concurrently, and to ensure memory is accessed efficiently, I/O devices are used properly and efficiently, etc.
GPUs are designed only with 3D graphics in mind, and their inner guts (memory, transistors, logic gates, registers etc.) are designed to be efficient within that field. Running physics calculations on a GPU is a bit of a square-peg-round-hole situation. You will get it to work, but it won't be as efficient or elegant as it would be if the processor had been designed to run calculations within the realm of physics. A "physics" card is simply a better solution to the problem than trying to implement different layers of abstraction to allow the GPU to perform physics calculations. It means another card, but think of the difference in games before 3Dfx launched the Voodoo card (first standalone 3D card? can't remember) and how that changed the way games are played.
As people have already mentioned in this thread, multi-core processing is a nice idea in theory, but no one has a clue how to properly write programs which take advantage of it. It's a huge paradigm shift to write programs, operating systems and games for multiple cores - see the sketch below. Bit of a different story in the server/HPC realm, but that's a different ball game entirely.
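To put a tiny bit of code behind that paradigm-shift point, here is a minimal, purely hypothetical sketch (modern C++ with std::thread, which post-dates this thread, and an invented particle type) of the *easy* case of multi-threaded physics - work that splits into independent chunks. Even here the renderer has to wait at a join before touching the results, and the moment objects interact across thread boundaries you need locks or staging, which is exactly the hard engineering Valve described.

```cpp
// Hypothetical sketch only: the "easy" case of multi-threaded physics,
// where particles are independent. All names and layout are invented.
#include <cstddef>
#include <thread>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Integrate one contiguous slice of the particle array (no shared writes).
void integrate_slice(std::vector<Particle>& p, std::size_t begin,
                     std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        p[i].x += p[i].vx * dt;
        p[i].y += p[i].vy * dt;
        p[i].z += p[i].vz * dt;
    }
}

void physics_step(std::vector<Particle>& particles, float dt,
                  unsigned n_threads) {
    std::vector<std::thread> workers;
    const std::size_t chunk = particles.size() / n_threads;
    for (unsigned t = 0; t < n_threads; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = (t + 1 == n_threads) ? particles.size()
                                               : begin + chunk;
        workers.emplace_back([&particles, begin, end, dt] {
            integrate_slice(particles, begin, end, dt);
        });
    }
    // The synchronisation point: the renderer cannot read positions until
    // every worker has joined. Interacting objects would need far more care.
    for (auto& w : workers) w.join();
}
```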
What could have really launched the PhysX card? If they had managed to convince one of the big three console makers that it was a good idea and had it finished a few years earlier - maybe it could have made it into the PS3, Wii or 360.
The big bonus there is that it's a guaranteed component of a system and developers can treat it as a given when developing a game. When developing a PC game it can't be a critical component, as there is such a small install base at the moment that it would be commercial suicide to make consumers buy a card to play the game. Pretty much a classic Catch-22. If Ageia are still around for the next console war, there's a good chance something like a PhysX card could be in the offing for consoles.
I like the idea of a PhysX card - People bang on about how "photo realistic" graphics are becoming but that alone won't completely allow me to suspend my disbelief. Now, if the environments DO what they look like they SHOULD DO then we are talking. Never mind suspending your disbelief, what is there to "disbelieve" if that falls into place?
Too many words - Going to make some dinner
Is that why GPGPU is an ever-growing field of both research and applications?
GPUs are massively parallel processors with programmable pipelines.
It makes them great for graphics, but they can be great at other things too. The problem, in my eyes at least, isn't 'fitness for purpose' but rather 'if the GPU does just graphics then we can get greater visual quality than if it's got other work to do too'.
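To give a concrete flavour of what GPGPU means here, below is a minimal sketch of physics-as-a-kernel in CUDA - the kernel and launch syntax are standard CUDA, but the particle layout and names are invented for illustration.

```cpp
// Hypothetical sketch: one GPU thread per particle, integrating positions.
// d_pos and d_vel are assumed to be arrays already allocated on the device.
__global__ void integrate(float3* pos, const float3* vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        pos[i].x += vel[i].x * dt;
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}

// Launch enough 256-thread blocks to cover every particle:
//   integrate<<<(n + 255) / 256, 256>>>(d_pos, d_vel, n, dt);
```

Embarrassingly parallel work like this maps onto a GPU beautifully - the catch, as above, is that every cycle spent here is a cycle not spent on graphics.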
After reading that post of mine again I guess it does look like I made out that GPUs are useless at Physics calculations. They are exactly as you described them but at the end of the day they will still be optimised with graphics in mind, not physics. Your point is a really good one - having two cards with distinct jobs is going to work out better for the overall result.
Rather than AMD buying out Ageia, as many suggested when the physics cards were first announced, a DirectX API for physics could be better. More competition driving down prices and better uptake, I reckon.
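For what it's worth, a 'DirectX for physics' would presumably mean the game codes against a hardware-agnostic interface and each vendor supplies a backend - a purely hypothetical sketch, every name below invented:

```cpp
// Hypothetical, invented API sketch - not a real DirectX component.
struct RigidBodyDesc { float mass; float position[3]; float velocity[3]; };

class IPhysicsDevice {
public:
    virtual ~IPhysicsDevice() {}
    virtual int  createRigidBody(const RigidBodyDesc& desc) = 0;
    virtual void applyForce(int body, const float force[3]) = 0;
    virtual void simulate(float dt) = 0;  // advance the world one step
    virtual void readPosition(int body, float out[3]) const = 0;
};

// A runtime would pick the best available backend much as Direct3D picks a
// display adapter: an Ageia card, a GPU implementation, or a CPU fallback.
```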
I remember hearing some vague mutterings about Nvidia (or was it ATi?) planning to latch their own brand of physics processor onto their flagship range of GPUs about a year ago, when PhysX was about a month old to consumers.
I think it'd be a great idea in principle, but in practice I'd imagine temperatures alongside two GPU cores must be rather dubious, to say the least. And wouldn't graphics performance take a hit, seeing as the onboard physics chip would be using the same bandwidth on the PCI-Express bus, surely?