Over an 80% improvement over a single GPU:
Expreview.com - Extra Hardware News Report! » Blog Archive » SLI performance boost! GeForce 9600GT Review
That is impressive!! It also seems that a 9600GT 256MB is coming out too!
Any bet there'll be a 9600GTX, GTS, GS, G, and then 256MB versions of them? The rate at which Nvidia brings out cheaper models is pathetic. And comparing it to a single 8800GTS is flawed, because the SLI setup has 1GB of RAM against a single card, so in theory it should mainly benefit high resolutions. Nvidia has just peed me off; next time round I'm going with ATI, at least they haven't screwed me over with all these little crappy models that work out better O.o
The rate at which graphics cards are released sickens me too. In the past it was at least every nine months to a year. Graphics cards have such a short lifespan now (measured in a few months). No wonder consoles are getting more popular. Companies keep on releasing new cards even before current games are optimised for current cards!! :censored:
It's all because of one thing... Crysis, seriously. And the fact that ATI/AMD have decided to go mid-range, so Nvidia just releases bunches of crap hoping to keep AMD away, instead of concentrating on the high end to play Crysis xD. I'm told the 9800GTX is out in March, wish I'd waited now >.<
Perhaps I should have said midrange cards, which 99% of gamers buy.
Maybe higher-end cards which cost £250 to £400 have a longer life. However, most people have better things in their lives to spend money on than £250 to £400 graphics cards every 12 months. I am not so desperate to play games that I'd spend that much money, IMHO.
I don't understand the outrage. I've said in another thread that I didn't understand why nVidia would release so many cards in the range they've released. But I said so looking from their perspective (they are cannibalising their own line-up). From a consumer's perspective, I think it is great, except for having to spend more time deciding which card to get. Plus, the recent cards provide incremental benefits: the 3850/3870 (necessary because the 2xxx simply weren't competitive enough) and 8800GT/8800GTS (released quite a while after the GTX) are holding up well. The GS/256MB 8800GT are probably the most affected by the 9600GT, but they were not particularly popular cards and most likely a 'filler' anyway.
As for the optimisation comment, I am very glad things are the way they are. If graphics card companies were to use that excuse for not releasing new cards, I would label them as being lazy. It's not like new cards require much optimising to see immediate improvements (unless it's about a new API - but if it were not for the release of graphics cards that support DX10, we probably would not see DX10 releases; it's probably easier for the hardware manufacturer to pave the way, as I can only imagine how difficult it would be to code a game to support a new API without the hardware to test it on).
As to the 9600GT SLi - it's great. I was a little sceptical regarding some of the less positive comments about the 9600GT SLi in another thread (that SLi was only 1.2x faster, and that the 9600GT SLi would not be faster than an 8800 Ultra). If only because the 3850 Crossfire was already near 8800 Ultra territory, and even if Crossfire is more efficient, as I have heard (but not double-checked), I figured that the raw advantage the 9600 has over the 3850 would be adequate to bring it well above 8800 Ultra speed. For another review check this. Looking at how well Tri-/Quad-3870 scales, it does look like SLi/Crossfire will become more and more viable. I don't expect to see 100% efficiency, so from a bang for buck perspective, one card will always be better than two of that card. But it may well be that it is cheaper for GFX manufacturers to produce superior performance from multi-GPU setups than by adding more MHz to a given architecture (kind of like what we end up seeing in the CPU market). If that's the case, then I can see them putting more resources into making SLi/Crossfire more and more viable in the future, and this may well be a first sign.
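The scaling and bang-for-buck arithmetic above can be sketched out in a few lines. The ~1.8x figure (an 80% uplift) comes from the review linked at the top of the thread; the single-card frame rate and the card price are placeholders, not measured numbers.

```python
# Sketch of the SLI scaling / value-for-money arithmetic from the thread.
# Only the 1.8x uplift is from the linked review; FPS and price figures
# below are illustrative placeholders.

def scaling_efficiency(single_fps: float, multi_fps: float, n_gpus: int = 2) -> float:
    """Fraction of ideal linear scaling an n-GPU setup achieves (1.0 = perfect)."""
    return (multi_fps / single_fps) / n_gpus

def fps_per_pound(fps: float, price_gbp: float) -> float:
    """Bang-for-buck metric: frames per second per pound spent."""
    return fps / price_gbp

single_fps = 50.0            # placeholder single 9600GT result
sli_fps = single_fps * 1.8   # the ~80% uplift reported in the review
price = 120.0                # placeholder card price in GBP

print(scaling_efficiency(single_fps, sli_fps))   # 90% of ideal 2-GPU scaling
print(fps_per_pound(single_fps, price))          # one card
print(fps_per_pound(sli_fps, 2 * price))         # two cards: lower value per pound
```

This makes the forum point concrete: anything below 100% efficiency means two cards always deliver fewer frames per pound than one, even when the absolute performance is well ahead.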
It would be nice if games were actually optimised to run better on the cards available as opposed to the cards trying to be optimised to run the games better. The latter is the more expensive way of doing things. Anyway I suppose Crysis is an example of this. At least Bioshock ran well on most hardware and was fun to play.
There is only so much you can optimise a game while keeping it compatible with the various cards out there: having a separate pathway for each card is most likely infeasible, and even having one for nVidia and another for AMD would be a lot of work. And what happens when nVidia/AMD introduce a new architecture? You will have to write more optimisations. Software (re-re-)development costs may not necessarily be cheaper than mass-produced hardware enhancements.
Plus, I seriously doubt that cards are optimised to run specific games better. That's usually done in software (game optimisation and driver optimisation). I don't doubt that nVidia has gotten pretty good at optimising their hardware and drivers for OpenGL, and that would explain how well their card does in Quake Wars. However, this also shows in other OpenGL games.
Don't get me wrong, I am all for better-optimised games. But I do not believe that optimisation should only be on the software side, nor do I think that hardware manufacturers should sit back and not try to advance their architecture until games use every little trick available to maximise it. And to be honest, I suspect that the architecture of recent cards hasn't changed much (from G80 to G92 or even the 96-series, or the 29xx to the 39xx), so a game that's well optimised for the G80/29xx will be similarly well optimised for the newer cards.
As for Crysis - I can't comment on it since I've not played it. If it looks as good as I've heard, then perhaps it's not the best example of poor optimisation. It might seem pointless to have a game with such high system requirements, but it is only *one* game (not that hard to avoid), and I think it is nice to have a playable tech demo made, if only as an indication of how far we've got. Gameplay is the most important aspect in games, but the fact that we are pushing beyond NES graphics should be an indication that we also want to see ever-improved graphics.
Hicks12 sounds ticked off because the new cards have, or potentially could, devalue his 'investment' (I don't really see how a G92 GTS is affected by those mid-range releases, to be honest). I am sorry, but if that is the reason, then it is a selfish reason. It's basically saying that because you have made an investment, people who are buying after you should be deprived of any advancement the manufacturers have made. You can't always have the very best deal when it comes to PC components (think of the people who bought the X1800XT on release, or the people who bought DDR2 RAM last year). But if it is a risk you can't accept, then I suggest going EVGA (with the step-up program) next time.
I wasn't saying they shouldn't have a better card O.o. I just said I wish I'd waited till next month for the 9800. And that a mid-range card shouldn't beat a high-end card. Crysis is good BUT I don't see how the heck it laughs at all cards; BF2142 has good graphics compared to it. I think the only good gfx Crysis has are the guns and people (cars and stuff are not the best), and it's pretty much just a tad bit better than BF2142. I don't care about the value of my card, but I waited a whole year to upgrade as I wanted the 9 series :P and it was delayed and delayed and then poof, dates. EVGA doesn't support lifetime warranty outside of US/Canada, hence why I went with BFG as they give 10 years outside.
I apologise if I misunderstood then, but I could not interpret 'Nvidia just peed me off, next time round im going with ati, atleast they havent screwed me over with all these little crappy models that work out better O.o' any other way. In practice though, do you really need a lifetime warranty? Especially for something like a graphics card? I tend to advocate bang for buck and going for the cheapest vanilla option, simply because those cards only need to last until the next upgrade - and if a card lasts one year, it'll last two (in my experience, anyway).
Still, I am not entirely sure what you mean by a mid-range card not being able to beat a high-end card. If they are of the same generation, then it's a given (and if it were not the case, the mid-range would take the place of the high-end and vice versa). But it should not be too surprising if the mid-range of a new generation matches (give or take) the high end of the previous generation: off the top of my head, the GeForce 2 MX was close to the full-fledged GeForce 1, for instance. To me, that's progress.
You do have the option of selling off your current card and waiting for the 9800 (the GTS holds its value pretty well - I don't think it was affected by the 9600 at all). But then, if the 9800 ends up being less than splendid, you may regret it. Not only that, but the 9800 may start where the 8800GTX did, closer to £300 than £200. If it's any consolation, I highly suspect that the 9800 won't match the performance/£ of the G92 (GT/GTS) on release.
This optimisation already exists! It is called "Nvidia: The Way It's Meant To Be Played", or the lovely little advert you can see on almost all major titles. I feel sorry for ATI, as they seem unable to counter Nvidia's money and PR, which leads to games being optimised for its GPU architecture. Also, releasing midrange cards every three to six months means that GPU companies have little incentive to care about optimising the drivers of previous cards to run better on later games. Not everyone is an enthusiast who wants to be eternally upgrading every few months. Also, playing PC games is NOT the preserve of spendthrift enthusiasts, and none of that "buy consoles if you want value then" crap which has been said on forums as an answer! In the end, the relatively poor sales of Crysis have shown people want value for money (I did buy it, BTW). HL2 did well since it could scale very well from relatively weak to good hardware and still looked good at the time. Upgrading once every 12 to 18 months is fair enough for most game-playing people.
Just to drop my 10 cents in, I would like to call COD4 into the playing field. Although the game is not DX10, the graphics in it are still pretty damned impressive, and it seems to be playable on the most primitive of systems; a mate of mine runs it on a P4 2.8 with an Nvidia 6200, for crying out loud. Now this has clearly come from an evolution of the COD game engine, as it has remained very similar throughout the games. Crysis is one of (if not the?) first to run on the new Crytek engine, so it's going to be inefficient, just the same as any NEW or heavily adjusted hardware, operating system, software, or driver. These things take time and revision to get anywhere near the versatility of other games, which often run on modified engines that have had plenty of time to mature. Half-Life 2 is an exception, as it was excellent from day 1, although saying that... how long was it delayed for?
Yeah, I went a bit far on that comment xD, but all my experiences with ATI were great and my card lasted ages (well, it's still running now). I know they should be able to, but it was the fact that Nvidia's marketing was flawed for the consumer (obviously not for them) with the G92 GT and GTS; it was just to extend the lifetime of the 8800 series product. That was what I was thinking originally, but really I'd rather stick with it, skip the gamble, and then just wait till like 5 years down the line and pay the same again.