Four R9 290X-class GPUs go up against three GTX 980s.
Relatively good value for money, this thing. I think these are gonna start selling like crazy at their new price! Shame AMD don't have a nice CPU equivalent to go with it lol... >_<
Very nice, but one would be the preferred option.
Originally Posted by Tarinder

We'd still opt for three GeForce GTX 980s instead of four GPUs housed inside two R9 295X2s, based on our testing, but AMD does win out when value is as important as sheer performance.

Unless I'm reading the figures wrong, I'd slightly disagree with that last bit. According to your benchmarks, the AMD solution (cheaper, remember!) manages to beat the best of the green team in the BioShock, Crysis and GRID benchmarks, and it's pretty close in 3DMark, with Nvidia pulling ahead in Tomb Raider and Total War.
I'd come to the conclusion that, ignoring the noise and power draw, the top-end cards from Nvidia and AMD are pretty evenly matched. The decision then becomes whether to take the saving on the purchase cost and set it against the extra power needed to run the beastie.
Not that this is likely to be an issue for me, since I'm firmly stuck on 1080p gaming.
1200W?! That's like running a powerful microwave oven. Potentially for hours on end. Every day.
That'll make for an eye-watering electricity bill.
The difference between the two options is about 7p an hour, so you would need to game for about 21,000 hours on quad CrossFire to eat up the price difference of about £1500. That's 8 hours a day, 7 days a week, for 7 years. If you played 24/7 (or in shifts with friends) you could cut the payback on the Nvidia cards to about 2 1/2 years, though.
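A quick sanity check of that payback arithmetic, sketched in Python. The ~7p/hour running-cost difference and ~£1500 price gap are the figures from the post above, not official numbers:

```python
# Rough payback arithmetic for quad CrossFire vs tri-SLI.
# Assumptions (from the post): ~7p/hour extra running cost, ~£1500 price gap.
extra_cost_per_hour = 0.07   # GBP per hour
price_gap = 1500.0           # GBP

payback_hours = price_gap / extra_cost_per_hour
print(f"{payback_hours:,.0f} hours to break even")     # ~21,429 hours

hours_per_year_heavy = 8 * 7 * 52      # 8 h/day, 7 days/week
hours_per_year_always_on = 24 * 365    # gaming in shifts, 24/7

print(f"{payback_hours / hours_per_year_heavy:.1f} years at 8 h/day")
print(f"{payback_hours / hours_per_year_always_on:.1f} years at 24/7")
```

Which lands on roughly 7 years at 8 hours a day, or about 2.4 years running flat out, matching the figures quoted.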
Then you'd have to factor in whether it's worth using them for mining, and the relative hash rates you'll get between the red and green team.
I am pretty sure that I could do the calculations, but really I can't be bothered.
True, I was taking the price of the graphics setup that was used in the comparative tests, as some of the comments on noise and temperature may not apply to the standard cards. Even so, it would still take about 2 1/2 years at 8 hours a day, 7 days a week of 4K gaming to recover the difference. Frankly, if you can afford either setup, I doubt you will worry too much about the electricity bill - especially if you factor in the free space heating provided by the AMD cards.
What I found most interesting was that for less than £500 it is possible to get a dual-GPU setup that beats the single GTX 980 cards, which start at £430 for stock reference models and slightly more for overclocked non-reference models. Had that choice been available to me when I bought my GTX 780 Ti, my GPU might well have been flying a different colour.
Well, numbers aside, I think I'm a bit unusual in that I'd rather actually not own power guzzling hardware like this, regardless of whether I could afford the electricity bill.
I'm actually more impressed by moderately powerful setups that manage to be 'Green'. My own setup at home could be described as 'high end' in terms of its benchmark and gaming performance, and can probably chew through 500W on full throttle, but when I'm just sat listening to music or writing code it's using barely anything at all. Probably less than 100W. This pleases my OCD.
Originally Posted by Tarinder

I don't think the use of "ameliorated" is correct, as melioration/amelioration is a positive development process.
Perhaps counterpoised would be more apposite?
I only discovered the word "melioration" when looking for an alliterative title for an F1 blog post a few months back, so it stuck out like a sore thumb when I read it.
Turning on more things uses more electricity - shocker! </dailymail>
If you've invested heavily in your gaming rig as your main pastime, you'll probably find that the cost of running your PC for a month of gaming isn't actually any more expensive than, say, going to the movies once a week or having a gym membership. It's not like your computer is using that power to no end - you're investing that running cost in your entertainment. 1200W from the wall costs about 15p an hour, while a frugal (say, 200W) machine costs around 4p an hour. I'm willing to bet that plenty of people would consider 11p an hour a reasonable cost for being able to turn all the pretties up to maximum on a 4K screen.
I'm pretty sure my electricity costs 26p per kWh during the day. So, if I gamed on a rig drawing 1000W for 4 hours a day, that'd be £87.36 per quarter. With a rig that only drew 500W while gaming, it'd be £43.68 per quarter. Something like that can make a big difference to me when it comes to paying the bills! Please correct me if I'm wrong on those calculations - I do have a habit of messing up with numbers.
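For what it's worth, those figures do add up, assuming an 84-day (12-week) quarter, which is what the totals imply. A rough sketch of the calculation in Python:

```python
# Quarterly gaming electricity cost at a 26p/kWh daytime rate,
# assuming a 12-week (84-day) quarter and 4 hours of gaming per day.
rate_per_kwh = 0.26        # GBP per kWh
hours_per_day = 4
days_per_quarter = 12 * 7  # 84 days

def quarterly_cost(watts):
    """Electricity cost in GBP for one quarter at the given draw."""
    kwh = watts / 1000 * hours_per_day * days_per_quarter
    return kwh * rate_per_kwh

print(f"1000 W rig: £{quarterly_cost(1000):.2f}")  # £87.36
print(f" 500 W rig: £{quarterly_cost(500):.2f}")   # £43.68
```

So halving the draw halves the bill, exactly as stated.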
Doesn't "ameliorated" mean "made better"?