Quote:
Next-gen graphics cores to support 3D Blu-ray and feature decent gaming performance.
I can't believe how overhyped this has been. Hexus's own review of the 5450 wasn't pretty with the evaluation of...
Instead of celebrating this "achievement", Intel should be damned for once again failing to provide acceptable gaming performance. They could do an awful lot better.Quote:
If the primary concern when purchasing a graphics card is to play games, then do yourself a favour and spend an extra £10-£15 for either a Radeon HD 4670 or GeForce GT 220.
http://www.hexus.net/content/item.php?item=22120&page=6
Llano is likely to offer 4670 or better performance for less cost.
If you mean Llano will be crap, I respectfully disagree. It won't be an SB beater, but it could get close to, say, the i7 920 in a lot of things.
It will also thrash any Intel CPU in GPGPU, one of the main points of the chip and of Fusion in general.
Well, what about simple stuff like speeding up web pages using D2D? Most of us spend a lot of time on the web, and it's still stuck where it was 10 years ago, with very few changes.
http://blogs.amd.com/fusion/2010/06/...ust-for-games/
If Intel wasn't so big they'd have been left behind years ago. As it is, the web will probably always have to cater for people with backward Intel IGPs like the one on SB.
'The web' doesn't cater for backward GPUs at all; GPUs are there to assist with things, not to be a requirement for web browsing.
So which is it: Intel being so big that they would have died out otherwise (there's a reason they're so big), or the web catering for people behind the times (should they be forced to upgrade every time a company clicks its heels)?
I'm just going to quit this thread anyway, as it's only going to be a few posts before you invoke the AMD/BD/Fusion equivalent of Godwin's law.
However much you lot like to slate Intel GPUs for their poor performance, they are perfectly fine for the majority of users, like my parents and business desktops. Gaming is a niche user scenario, sorry, it just is. Lower power draw and cost are far more important in most PC buying decisions than whether or not it can play Crysis.
It's good to see Intel making these improvements to their IGPs; this will give users more future-proof systems, and more performance is good. Intel need to provide enough power for Windows Aero, accelerating web sites, media duties and very light gaming - that's all they are intended to do. If you want to seriously game, then AMD/ATI or Nvidia are there to assist...
Diverging the subject somewhat, this is still fairly encouraging news (though I don't look forward to the inevitable "Intel pricing").
It'll be either this or AMD's Bulldozer, then, that might finally retire my Q6600. A CPU that's had more life in my main workstation than any other - over 2 and a half years - but between Ableton and Bryce (month-long renders are not fun), I'm desperate to upgrade it. Further, it's paired with 6GB of DDR2 - 4 of which cost me only £37 - and I find today's RAM prices depressing. A full motherboard+CPU+RAM upgrade is always the most costly one, but with the same amount of RAM that I have now costing more than double what I paid for only 4GB of it, I'm not budging unless Bulldozer or Sandy Bridge really are the power-frugal and performance-capable solutions I'm hoping for.
And yeah, no less than 8 cores, please!
What exactly are you saying? That the Intel chip gives 10% better framerates at half the resolution and without post-processing? :confused:Quote:
Tests on an early sample by Anandtech, though, show that the improved graphical-horsepower can be used for more than just watching films. Using a 3.1GHz Core i5 2400K with a ‘single-core' integrated-GPU, the Intel chip traded blows with a discrete ATI Radeon HD 5450, beating it by up to 10 per cent in some games.
Admittedly, the graphical-fidelity was decreased and the resolution was low.
And after all the promises, still no USB3, and a SATA3 solution similar to what 3rd parties have been offering (a bunch of SATA2 ports plus 2 extra ports for SATA3).
This would have been fine if it wasn't for Intel demanding you change your motherboard in order to use SB. So in order to upgrade your CPU, you need to sidegrade your MB :eek:. Of course there have been improvements, like upgrading the PCIe lanes, but then again that should have been there in the 5x chipsets (AMD hasn't had such nerfed PCIe on their MBs). So you got yourself a nice 750 and want a bit more CPU power... that 2600 looks real nice, but before you go down that road you need to get a brand new motherboard (don't worry, it has two extra SATA3 ports :thumbsup: ). Hopefully they've done more on the Q6x and will push them out without too much delay.
Really strange why Intel, the main driving force behind mainstream SSDs, have decided to deliver a half-arsed SATA3 solution. Most people will be buying these motherboards in 2011 and some market research groups are predicting SSDs to slip into mainstream price segments by 2012.
Don't even get me started on USB3 (Light Peak is not a good enough excuse)
It won the Dragon Age benchmark by up to 30%; that is why it "won" overall by 10%.
Dragon Age actually hugely favours Intel CPUs; it's got nothing to do with the graphics part. In a normal benchmark suite (note: nobody except Anand benchmarks Dragon Age), a 5450 will be faster than this Sandy Bridge IGP.
This is Intel we're dealing with, and Anandtech. Your first port of call should be to believe nothing these two conspire to tell us.
Anandtech is also one of the most derided tech sites around. Did you know he got a full new server of Xeons today?
The point should be pretty obvious: when a single benchmark out of, what, five benchmarks is so hugely skewed in favour of certain CPUs, then the GPUs can't be properly compared.
Take out that Dragon Age result and guess what happens?
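To see why one skewed result matters, here's a minimal sketch of the averaging arithmetic with made-up numbers (none of these figures come from Anandtech or any review): four games where the IGP trails slightly, plus one CPU-bound outlier standing in for Dragon Age.

```python
# Hypothetical relative framerates (IGP framerate / HD 5450 framerate).
# All numbers are invented purely for illustration.
results = {
    "Game A": 0.95,   # IGP runs at 95% of the 5450's framerate
    "Game B": 0.92,
    "Game C": 0.97,
    "Game D": 0.96,
    "Dragon Age": 1.40,  # CPU-limited outlier skewing the average
}

def average(ratios):
    ratios = list(ratios)
    return sum(ratios) / len(ratios)

with_outlier = average(results.values())  # ~1.04: IGP "wins" by 4% overall
without_outlier = average(
    v for name, v in results.items() if name != "Dragon Age"
)  # ~0.95: IGP loses by 5% in the normal games

print(f"with outlier: {with_outlier:.2f}, without: {without_outlier:.2f}")
```

Drop the one Dragon Age-style outlier and the overall "win" flips to a loss in every remaining game, which is exactly why a single CPU-skewed benchmark makes the GPU comparison meaningless.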