Quote:
Up to 40 per cent extra performance from a simple RAM switch.
So we do need DDR4 after all. Come on Samsung - bring down the rain.
Nothing wrong with that sentence in my opinion. Memory companies have spent years searching for a consumer market beyond the small overclocking crowd at which to aim their faster memory chips. This is it... clearly: a well-priced mainstream system that a lot of people are going to buy.
So yes, I can imagine a lot of memory executives drooling over the numbers in these reviews with the realisation it means more business for their higher end products.
Thanks for the article Tarinder. Really looking forward to tomorrow now :)
It will be good to see if Scan will do some of their OC (FM2) bundles with fast RAM now.
Don't suppose a run with 2133MHz 10-10-10-27-1T memory is an option for you? My Samsung Green can do this at 1.4 volts in my 3570K rig. I'm interested to see how sensitive performance is to timings, as I can run 2133 11-11-11-28-1T with less voltage, and this RAM is destined for a 5800K-based always-on living-room PC.
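For anyone weighing those two profiles, the absolute CAS latency in nanoseconds is easy to work out from the data rate: latency = CL ÷ (data rate ÷ 2). A quick sketch using the figures from the post above (the helper name is mine, not from any review):

```python
def cas_latency_ns(cl, mtps):
    """Absolute CAS latency in nanoseconds.

    cl   -- CAS latency in clock cycles
    mtps -- data rate in MT/s (DDR transfers twice per clock,
            so the actual clock is mtps / 2 MHz)
    """
    clock_mhz = mtps / 2
    return cl / clock_mhz * 1000  # cycles / MHz -> nanoseconds

# The two profiles mentioned above, both at 2133MT/s:
print(round(cas_latency_ns(10, 2133), 2))  # 10-10-10-27-1T -> 9.38 ns
print(round(cas_latency_ns(11, 2133), 2))  # 11-11-11-28-1T -> 10.31 ns
```

So the tighter profile only buys you about 1ns of access latency, which is why the timing sensitivity question is a fair one to ask.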
Btw I'd really like to see this compared to, say, the Trinity CPU + discrete cards like the GT 630, GT 640 and HD 6670.
"However, the test Gigabyte F2-A85X-UP4 motherboard, updated to the latest F3e BIOS, would only run the super-fast memory at this speed in single-channel mode."
Would be interesting to see this 2400-rated RAM's performance! Think you'll do an update if AMD/Gigabyte/RAM manufacturers get this fixed?
Problem is that if you're buying a discrete card then there's not much point in buying a Trinity CPU: you'll want an i3/i5, unless you intend to CrossFire with Trinity's graphics and cross your fingers that it actually works in the games you play. It wasn't particularly effective with Llano, if I remember correctly. A similarly priced i3 + low-end Nvidia system versus Trinity, and Trinity + 6670 CrossFired, would be a very interesting comparison though.
Kitguru did a bunch of benchmarks with a 7970 GHz Edition at 1080p and the A8-5800K beat the i3 2105 in every game -
http://www.kitguru.net/components/gr...w-discrete/19/
The AMD system has expensive 2133 RAM; the 2105 build has cheapo 1333. A fairer i3 to compare with the A10-5800K price-wise would be the i3 3220, which has 200MHz over the 2105 and will be better per clock. The results would probably be in favour of Intel, and I'd assume power usage would be too. In most of the gaming tests the difference is negligible and the benchmark is GPU-bound, not CPU-bound.
RAM makes almost no difference in gaming outside of integrated graphics. If the games aren't GPU-bound at 1080p then you might as well be running on the integrated GPU instead. Results like this prove that Trinity is capable at high-settings, GPU-bound gaming as well as low-settings (IGP) gaming. The only place it "fails" is low-resolution gaming with a discrete card, and seeing as nobody does that, it's not exactly a problem.
VRZone did a discrete test as well and the A10 wasn't far off an i5 3470 in a lot of games - http://vr-zone.com/articles/amd-trin...e/17272-1.html
RAM does make a difference, maybe not massive, but if you got the few-per-cent boost from going 1333 to 1600 in the KitGuru tests you'd likely even the playing field in several of them. Multiplayer needs strong CPU performance that isn't represented in single-player benchmarks, and if you're not going to use the IGP then you'd most likely be better off buying into the upcoming non-APU Piledriver-based CPUs rather than Trinity.
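As a rough sanity check on that 1333-vs-1600 point: the theoretical bandwidth headroom is about 20 per cent, though real-world gains are usually far smaller because the CPU rarely saturates the bus. A back-of-envelope sketch, assuming a standard dual-channel, 64-bit-per-channel setup (the function name is mine):

```python
def peak_bandwidth_gbs(mtps, channels=2, bus_bits=64):
    """Theoretical peak DRAM bandwidth in GB/s (decimal gigabytes).

    mtps     -- data rate in MT/s
    channels -- memory channels populated (dual-channel assumed)
    bus_bits -- bus width per channel in bits (64 for DDR3)
    """
    return mtps * 1e6 * channels * (bus_bits // 8) / 1e9

bw_1333 = peak_bandwidth_gbs(1333)  # ~21.3 GB/s
bw_1600 = peak_bandwidth_gbs(1600)  # ~25.6 GB/s
print(round(bw_1600 / bw_1333 - 1, 2))  # ~0.2, i.e. ~20% more headroom
```

That 20 per cent is a ceiling, not an expected gain, which is consistent with the "few per cent" seen in discrete-card benchmarks.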
I have a Core i3 2100 - the A10-5800K is not far off it in many games, even in lightly threaded ones. The Core i3 3220 is clocked nearly 10% higher than a Core i3 2100 and has improved IPC. Even in SC2, the A10-5800K is around Phenom II X4 970 or 980 level, which is only slightly behind my Core i3 2100.
Add the fact that most reviews are using £300+ cards, and most users buying a £90 CPU are unlikely to get anything over £200 (more likely £150 and under), and I suspect the graphics card will be a bottleneck too.
This is why, with the IGP, an A10-5800K still beats even a Core i7 3770K with the highest-clocked HD4000 version - the CPU is important, but the GPU is still the main factor in most games.
Moreover, Frostbite 2, id Tech 5 and the upcoming UE4 engines are all multi-threaded, so the A10-5800K in the long term will be closer to Core i3 2100 level performance anyway.
So, if the FX4300 series has the same PD cores plus L3 cache, it should be quite competitive in performance with a Core i3 IMHO.
If you're spending £100 on a CPU then you're probably not going to spend > £200 on a GPU. What you may do is spend £100 on a GPU that can CrossFire with the GPU within your CPU - only an option with the AMD APUs. If the CrossFire works effectively (e.g., enabling 1920x1080 gaming at medium settings), and in non-gaming scenarios the discrete GPU powers down completely to save power, then you have a very nice option compared to a £100 Intel CPU + £150 GPU.
Unfortunately, that's not a tech AMD have invested much time in, but it's definitely one that would really benefit them now their APUs have sufficient CPU performance to be genuine options in a mid-range build with discrete graphics. Given the latest gen of cards have ZeroCore, which can drop the entire card to around 1W when it's not powering a display, they could easily provide the option of outputting the graphics from the APU and dropping the dedicated card into ZeroCore state. Perhaps it'll get picked up once they've stabilized everything on GCN with the next gen of APUs...