Does AMD's Dual Graphics' promise stack up?
Interesting, so it does not work at all in Batman, but works pretty well in AMD's tame title DiRT Showdown.
As the conclusion says, even a modest gamer using a 7850 is probably better off with an i3.
Personally I'm not convinced that there is that much money to be made by targeting the most miserly of PC gamers.
Most PC gamers tend to overspec their machines rather than underspec them.
If you can't afford an extra £100 on your system for a decent GPU and CPU, you shouldn't be buying a new system in the first place (unless you are totally system-less). Therefore I cannot see the point of ever buying a "performance APU" for a desktop system, as you should buy a discrete GPU if you are going to game.
Slimline systems, all-in-ones, mini-systems, laptops, etc - yes, there is a market. And that's why AMD released Trinity months ago in the mobile lower-power format.
Someone make a cute case that looks like a Minecraft block, stick a mini-ITX board with Trinity inside, sell it for a reasonable price, and I'll probably buy it. But I'm not buying a full ATX board and then only using a Trinity, or trying to gang it up with a cheap-ass graphics card to get a few more FPS.
OTOH Tech Report's value for money chart actually does show Trinity is pretty good value at the price point overall, and that cheaping out even further on the CPU (especially going below an i3) is very stupid.
You were surprised Dual Graphics didn't "work" in Batman? Really? Everyone knows Dual Graphics (Hybrid CrossFireX) doesn't work with DX9 games.
Here is another test of dual graphics:
With an HD 6670 GDDR5 it can hit around GT 640 or HD 7750 level in the games it actually works in, and in many it does not work ATM. Luckily one of those titles is BF3, where it just passes an HD 7750 GDDR5. The IGP alone is around the speed of an HD 6670 GDDR3 or GT 630 GDDR3 when using 1600MHz DDR3. With an overclock and faster DDR3 RAM you are probably looking at HD 6570 GDDR5 or HD 5670 GDDR5 level. Anyway, the IGP also has other non-gaming uses, which helps its case for inclusion!!
Last edited by CAT-THE-FIFTH; 04-10-2012 at 12:17 PM.
Go!Go! Gadget Underpants!
I wrote a post earlier today but it did not get posted, probably a problem with account activation or something. Here is the super short version of it:
I have an A8-3850 on an ECS A75F-A with a Xigmatek Aegir cooler, and my RAM is G.Skill PI Series 2400MHz 9-11-9-28.
I found out through weeks of testing that to get the most out of dual graphics you need to know this:
1. Overclock your IGP to 800MHz (HD 6670 levels). I did it by raising the base clock from 100 to 133 and dropping the CPU multiplier from x29 down to x26 (2900 -> ~3460MHz), which is where I can get my RAM to run at x16. So you get 133 x 16 = 2128MHz and timings down to 8-11-8-28. The IGD clock goes to 133 x 6 = 800-ish. The 133 is important because SATA has its little "jumps" and its sweet spot is just over 133.
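The clock arithmetic in step 1 can be sketched like this (all values taken from the setup described above; a toy calculation to show where the numbers come from, not a tuning tool):

```python
# Base-clock overclocking arithmetic for the A8-3850 setup described above.
# Every final clock is simply base clock x multiplier.
base_clock = 133       # MHz, raised from the stock 100
cpu_mult = 26          # lowered from the stock x29 to keep the CPU in check
ram_mult = 16          # memory ratio
igd_mult = 6           # integrated GPU ratio

cpu_clock = base_clock * cpu_mult   # MHz, vs 2900 at stock
ram_clock = base_clock * ram_mult   # MHz effective
igd_clock = base_clock * igd_mult   # MHz, the "800-ish" target

print(cpu_clock, ram_clock, igd_clock)  # 3458 2128 798
```

Raising the base clock lifts every domain at once, which is why the CPU multiplier has to come down to compensate.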
2. AMD has not published in-depth info anywhere on how dual graphics works, e.g. how memory resources are shared between the IGD and the HD 6670, so I tested it. The discrete HD 6670's 1GB of GDDR5 (4,000MHz effective) gets clocked down to match your system RAM speed, so you don't get any boost from having a GDDR5 card, since your RAM is probably sub-1800/1600MHz. So get the GDDR5 one only if your RAM is high speed. I would go with the Asus HD6670DIS, since it has a slightly higher clock of 810/820MHz and is robust, better quality and "dustproof"?!?!
3. AMD said to stick with sub-1866 RAM speeds and get tighter timings. This is also not true; I get the most FPS when I run my RAM at 2240MHz (140 base clock x16).
4. Never go with dual graphics if you don't play modern games that support it, or ones that are optimised and DX11 (World of Tanks hates dual graphics and dual-core CPUs, and this is where you suffer the most with DG).
5. Never go dual graphics with any HD 6670 if you can't overclock your IGD to 800-ish; you'll get close to no improvement and it'll suck 80W for nothing...
If your motherboard can't overclock the IGD separately, AND your cooler is stock or not that good, then go with this setup: base clock to 133, CPU multiplier to x22 (= ~2930MHz, about what it was at stock), and you'll get your RAM and IGD overclocked 33%, which is what you are aiming for. Never go with sub-1866 RAM, as the price is almost the same.
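That fallback setup works out like this (a quick sketch of the numbers above; values are the ones quoted in the post):

```python
# Fallback when the IGD can't be overclocked separately: raise the base
# clock by a third and drop the CPU multiplier so the CPU stays near stock,
# while RAM and IGD (tied to the base clock) pick up the whole uplift.
stock_base, oc_base = 100, 133
stock_cpu_mult, oc_cpu_mult = 29, 22

stock_cpu = stock_base * stock_cpu_mult   # 2900 MHz at stock
oc_cpu = oc_base * oc_cpu_mult            # ~stock CPU speed after the OC
ram_igd_gain = (oc_base - stock_base) / stock_base

print(oc_cpu, f"{ram_igd_gain:.0%}")  # 2926 33%
```

The point of the x22 multiplier is that the CPU ends up effectively unchanged, so a stock cooler can still cope while the memory and IGD get the full 33%.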
Now what I would like to see is someone do the same thing with an A10-5800 as I did on Llano. Get the ECS A85 Deluxe board: it's cheap, reliable and easy to overclock, and most of all it supports crazy RAM speeds. You will be rewarded with a bonus of +30fps in every game where you would normally have 30-ish FPS. Don't go for 8 or 16GB of RAM, it's mostly useless, and 32-bit Win7 supports up to 16GB of RAM with a patched kernel. I have 4GB, of which 512MB goes to the IGD; in every game I play I get above-average FPS, and I have never dropped below 512MB of free RAM no matter how hard I multitask.
Hope someone posts results with the ECS A85 and an A10-5800 with the Asus HD 6670, since I find that to be the very best combo you can get, and a pretty cheap one. Aaaaand don't give me that "I don't want ATX" stuff; this is not performance-system talk. This is AMD as I remember it: innovative, "cheap" and very effective. You can always contact me about voltages and overclocking help, and I will provide screenshots and benchmarks if requested. Love, an AMD fanboy
Edit: here are some images; you can scroll through them, just edit the link: imageshack"DOT"us/photo/my-images/823/cachememv.png/
Last edited by beganovicc; 04-10-2012 at 12:53 PM.
You'd be better off buying faster memory and overclocking the IGP instead of buying any card up to, say, the 7750. I've started selling these, and the entry point for discrete graphics is the 7750.
Also, CrossFire/SLI just sucks horribly in too many cases. Take Batman here, for example: I believe that because it is an Nvidia title, AMD doesn't even attempt to fix the CrossFire drivers for it.
Yup, I second that.
"your ram budget"+100$(hd6670 price) = meeehh in many titles + you need better PSU
"your ram budget+50$" +50$cpu cooler = better all around windows experiance and more FPS in ALL games and you can power it with some generic/stock/ComesWihChasis PSU + you can always drop in the hd6670 without selling/changing anything else and you will get the best out of Trinity
AMD needs to work on their dual graphics, because at the moment the only reason to buy one of these APUs is if you're not going to add a discrete card. For everything else there's Intel, unfortunately. The only other possible reason would be if Lucid's MVP works wonders on these chips, so APU + MVP + discrete card would be the set-up. Will this be tested? I've struggled to find tests of MVP; I know it has no tangible benefit for my 560 Ti + 3570K, but maybe that's because the HD 4000 is cr@p.
Last edited by MustardCutter; 04-10-2012 at 02:04 PM.
Yes, it would be a great concept if it could be shown to work well.
A great strength of the PC is the ability to upgrade your machine. Trinity doesn't really let you upgrade to it, and doesn't really let you upgrade from it.
What AMD really needs to do, and what they will be doing in the 3rd gen of APUs, is follow through on their Fusion idea and make the next-gen APU's IGP and CPU use the same resources and share cache. It would also be great if they could make dual graphics use only the discrete card's memory. There are a few cards like the HD 6670 with 2GB of GDDR3, and they almost never need more than 512MB. If they made an HD 6670 with 2GB of GDDR5 (which they could; they could name it the HD6670DGO, "Dual Graphics Optimised") and made the next-gen APU able to use ONLY the discrete card's memory without touching system RAM, there would be no need to downclock the GDDR5 to match the system RAM used by the IGD. That would guarantee total ownage of everything but the high-end system market.
Another crazy concept I have in my head is for AMD to license Rambus's notorious XDR2, which is much faster than GDDR5 will ever be, integrate a controller into the 3rd-gen APU, and ask motherboard manufacturers to add a separate slot or two of XDR2 just for IGD use.
Oooooooor completely integrate the XDR2 RAM system into their next-gen APUs, bypassing DDR3 and DDR4 and using XDR2 as a complete substitute for DDR... I'm curious whether they have thought of this, and how much market share it would give them. Imagine the possibilities: a highly overclockable APU, with the IGD sharing next-level cache with the CPU and fed by XDR2. You could overclock it extremely high, since you'd be cooling it with a proper cooler/watercooler....
I got my A8-3850's IGD overclocked by 40%; imagine a next-gen IGD beast overclocked 80% and fed with XDR2...
Last edited by beganovicc; 04-10-2012 at 02:39 PM.
Considering how much the iGPU performance of the APU is actually affected by the memory speed, I would love to see new benchmarks with 2133 memory (since 2666 is too expensive for a system like this) and maybe an overclock comparison?
Interesting review though.
If only AMD were smart enough to make higher-end APUs that were also compatible with their high-end cards for a nice little bump in performance (after fixing that side effect, that is). It would move their entire GPU base over to their CPUs too, which could potentially shift the market dominance towards an equilibrium.
Thanks for the review, Tarinder.
I've been wanting more reviews, but there have been none. Thanks for that link too, CAT.
I think once the CrossFire starts working as it should, a Trinity APU with the HD 6670, which can be had for £25 second-hand on eBay, would be even better bang for buck.
I recently got an amp that supports bitstreaming for my HTPC. I was looking at a replacement CPU/motherboard or just a graphics card. Went for a 7750 in the end, but this came very close as an option. The Intel i3/HD 4000 doesn't even come close!
At least we're getting closer to desktop technology that can shut off the GPU when it's not in use. Neither this nor the other implementations I've seen so far achieve the "holy grail" (IMHO), though.
My main concern is having an always-on system that uses hardly any power, but which can ask for more power - and accept the resulting energy demands - for a short time when needed.
CPUs are getting better at dynamically shutting off cores, but their minimum power usage is still quite high.
I do think a time will come though when you can have a high end CPU and GPU in a system that uses <20w idle.