Not forgetting the A7 is a dual core; the T100 uses a quad-core Bay Trail.
It seems BF4 does not do badly on AMD CPUs:
http://gamegpu.ru/action-/-fps-/-tps...-test-gpu.html
And yet I still see people recommending an i3 for 'gaming' systems...
If you're running well-threaded workloads, anyway. BF4 is by far the best-threaded game ever released; it'll pretty much scale with however many cores you can throw at it. If it's a good indication of the way game engine design is going (and with both next-gen consoles containing 8 x64 cores, it may well be), then anyone who plumped for an FX 6- or 8-core CPU should be feeling pretty smug about now.
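As a rough illustration of why the parallel fraction matters so much here, a quick Amdahl's-law sketch in Python; the 0.9 and 0.5 parallel fractions are illustrative assumptions, not measured Frostbite figures:
Code:
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# where p is the fraction of the work that can run in parallel
# and n is the number of cores thrown at it.

def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 6, 8):
    # p = 0.9: a well-threaded engine keeps gaining past 4 cores
    # p = 0.5: a poorly threaded one has mostly flatlined by then
    print(f"{cores} cores: p=0.9 -> {speedup(0.9, cores):.2f}x, "
          f"p=0.5 -> {speedup(0.5, cores):.2f}x")
With p = 0.9 you still get a decent jump from 4 cores (3.1x) to 8 (4.7x), whereas at p = 0.5 eight cores barely beat two, which is roughly the difference between BF4 and older engines.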
That means Frostbite 3, CryENGINE 3 and id Tech 5 all run well on AMD CPUs. Unreal Engine 3 already uses up to 4 cores reasonably well, so it wouldn't be surprising if Unreal Engine 4 scales up to 8 threads and does well on AMD CPUs too. That would be four big engines doing well on AMD CPUs.
LOL, some of the people over on the Anandtech forums are obviously not happy about the results. Of course they dug up some results from pclab.pl, which has always had the most dubious results for AMD CPUs. I remember during the BF4 alpha or beta tests they quietly used 32-player maps instead of the 64-player maps other websites used, to make the Intel CPUs look better.
That is the problem I have with a cousin-in-law: he finds these annoying forums that say he has to get an i5, when the AMD options are cheaper and better considering his build is specifically for BF4. I hope some sense prevails; I won't be recommending the expensive i5, that's for sure.
But we're nearly at 2k posts.
It has to be some sort of record for Hexus!
BTW, I've seen some mods on other forums close long threads and reopen them as a "part X"; is there actually a reason behind that? I've seen performance mentioned, but no-one seems to have an explanation when asked.
Also, maybe the next thread should be named: AMD - Steamroller/Excavator + random processor chitchat.
The thing is, the reason people need to do such a thing is down to insecurity. Anybody normal would have no issue seeing an £85 CPU doing relatively well against a more expensive one, as the additional competition will drive prices down or keep them competitive in the long run. That's how I see it.
However, the other lot don't see it that way. They need to self-validate their own purchase, and they need as many people as possible to buy the same brand or model of CPU to make them feel better about it, since more people buying them is a sort of validation of their own choice. They bought into the "best" brand. This also means they need to hate any situation where a cheaper CPU from another brand does well, as it makes them feel deep down that their own purchase was less well informed or worse value for money. Basically, they feel their e-peen is threatened and their "geek credentials" are diminished. TBF, it is the same with many hobbies revolving around tech of some sort. The worst thing is that the noobish fools end up making everything more expensive for everyone in the process.
Yep
It kind of felt right. I only jumped because my hard drive failed, and I figured if Windows had to be reinstalled then now was the time to buy a new motherboard.
But if anything I am surprised at how well the old 955BE is doing; it's the CPU my new build replaced, and it's now doing sterling work in my son's PC.
This thread will outlive us all!
Shame that graph doesn't have the 6350, the CPU I nearly bought. Looking at the gap between the 8350 and the 9370, performance scales OK with clock speed. I guesstimate another 5 frames over the 6300, putting it just behind the 4670K (quick sketch of that maths below).
An APU result would have been very interesting too, to find out how much the L3 cache is helping.
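For what it's worth, here's the back-of-envelope version of that guesstimate in Python, assuming FPS scales linearly with base clock at the same core count (3.5GHz for the FX-6300, 3.9GHz for the FX-6350). The 6300's FPS figure below is a made-up placeholder, not a number from the graph:
Code:
# Linear clock scaling: estimate an FX-6350 result from an FX-6300 one.
# fx6300_fps is a placeholder; substitute the figure from the graph.

def scale_fps(fps_known, clock_known_ghz, clock_target_ghz):
    """Scale a measured FPS linearly with clock speed."""
    return fps_known * (clock_target_ghz / clock_known_ghz)

fx6300_fps = 45.0  # placeholder, not from the linked review
fx6350_est = scale_fps(fx6300_fps, 3.5, 3.9)
print(f"FX-6350 estimate: {fx6350_est:.1f} fps "
      f"(+{fx6350_est - fx6300_fps:.1f} over the 6300)")
Starting from a mid-40s result, that 3.5 -> 3.9GHz bump works out at roughly +5 fps, which is where the "another 5 frames" guess comes from.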
I'm not. There are 2 "gaming" desktops in my house. One houses a Phenom II 905e and a Radeon 7750; the other has (following a recent upgrade) a Q6600 and an HD 4850.
To be fair, neither of us game at high resolution or high settings, but we're both perfectly happy with our gaming experience on our aging hardware.
At least, I would be if I could be bothered to fire the desktop up, but since I got my new laptop with an A10-4600M I can't see the desktop getting used that often, tbh. Doubt I'll ever get rid of it, but it might end up relegated to work duties....
On another side note, Nvidia are back to their dirty tricks with the latest Batman game:
http://translate.google.com/translat...htm%23t8918976
The post is from the reviewer at Hardware.fr, it seems!
FYI, when questioned about the rather poor performance of the 290X in Batman AO at 4K, AMD referred us to WB's communications manager and sent us this:
Quote:
We are aware that the performance of AMD Radeon™ graphics cards in Batman: Arkham Origins is not up to user expectations. Despite our efforts to improve this situation within the timeframe offered by Warner Brothers, our efforts were refused. Further, the design of the game fundamentally and deliberately prohibits AMD's software engineers from collaborating with the developers to design performance optimizations that can be integrated into the game's codebase.
Sweclockers has tested the R9 290X in BF4 MP using the same map:
http://translate.google.com/translat...2F4%23pagehead
Interestingly, at Ultra settings at 2560x1440 and 1920x1080, the FX-8350 is within 10% of the Core i5 4670K. However, if you drop the settings down to Medium, the CPU loading profile appears to change and looks less multi-threaded.