No Athlon X4 760K in the results? Disappointing, since that's actually the nearest genuine rival from a general-purpose PC point of view (and it also has an unlocked multiplier for ease of overclocking). I see no reason you'd run an overclocked Pentium Anniversary Edition without a discrete GPU, which makes the 760K the perfect comparison.
But based on the other results, it looks like even a heavily overclocked Pentium would only just keep up with a stock 760K. It's a decent chip, sure, but I'd still rather have seen an unlocked i3 personally.
Amazed that a dual-core chip does that well in gaming; shame on the devs for not implementing proper multithreading.
I saw Scan did some BF4 benchmarks and the dual-core Pentium chip performed the same as a 4770K. Makes me wonder what the other six threads that get hammered when I play BF4 are actually doing.
Two possibilities:
One is a DX11 issue - it does a lot of rendering and drawing work in a single thread, which essentially bottlenecks the rest of the render. So you need a certain level of single-threaded performance to push a certain framerate.
The second, and far more likely scenario given the lack of performance increase from overclocking the Pentium, is that Hexus' game settings are making the 750 Ti the bottleneck. Seriously, a 35% overclock garners a 1% - 3% framerate increase? I'm sorry, but your bottleneck is in another component, Mario.
Just goes to show that your CPU doesn't matter that much if you're running your GPU on the ragged edge... which again makes me question the lack of the 760K in the benchmarks (or indeed a dual-core APU: an A6-6400K would've been interesting...)
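To put rough numbers on why a 35% overclock can yield only a 1 - 3% gain, here's a minimal sketch (all frame timings are made up purely for illustration) that models frame time as the slower of the CPU and GPU stages:

```cpp
#include <algorithm>
#include <cstdio>

// Crude pipelined-frame model: the frame is gated by whichever stage
// (CPU or GPU) takes longer. All timings here are made-up examples.
double fps(double cpu_ms, double gpu_ms) {
    return 1000.0 / std::max(cpu_ms, gpu_ms);
}

int main() {
    const double gpu_ms = 16.0;       // hypothetical 750 Ti cost per frame
    const double cpu_ms_stock = 14.0; // hypothetical CPU cost per frame

    // A 35% overclock roughly divides CPU time per frame by 1.35.
    const double cpu_ms_oc = cpu_ms_stock / 1.35;

    std::printf("stock: %.1f fps\n", fps(cpu_ms_stock, gpu_ms)); // 62.5 fps
    std::printf("oc'd:  %.1f fps\n", fps(cpu_ms_oc, gpu_ms));    // still 62.5 fps
    return 0;
}
```

Once the GPU stage dominates, scaling the CPU time does nothing to the result, which is exactly the pattern in these benchmarks.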
Going to agree with scaryjim here. Those GTX 750 Ti results clearly indicate the GPU is the bottleneck, not the CPU, in those particular tests.
That may well be the point the Hexus team are trying to make, of course - if you want to game at 1080p on a £100 graphics card you don't need a £150 CPU; a £50 one will do just as well. But if that *is* the point, then it would've been nice to see the two closest competitors from AMD (the slightly cheaper A6-6400K and slightly more expensive Athlon X4 760K) tested with the same GPU, games, and settings, both at stock and overclocked (since both those AMD chips have unlocked multipliers). I understand time is limited for testing and writing reviews, but this one seems to raise more questions than it answers...!
Of course, if time is the issue I'll happily accept a shipment of suitable components to test them myself...
Just to add to the 'is it the CPU or the GPU bottlenecking?' discussion, here are some results from a 2.8GHz Phenom II X4 920, a five-year-old chip.
Fire Strike: 3973 - http://www.3dmark.com/3dm/3048268? (default settings)
More stuff tested via the link in my signature.
So if a £250 Intel chip only gets 149 more marks than my old, well-used AMD, then yeah, I agree that the GPU is the bottleneck.
But I'd also say: if you want to get a mid-range graphics card like the 750 Ti or the AMD equivalent and you've got an old PC, then there is no point upgrading anything else in your computer (except an SSD, of course).
Would have liked to see BF4 MP gaming tbh - the dual core would have cried...
Oh, and the offer of an AMD FX-9590 + board still stands, if it's needed.
I'm frustrated with the chip and the review. Why do Intel insist on bundling a cruddy GPU that is no use to man nor beast with every single CPU?
Why didn't Hexus pair the Pentium with a range of graphics cards? A low-, mid- and high-end set of benchmarks would have suited the people looking to buy this, as its major attraction has to be budget gaming. It was never going to be a cheap computing powerhouse with only two cores. Why such a cruddy GPU, Intel? Just why...
Sadly, it's almost impossible to generate repeatable benchmark results from multiplayer gaming. Someone needs to try to work out a way to do it though (locally hosted bot match or something?), because it's a big part of the game experience that current reviews can tell you nothing about...
Because it's a CPU review, not a graphics review. There's a hard limit to how much time Hexus have to benchmark systems, after all. If they ran all the tests that every reader wanted to see, they'd be benchmarking every chip for a month.
I don't know if you mean Intel GPUs in general or the one in this specific chip? As far as the Pentium goes, it's because they've literally taken a Pentium die and unlocked the multiplier in software. As such it's stuck with the cut-price silicon that goes into every Pentium, and that means it gets the suckiest implementation of HD Graphics outside of an Atom. I suppose the reason you get the GPU at all is that it costs Intel nothing to leave it enabled, so why not...
Gah!
It's not an issue with the devs and 'proper multithreading'. Most games simply do not lend themselves well to a large number of threads. In most situations a smaller number of threads with more power is a better solution. This isn't an issue that can easily be solved either, as it's just down to the nature of what you're trying to calculate. See thread synchronization: https://wiki.cc.gatech.edu/multicore...ynchronization (and that's one of many issues).
If you want to see 'perfect' utilization of your threads/cores, then you need data that is decoupled, with no dependencies between work items - for example, encoding a separate audio/data stream on each core.
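A toy illustration of that difference (the workloads and names are entirely made up): chunks that each thread owns outright scale cleanly across cores, while updates to shared game state have to queue behind a lock and end up mostly serialized:

```cpp
#include <cstddef>
#include <mutex>
#include <thread>
#include <vector>

// Independent work: each thread owns its own slice of the data, so no
// synchronization is needed - the "encode one stream per core" case.
void encode_chunks(std::vector<int>& data, unsigned n_threads) {
    std::vector<std::thread> pool;
    const std::size_t chunk = data.size() / n_threads;
    for (unsigned t = 0; t < n_threads; ++t) {
        pool.emplace_back([&data, t, chunk] {
            for (std::size_t i = t * chunk; i < (t + 1) * chunk; ++i)
                data[i] *= 2; // stand-in for real per-sample encoding work
        });
    }
    for (auto& th : pool) th.join();
}

// Dependent work: every update touches shared state, so each thread
// must take the lock - the threads spend their time waiting on it.
struct GameState {
    std::mutex m;
    long score = 0;
};

void simulate(GameState& gs, unsigned n_threads, int steps) {
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < n_threads; ++t) {
        pool.emplace_back([&gs, steps] {
            for (int i = 0; i < steps; ++i) {
                std::lock_guard<std::mutex> lock(gs.m);
                gs.score += 1; // each step depends on the shared result
            }
        });
    }
    for (auto& th : pool) th.join();
}

int main() {
    std::vector<int> data(1 << 20, 1);
    encode_chunks(data, 4);  // scales with core count
    GameState gs;
    simulate(gs, 4, 100000); // mostly serialized by the mutex
    return 0;
}
```

Real game workloads sit somewhere between these two extremes, which is why throwing more threads at them gives diminishing returns.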
It can be done - Unreal can do it - but the work required when you implement it just for benchmarking is disproportionate for the most part. Devs just don't have that kind of free time normally.
It's all about result replication, which is quite difficult (impossible, really) if you give players freedom. The only way of doing it is to have pre-scripted events on the clients (with no player control at all), and a server which starts all the clients' events when needed. I actually coded something like it back in the UE2 days, but there are slightly better ways to go about it now.
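For the curious, a minimal sketch of that scripted-replay idea - every name here is hypothetical and the actual game hooks are stubbed out as comments. The server broadcasts a start tick and every client steps the same fixed-timestep event script, so each run simulates identical frames and only the hardware varies:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// One scripted input event, scheduled on a fixed simulation tick.
struct ScriptedEvent {
    uint32_t tick;
    int action; // e.g. 0 = move, 1 = fire; purely illustrative
};

// Deterministic replay: same script + same fixed timestep means every
// benchmark run simulates exactly the same frames on every client.
void run_benchmark(const std::vector<ScriptedEvent>& script,
                   uint32_t start_tick, uint32_t total_ticks) {
    std::size_t next = 0;
    for (uint32_t tick = start_tick; tick < start_tick + total_ticks; ++tick) {
        while (next < script.size() && script[next].tick == tick) {
            // apply_input(script[next].action); // hypothetical game hook
            ++next;
        }
        // step_simulation(1.0 / 60.0);          // fixed 60 Hz timestep
        // render_frame();                       // what you actually time
    }
    std::printf("replayed %u ticks, %zu events\n", total_ticks, next);
}

int main() {
    // In a real setup the server would send start_tick to all clients at once.
    std::vector<ScriptedEvent> script = {{10, 0}, {42, 1}, {42, 0}, {90, 1}};
    run_benchmark(script, /*start_tick=*/0, /*total_ticks=*/120);
    return 0;
}
```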
The problem is there are recent games which don't play well on a dual core.
Watch Dogs
http://static.techspot.com/articles-...nch/CPU_01.png
Thief
Test on a forum:
http://oi58.tinypic.com/2igf90z.jpg
The chap said the Pentium dual core was the least smooth of all the CPUs.
Test from a website:
http://pclab.pl/zdjecia/artykuly/cha..._cpu_g3420.png
http://pclab.pl/zdjecia/artykuly/cha...pu_i3_4330.png
Even with a huge overclock, I am not sure the Pentium dual core can hit the same framerates as the Haswell Core i3 CPUs in some of the newer titles, and that is not taking minimums or frametimes into account either.
In fact, one of the reasons I changed from a Core i3 to a Core i5 was actually Crysis 3. Some parts of the game can be punishing (as is the MP component).
I might have to see what people find when more testing is done of the Pentium G3258.
Edit!!
This would be an awesome PC for the budding WoT player, but I think with some of the newer games, you will see a Core i3 or FX6300 pulling ahead.
Yeah... but none of that really goes against what I've said. Some games are going to benefit from more cores, sure, but to blame lazy devs when this doesn't happen isn't usually fair. There is a heck of a lot more to it than that.
Although in that first Thief benchmark... almost level with an i7 there. Not bad for a cheaper two-core chip.
The last two... well, it's not just about cores. Different architecture, cache, core speeds, threads (2 vs 4), chip extensions, memory bandwidth, hyper-threading...
It's less about cores and more about chip quality.
Yes, but it seems HT is making more and more of a difference now.
The chap running Thief said the Pentium dual core was nowhere near as smooth as the other CPUs, and that is a UE3-based game with Mantle enabled.
Thief is not like earlier UE3-based games, which did not use HT very well:
http://gamegpu.ru/images/remote/http...test-intel.jpg
The G3420 and Core i3 4330 are both Haswell-based chips.
Unless you are really buying a rig to play WoT, or need a stop-gap CPU, I am not sure how well a dual core is going to fare in the next 6 to 18 months' time.
ATM most console games have the Xbox 360 and PS3 as lead platforms, but this is starting to move towards the newer consoles with their multi-threaded x86 chips.
Then you have things like Mantle and DX12, and all of the latest engines moving towards multi-threading.
Even taking all that out of the equation, look how much better something like a Q6600 or Q9550 has fared compared to an E8400. Or a Core i3 530 or Phenom II X4 compared to a Pentium G6950.
It just makes me wonder whether Intel is enticing people onto socket 1150 and making a calculated move (hence no Core i3 K series), so that eventually, when Pentium dual-core performance starts falling apart, they can sell shiny new Core i5s and Core i7s as drop-in upgrades and make people spend more money with them. It's not a bad way of locking people into a platform I suppose, as it appears Broadwell will probably be on socket 1150 for the K-series CPUs.
Yeah, HT works well when the issue of parallel calculations comes up. It's working 'better' these days due to HT being more mature and different types of calculations being done (compared to the older P4 gaming days).
Yep, but the microarchitecture only defines a chip's makeup at the silicon level and the tech it uses within. It's perfectly possible for something to be based on Haswell and yet be pretty different in terms of performance, depending on its makeup. Take laptop and desktop chips for example: same microarchitecture, sometimes almost the same specs, but quite a difference in performance.
Yeah, no disagreement there. But let's just keep price differences in mind.
The main issue I'm trying to highlight here, though, is that the two-core parts are usually low spec to start with. Comparing them to higher-core-count counterparts that are often much more powerful per core anyway... well, it's often not fair to then go and blame the devs.
AFAIK, all modern engines are multi-threaded. Certainly the ones I use are, anyway.
To be honest, Intel has had way too many SKUs for a while. Maybe it is calculated, but the number of choices when buying an Intel CPU at the moment is a little stupid IMO.
The Pentium G3420 and Core i3 4130 are nearly identical outside HT and a 200MHz clock speed difference:
http://ark.intel.com/products/77775/...Cache-3_20-GHz
http://ark.intel.com/products/77480/...Cache-3_40-GHz
They have the same amount of cache and support the same RAM speeds. Even instruction-extension support is the same, outside of AES-NI (which the Pentium lacks). The IGP is better on the Core i3, but that does nothing for CPU performance.
Regarding the G3258, I did notice it has worse memory support than the G3420 too, as it only supports 1333MHz RAM.
HT is making the difference here, and in a number of games it is making a massive difference.
The Core i3 4130 is not really more powerful per core when compared to the G3420, outside a meaningless 5% clock speed bump.
The price difference is not huge:
http://www.amazon.co.uk/Intel-Extend...ywords=i3+4130
http://www.amazon.co.uk/s/ref=nb_sb_...s%2Ck%3Afx6300
The FX6300 has even dipped down to close to £70 at times.
The problem is, all these stock-cooler overclocks are fine and dandy, but what are the temperatures like in the first place?
Will these be viable long-term?
Once you start adding the cost of a better cooler (even a cheap £15 one) and finding a compatible motherboard (some of the B85 boards can supposedly still overclock, which does help I suppose), the price difference is not massive, IMHO OFC.
The problem is highlighted again by games like Watch Dogs and Thief.
Add something like the more intense parts of games like Crysis 3, which are very multi-threaded, and you can see what is starting to happen.
Plus there's the other problem: the whole maximum-overclocks argument.
What if you get a dog of an overclocker?
My Q6600 was a crap overclocker. My mate's E5300 was crap too, despite the fact they were both meant to be great. Another mate had an Athlon II X3 which was lucky to go from 3.2GHz to 3.6GHz, yet another mate had a 4GHz one, and that was a very high overclock for such a chip.
I had a fantastic E4300 too.
I've even seen it with some of the IB and Haswell chips - people do get crap samples, but luckily the K-series Core i5 and Core i7 CPUs perform great at stock anyway.
Yes, but we can see it with things like Watch Dogs not really playing well with dual cores, or at least parts of Thief where even Mantle cannot really save the day (which is not surprising TBH).
This is not about SKUs. It's about history.
People have had this dual- vs quad-core/4-thread CPU argument for years now.
The E8400 did not last anywhere near as long as the Q6600, despite the fact the former had higher IPC and could overclock more:
http://hwbot.org/hardware/processor/..._e8400_3.0ghz/
http://hwbot.org/hardware/processor/..._q6600_2.4ghz/
The same was seen with the G6950 against the Core i3 530 and Phenom II X4 and Athlon II X3.
The dual cores never really had any legs, and these arguments have been had for years.
I rarely ever advocated the use of the E8400 or G6950 over the competitors.
Pentium dual cores have their uses for games like WoT and the like, which are very lightly threaded - there you'd get performance as good as an overclocked Core i5 out of a Pentium dual core. Maybe even SC2, for example.
I see instances where it can displace much more expensive CPUs.
Some people might think of this as a stop-gap CPU until they can afford a Core i5, which is another use.
Or something fun to play with.
However, I would be pushing people to spend that bit more and go for a Core i3 or FX6300 if they are looking at a broader range of games, especially for the kind of build where people won't swap the CPU out, which is quite common.