That's good thinking, I say
Despite Clunk's guides being based around P35 motherboards, the theory is entirely the same; just some of the BIOS setting names might be a little different.
I recommend his guide as he keeps emphasising keeping it safe, stable and working. Yes, it will take far longer to follow his route than others, but you will know that, at the end of it all, it will be very stable.
Yeah, I read after looking into the quads that they got a Q6600 up to 3.35GHz stable using an Asus Deluxe mobo, but any further and it started to become unstable; even so, it still generated quite a bit of heat.
Edit: found the link
bit-tech.net | Overclocking Intel's Core 2 Quad Q6600 - Introduction & Hardware Choices
The board and CPU finally stopped booting into Windows at around 3470MHz but we couldn’t get any kind of Windows stability at these frequencies. We dropped the CPU clock down to 3400MHz and our test suite ran just fine but unfortunately one of the Prime 95 instances failed after around two hours – the overclock wasn’t 100 percent stable. We finally settled on 3348MHz (372x9) and at this speed the CPU would run everything we threw at it
1680x1050 is not a super high resolution, but it's already a point where CPUs (of that calibre) would not really matter. Basically, gaming tests that show an improvement of quad core over dual core tend to be at 800x600, even in the latest games. By 1600x1200 there is no difference. I suspect that you may still gain at 1024x768/1280x960 without AA/AF, but probably not above. Anyway, I don't really think that you can go wrong either way. The way I see it, though, is that unless you multi-task between CPU-intensive applications (in which case quad core is definitely the way to go), you are trading performance today (where dual core edges ahead) for the performance gained from the extra cores once applications are optimised to the point where 4x 2.4GHz is faster than 2x 3.0GHz (if overclocking is not factored in). The latter is also conditional on you not being tempted by the next upgrade before that happens.
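To put a rough number on that trade-off, here's a back-of-envelope model (entirely my own toy assumption, not a benchmark): treat a frame's work as a serial part plus a perfectly parallel part, and see which chip finishes first.

```python
# Toy model: frame time = serial work / clock + parallel work / (cores * clock).
# The fractions and chip figures are the ones discussed above; everything
# else is an illustrative simplification, not a measured result.

def frame_time(parallel_fraction, cores, ghz):
    serial = (1 - parallel_fraction) / ghz
    parallel = parallel_fraction / (cores * ghz)
    return serial + parallel

for p in (0.4, 4 / 7, 0.8):
    quad = frame_time(p, 4, 2.4)  # 4x 2.4GHz
    dual = frame_time(p, 2, 3.0)  # 2x 3.0GHz
    verdict = "quad wins" if quad < dual else "dual wins or ties"
    print(f"{p:.2f} of the work parallel: {verdict}")
```

Under this model the break-even point is about 57% of the frame's work being parallelisable; below that the faster dual core wins, above it the quad pulls ahead. Obviously real games don't split this cleanly, so take it as a rough intuition only.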
I'll be honest though. I bought dual core when the single vs dual core debate was raging. I would probably buy quad core today despite my own arguments against it, because outside synthetic benchmarks I probably would not notice the difference today; and the idea of quad core is attractive.
It's all about where the bottlenecks for a given specific game are, tbh. If a future game is CPU-bound, then quad core will help. High powered graphics also need high-powered CPUs to feed them information, and benchmarks of the 8800 Ultra range have shown them becoming CPU-bound.
In terms of the complexity of the upgrade, changing the video card is easy, and the rate of graphics card progress is far ahead of the rate of CPU development.
Thanks for the link, Optical, and thanks Rosaline, for telling me Clunk's guide might still be of use after all (despite, in that very thread, them saying it wasn't for some reason).
Yeah, I think I'm going to go with the quad-core after all, mainly for two reasons: first, that's one of the last things I want to have to replace (aside from a mobo) until I have to; and secondly, any slight performance losses I may incur with software not coded for quad core are likely not to be too terribly noticeable (nothing like trying to run XP on a PII anyway... which is the system I'm using now *weep*). I suppose a third reason would be that if I pursue overclocking, the quad has more room to grow.
Well, unless anyone has anything else to say, about either processors or overclocking, I think this thing is pretty well settled... Thanks everyone for your help!
Now if I can just figure out which case to stick all this in... lol. CyberPower's choices are slightly limited.
@Rosaline: I did consider that, hence the "of those calibre" in brackets - going by this performance analysis, though, at 1600x1200 a 2.1GHz Core 2 Duo is pretty much all a GTS needs. I don't see the Ultra needing that much more. What I am interested to find out is how much graphics cards benefit from multiple cores. Is 4x2.4GHz better than 2x3GHz for that purpose?
That IS a good question... and makes me wonder if the new GT would utilise multiple cores better than previous cards might (or might not) have.
Of course, theoretically, using a physics card (in games that support them, like UT3) would take some strain off of the CPU... thereby increasing performance as well. In a way, it'd be like having an extra core dedicated solely to physics computations, easing the strain on the CPU from calculating everything for the graphics card. Or they could well be a waste of money. lol
You're right, TooNice, the GTS bottlenecks before the CPU at 1600x1200. The GTX, however, continues to see gains as the CPU speed is increased.
The GT will likely lie somewhere between the two figures, and an SLI GT setup would be above the GTX in CPU needs.
In terms of how the game works, the CPU determines position, physics (or at least initialises physics), which objects are visible, what the AI characters are doing, and then sends the graphical information off to the GPU to be displayed. To do this, the game must issue an interrupt and let the OS take charge for a moment to send the information across to the GPU, or must use some other similar technique that ensures that the GPU does not get mixed messages. I would hence presume that, at this point in time, only one core can really control the GPU at any given moment.
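A rough sketch of that sequence (all names made up purely for illustration) might look like this: the game logic runs first, and only then does a single, serialised submission step hand the finished frame to the GPU.

```python
# Hypothetical sketch of the loop described above. The point is the shape:
# AI, physics and visibility must all finish before the one submission
# step that talks to the GPU, and only one thread does that step.

def update_ai(state):
    # Decide what each character does this tick.
    return {"ai": state["tick"]}

def update_physics(state):
    # Integrate positions, resolve collisions.
    return {"physics": state["tick"]}

def cull_visible(state):
    # Work out which objects the camera can actually see.
    return {"visible": state["tick"]}

def submit_to_gpu(frame):
    # Stand-in for the interrupt/driver call that hands data to the GPU;
    # here it just reports how many pieces of frame data were submitted.
    return len(frame)

def run_frame(state):
    frame = {}
    frame.update(update_ai(state))      # game logic first...
    frame.update(update_physics(state))
    frame.update(cull_visible(state))
    return submit_to_gpu(frame)         # ...then one coherent hand-off

print(run_frame({"tick": 1}))
```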
As such, the raw speed of the display thread in the CPU directly affects the speed of display. If the game is simple and almost entirely graphics-based (rather than having complex AI and physics), then that is all there is to it. However, typical games have characters with complex AI, physics systems, and other scripted features. These must all be resolved before the display thread actually has the current data to display.
The effect of increasing the number of cores is directly dependent on how parallel the game state information is that the display thread is waiting upon. If the game state consists of the actions of hundreds of separate AI soldiers, for example, then adding many cores will significantly help speed this up. However, if the gameplay is dependent upon a highly linear problem (such as several thousand 'if... then... else' statements for a single AI alone), then only a single extra core would be of use (allowing the display thread to work on the last tick's game state whilst the AI thread works out the current one).
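A toy illustration of that contrast (names and numbers entirely invented): many independent soldiers map naturally onto a pool of workers, while one long dependent chain of decisions cannot be split no matter how many cores you have.

```python
# Hundreds of independent agents parallelise; one sequential decision
# chain does not. Purely illustrative - not from any real game engine.

from concurrent.futures import ThreadPoolExecutor

def soldier_ai(soldier_id):
    # Each soldier's decision depends only on its own inputs,
    # so these can all be evaluated on separate cores at once.
    return soldier_id % 3  # 0 = idle, 1 = move, 2 = fire

def boss_ai(depth):
    # One long if/then/else chain: every step needs the previous
    # result, so extra cores cannot help here at all.
    decision = 0
    for _ in range(depth):
        decision = 1 if decision == 0 else 0
    return decision

with ThreadPoolExecutor(max_workers=4) as pool:
    orders = list(pool.map(soldier_ai, range(100)))  # parallel-friendly

print(sum(orders), boss_ai(1000))  # boss_ai ran strictly sequentially
```

(In real engines the workers would be OS threads or a job system rather than a Python pool, but the structural point is the same.)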
Of course, programming theory states that almost every problem can be made into a parallel processing issue, and this is the key to why quad core is currently highly recommended and one of the reasons why Intel is pushing it so heavily. Games will only get more parallel with time.
So, in conclusion to the question of how graphics cards benefit from multiple cores, the answer is that it is entirely dependent upon the game in question. Older games will have been optimised for fewer cores at higher speeds, whilst newer games will increasingly lap up the additional parallel power of more cores.
I have a Q6600 which is clocked at 3.3GHz at the moment. My idle temps are in the low 30s, and at full load (Prime95) it goes to the high 50s.
3.0GHz was reached easily with no voltage increases, and temps were even lower. I hit 3.4GHz with more voltage, but was concerned as temps got into the mid 60s whilst running Prime95, so I backed off.
Paired with my 7800GTX everything is bottlenecked by the graphics card, but hopefully when my new 8800GT turns up I should be able to shed a bit more light on where the bottlenecks lie.
On a separate point, I chose the Q6600 because I reckon it'll come into its own in a year's time. If I had bought an E6850, I'm sure it would have clocked slightly higher, and I'm sure it would have had more potential in today's games. But let's face it, even a normal Core 2 Duo at about 2.4GHz is enough for today's games. You buy something faster so that next summer, when something tasty comes out, you aren't left behind with single-figure frame rates.
I have built a fair number of PCs, about once every 18 months on average, but as a family member ends up inheriting the old one each time, it has to be built for many years service!
Thus, unless you plan to build a new machine every year, get a Q6600 and overclock the cojones off it.
Why is it that, when the poster tells you he's not interested in overclocking, you still get a bunch of replies telling him otherwise?
It seems to me that a lot of people were led to believe that Crysis was going to be more CPU-intensive than most games. But looking around some reviews, this doesn't seem to be true; they are getting better results on duals rather than quads. (I'd be a bit pi55ed too, if I had just got myself a quad for Crysis.)
Makes you wonder if this was all a plan, to make people rush out and buy up all the old quads, therefore making way for the new up & coming quads.
For my money, get the E6750, relax in the thought that your PC is running Crysis as well as the quad would, but with the added satisfaction that it's running nice & cool, then put the extra £50 you saved on the quad towards the next gen of quads in 18 months' time.
Actually, that was a typo on my part - I really meant the GTX. Okay, strictly speaking the GTX does sorta scale beyond a 2.13GHz E6400. For another 800MHz and twice the cache, the X6800 provides, at 1600x1200, 6% in Far Cry, 2.3% in BF2, 4.7% in FEAR, nothing in Oblivion, 19% in HL2: Lost Coast, 3.7% in Company of Heroes, and 3.4% in Q4. I'd say that for today's games, a 2.2GHz Conroe will not really hold back a GTX.
HL2: Lost Coast does suggest otherwise, and if it is a good indication of future games, then the GTX may potentially benefit from a faster CPU. I've yet to see CPU scaling tests for Bioshock/Crysis, but Anandtech chose not to go above 1024x768 to demonstrate CPU scaling on Unreal Engine 3. I suspect that at least part of the reason is that any higher and there is a risk the game becomes more GPU-bound, invalidating the CPU test. Now, once you mention SLI, I do not doubt that you will definitely benefit from a faster CPU. Though frankly speaking, SLI is something I try to pretend does not exist - for my finances' sake.
Regarding your explanation of clock speed versus number of cores: yes, I understand that much, but I am curious to find out how it applies to today's cutting-edge games. The reason being, when dual core first came out, its benefit (for games) was at best what we see now between quad and dual. (Over) two years later, we can see a 60% improvement going from single to dual core (I am speaking from a purely gaming perspective). In that space of time, the Core 2 was released and rendered '1st gen dual core' comparatively obsolete... at a time when dual core was slowly starting to shine in games.

So looking back retrospectively, I do think that people who went from A64 to Conroe probably got more out of their CPU for the money than those going X2, if (I stress) looking solely at gaming performance. I am basically speculating that at this very moment, you will more often than not benefit more from a faster dual core than a quad core, and by the time this is no longer true (i.e. games become better optimised for more cores), a '2nd generation' quad that puts the Q6600 to shame will be available. Of course, I could be wrong - perhaps we will not see anything that'll beat the Core 2 in the same way it did the X2 within the next two years. Or perhaps it will take a lot less than two years for games to improve by 60% going from dual to quad.
How long have these low end Quads been cheaper? 6 weeks ish?
It's everywhere at the moment that the next gen quads are coming very, very soon, so my advice is this:
Either
a). Go C2D now and then go for a Quad when the price is indicative of the performance. I bet the low-end Quad could, by Christmas, be had for less than £120.
or
b). Wait for the next gen Quads, and either get a better 1st gen as the price will be a lot lower, or get a 2nd gen as the price will again be much better value.
How anyone can say the low-end Quad is 'good value' at £160-£170 is beyond me - it just isn't (unless you are encoding all the time whilst playing games and playing music).
The E6750 for a touch over £100 is the 'Sweetest' spot in the CPU market at the moment.
Actually, I am, especially if I pick up the quad (looks like it can easily be raised to the 2.8-3.0 GHz range, making it more or less match the 3.0 GHz duo except in FSB, where the difference is minimal).
When I said I wasn't interested in overclocking, I meant "not right away" (meaning I'd get the rig running and stable for a month or so before tinkering with anything). I also said this to compare real-world opinions on performance when overclocking *wasn't* a factor; because if it is, it seems to me that the obvious answer is: if the 3.0 GHz Duo and the 2.4 Quad are the same price, and the Quad can, without major hassle, likely be boosted to 3.0 GHz, then the Quad is the better buy. My thinking is that would give you most of the power of the Duo... and then some, once the software can make use of those extra two cores.

Clunk's guide seemed clear that you can boost a low-end C2D to the 3.0 GHz range relatively safely, so I assume the same theory would apply with the Quads. Granted, I would do it piecemeal (300-odd MHz at a time to check temps? Thank goodness I thought to get a temperature display to mount on my case...).

A third reason, I suppose, is that initially I thought it'd be too difficult... The first time I glanced at Clunk's guide a few months back, it sounded like too much of a headache; the second time it was much clearer (though I don't think I'm comfortable messing with RAM timings... I *think* just upping the CPU's clock speed will be enough for me...).
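For what it's worth, the arithmetic behind those targets works out like this (the Q6600's locked 9x multiplier and stock FSB are real; the step size is just my own illustration):

```python
# Core clock = FSB x multiplier. The Q6600's multiplier is locked at 9x,
# so all the overclocking headroom comes from raising the FSB.

multiplier = 9      # locked on the Q6600
stock_fsb = 266     # MHz; gives the stock ~2.4GHz
target_fsb = 333    # MHz; gives roughly 3.0GHz

print(stock_fsb * multiplier)   # ~2400 MHz at stock
print(target_fsb * multiplier)  # ~3000 MHz at the target

# Raising the core clock "300-odd MHz at a time" means bumping the FSB
# by roughly 300 / 9, i.e. about 33MHz per step, checking temps each time.
step_mhz = 300
print(round(step_mhz / multiplier))
```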
To Blitzen: normally I'd agree with your logic (I'm typically fairly frugal, except when it really is worth paying extra for quality), but upgrading the processor is something I don't want to have to do for a long, long time (a couple of years, maybe?). It's just the one upgrade of a PC I'm leery of doing myself (though changing mobos would be quite a hassle as well, I would think...). And I think for my needs, the Q6600 will work just fine for a while.
To EarlGrey: Those are similar numbers to what I'd hope to hit (roughly 3.0GHz, 3.2 at the absolute most; I'd rather play it safe). Plus, I hear the 8800GT cards get pretty hot themselves, so the less unnecessary ambient heat, the better. That, and I don't want to mess with changing voltages.