View Poll Results: What will you do over the next year?

Voters: 61
  • Stick with AMD 939, P4 775 or other socket type: 23 (37.70%)
  • Upgrade to the 1st gen Duo 2: 17 (27.87%)
  • Upgrade to 2nd Gen Duo 2 Santa Rosa: 14 (22.95%)
  • Upgrade to AM2: 7 (11.48%)

Thread: Core Duo 2 Conroe/Merom

  1. #17 · -ChEM- · Senior Member · Joined: May 2006 · Posts: 305 · Thanks: 0 · Thanked: 0 times in 0 posts
    Quote Originally Posted by aidanjt
    1. X1900XT?.. those things are still CPU-limited, especially in Crossfire mode.
    How much difference will you see moving up from a 3500+ to an FX-62 with an X1900XT in both? A lot less than you would see moving up from, say, an X800XT to an X1800XT instead, and which upgrade costs a hell of a lot more at current retail prices? Change the FX-62 to a 4600+ and it's still true (or at least it was before the latest price cuts, haven't looked)... maybe when you are talking about high-end dual GPUs you start getting decent returns from CPUs for the cost involved, but that just reinforces what I've said, especially as you have no way of talking about multithreaded games, except Quake 4 I think; go fetch some figures for that and I'll believe you... if you clock your memory up to DDR600 and get a 1% increase in framerate, does that mean that games are bandwidth-limited? Only if you want to argue like an ass

    Quote Originally Posted by aidanjt
    2. That day is already here; there are countless programs that need more CPU power, especially network-related and heavy-duty media work. Also, more and more programs are being written in scripted and/or inturprited [sic] languages; more crunching power means the overhead involved with JIT compilation is less of a concern.
    Good luck with your IT GCSE; that's a good list of CPU/memory-intensive stuff that most people will never see, let alone use. Try using that quote in the exam, but watch out: only 261 pages on the whole of Google contain text written by people capable of spelling 'inturprited' like that... if you are going to use professional software on a home computer, that's what you get; that's why companies spend millions on workstations and servers... how many people on here do you think actually know wtf JIT compilation is?

    Quote Originally Posted by aidanjt
    3. No they won't; why do you think they invented a PPU?.. You need to offload physics processing a) so they do a proper job with the physics calculations, and b) to free up CPU time so games developers can make the CPU do other tasks in engine development.. the Creative Labs X-Fi also has its own APU (audio processing unit), freeing up the CPU more.. all these things are needed because CPU ticks are like gold dust.
    What does that have to do with multithreading? If you don't know what the word means, don't start ranting about it... do you know how much difference an APU really makes compared to modern southbridge sound, except in terms of how much money it makes for Creative? Now, if you want me to care, tell me what's coming up in engine development that is going to challenge our multithreaded AMD X2s, other than simple upscaling over time, as all you've done is agree that PPUs will take load OFF the CPU... honestly, I would like to know, because it changes the viability of CPU upgrades completely, and I will gladly accept I am wrong; I don't pretend to be a games developer

    Quote Originally Posted by aidanjt
    4. GPUs aren't the limit; in fact most GPUs are so underused that even nVidia (the company that's behind in GPU performance atm) is proposing to add physics subsystems to its GPU drivers, and GPUs are even being used to accelerate MPEG-4 encoding.
    See point 1: they're not underused; that's why ATI wants it on a separate card... maybe the next gen of nVidia cards will be built to handle it, but for now they will slow down, I'm sorry, they just will... and what does MPEG-4 encoding have to do with the GPU limit? It's a spare function that will never be used at the same time as 3D processing... go and try to play a game on a non-SLI nVidia setup while you're encoding through the GPU and you will see how 'underused' the chip is... (btw, I am aware that all you have done is repeat nVidia marketing, in case you were trying to pretend to be clever)

    Quote Originally Posted by aidanjt
    5. GPUs are incredibly efficient, dedicated graphics processors.. in fact, I'd say GPUs outperform even Conroes in general mathematical calculations, and absolutely destroy them in graphical equations.
    Yes they do; that's why they made them. B+ and a gold star for trying hard... now you have misunderstood three successive points but still found the self-belief of the kid who's too stupid to realise he's stupid. FYI, temps and power consumption are going to take serious hits after the next gen of graphics cards, or at least they won't increase any more... requiring a £200 1kW PSU to work with your £300 GPU isn't good marketing, and neither is making a graphics card that melts itself whenever you play a game

    Quote Originally Posted by aidanjt
    ...
    I got bored..
    ...
    Go and do your homework

    or some JiT compiling

    Quote Originally Posted by aidanjt
    a) Conroes aren't expensive.
    God, try reading what I said.

    I WAS trolling for some pro-Conroe discussion as a response, but please, no more fanboi BS.

  2. #18 · rad · Member · Joined: May 2006 · Location: Australia · Posts: 54 · Thanks: 0 · Thanked: 0 times in 0 posts
    Quote Originally Posted by -ChEM-
    How much difference will you see moving up from a 3500+ to an FX-62 with an X1900XT in both? A lot less than you would see moving up from, say, an X800XT to an X1800XT instead, and which upgrade costs a hell of a lot more at current retail prices? Change the FX-62 to a 4600+ and it's still true (or at least it was before the latest price cuts, haven't looked)... maybe when you are talking about high-end dual GPUs you start getting decent returns from CPUs for the cost involved, but that just reinforces what I've said, especially as you have no way of talking about multithreaded games, except Quake 4 I think; go fetch some figures for that and I'll believe you... if you clock your memory up to DDR600 and get a 1% increase in framerate, does that mean that games are bandwidth-limited? Only if you want to argue like an ass
    Nothing could be more true. Well done, ChEM.
    It seems that people want to argue for the sake of arguing. There is not much intelligence around here apart from yours.

  3. #19 · Spud1 · Theoretical Element · Joined: Jul 2003 · Location: North West · Posts: 7,508 · Thanks: 336 · Thanked: 320 times in 255 posts
    Except that games already are multi-threaded... in fact it would be incredibly difficult to write a complex 3D game without threading. The difference is that they are not designed for dual-core systems, and most games will just ignore the second core. So it's about more than just writing multi-threaded applications: you have to write multi-threaded applications AND manage your threads in such a way that you can utilise both cores.
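    To make that concrete, here is a minimal sketch of the idea; the engine functions are hypothetical stand-ins, and portable C++ threads are used purely for illustration (a real engine would use the platform's own threading API):

    Code:
    #include <thread>

    // Hypothetical per-frame workloads; stand-ins for real engine code.
    void update_physics(double dt) { /* integrate rigid bodies, resolve collisions */ }
    void update_ai(double dt)      { /* pathfinding, decision logic */ }
    void render_frame()            { /* submit draw calls */ }

    int main() {
        const double dt = 1.0 / 60.0;
        for (int frame = 0; frame < 1000; ++frame) {
            // Hand the two CPU-heavy jobs to separate threads so a second
            // core can actually pick one up; on a single-core machine the
            // OS simply time-slices them, so nothing breaks.
            std::thread physics(update_physics, dt);
            std::thread ai(update_ai, dt);
            physics.join();
            ai.join();
            // Rendering stays on the main thread: most graphics APIs want
            // their calls issued from a single thread.
            render_frame();
        }
    }

    Spawning threads every frame is wasteful (a real engine keeps a pool of workers alive), but it shows the split: the work has to be carved up and handed over explicitly, or the second core just sits idle.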

    And btw, I think a large proportion of people on here will know what JiT is. JiT stuff really can be CPU-limited at the moment too: big Java apps run appallingly slowly on most configurations, and even .NET stuff (my current language set of choice) isn't lightning fast compared to a natively compiled, say, C++ app.

    That said, you are right that there are next to no home users who would use these applications, and in reality, while the above _could_ be true, any professional software that relies on good performance is just not going to be written in an interpreted language, is it =) So I've just argued away my own point, oh well

  4. #20 · darrensen · MacDaddy! · Joined: Apr 2005 · Location: Sussex · Posts: 1,695 · Thanks: 6 · Thanked: 43 times in 37 posts
    Quote Originally Posted by aidanjt
    Although probably some other new technology will rear its head and there'll be another new socket next year
    It's already been decided that Intel will change their socket next year with the Santa Rosa Duo 2.

    I'm not sure if I can justify upgrading to Duo 2. All I play is Counter-Strike: Source and Flight Simulator, and maybe BF2 when I get around to buying it. Currently I'm running an S939 3700 SD. I will need dual core at the end of the year in preparation for Vista.

    Now, I could go with an S939 4400+ X2, but I would like to start using DDR2 and not have to worry about changing my mobo later. Or I could go for AM2, but that means changing RAM, CPU and motherboard; and if I'm changing all that, I might as well go Duo 2.

    But since Intel are changing the socket for Duo 2 next year, I don't want to be buying a socket configuration that is being phased out.

    I'm sure a lot of us are in this situation. I don't want to fall behind with technology, but at the same time I don't want to be buying an out-of-date socket type for my mobo.

    EDIT: link found in another post, but for people viewing this thread:

    http://www.overclockers.co.uk/acatal...Core_Duo2.html

    That's an amazing price, and I'm sure Scan will do it cheaper than Overclockers!
    Last edited by darrensen; 23-06-2006 at 08:36 AM.

  5. #21 · DanceswithUnix · root Member · Joined: Jan 2006 · Location: In the middle of a core dump · Posts: 13,003 · Thanks: 780 · Thanked: 1,568 times in 1,325 posts
    Quote Originally Posted by Spud1
    And btw, I think a large proportion of people on here will know what JiT is. JiT stuff really can be CPU-limited at the moment too: big Java apps run appallingly slowly on most configurations, and even .NET stuff (my current language set of choice) isn't lightning fast compared to a natively compiled, say, C++ app.

    That said, you are right that there are next to no home users who would use these applications, and in reality, while the above _could_ be true, any professional software that relies on good performance is just not going to be written in an interpreted language, is it =) So I've just argued away my own point, oh well
    There are plenty of cases where JIT compilers manage better performance than natively compiled code. The reason is simple: a JIT compiler with hotspot optimisation has runtime information available to it that a static compiler can only dream of (even if you use a profiler). Of course, Java has other problems that normally throw that advantage away again...

    Now, if you can run that optimiser thread on its own core so it effectively becomes free, then even your single-threaded apps will go faster on dual core
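    A toy illustration of that idea (everything here is invented for the example; a real JVM is vastly more sophisticated): a generic routine records a runtime profile, and an 'optimiser' thread, which could sit on the second core, hot-swaps in a faster version once the profile is conclusive:

    Code:
    #include <atomic>
    #include <cstdio>
    #include <thread>

    static std::atomic<long> profile_hits{0};

    // Unspecialised version: does the work the slow way, but also gathers
    // the runtime profile that a static compiler never gets to see.
    long sum_generic(int n) {
        profile_hits++;
        long s = 0;
        for (int i = 0; i < n; ++i) s += i;
        return s;
    }

    // The version the optimiser swaps in: a closed form for the same sum.
    long sum_fast(int n) { return (long)n * (n - 1) / 2; }

    static std::atomic<long (*)(int)> sum_impl{sum_generic};

    int main() {
        // The optimiser runs on its own thread (ideally its own core), so
        // the application thread never pays for the analysis.
        std::thread optimiser([] {
            while (profile_hits.load() < 100000)
                std::this_thread::yield();
            sum_impl.store(sum_fast);        // hot-swap the implementation
        });

        long total = 0;
        for (long i = 0; i < 10000000; ++i)
            total += sum_impl.load()(100);   // every call goes via the pointer

        optimiser.join();
        std::printf("total = %ld\n", total);
    }

    Once the swap happens, the remaining calls run the cheap version, and the analysis cost was hidden on the other core; that is the 'effectively free' part.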

  6. #22 · aidanjt · Gentoo Ricer · Joined: Jan 2005 · Location: Galway · Posts: 11,048 · Thanks: 1,016 · Thanked: 944 times in 704 posts
    Quote Originally Posted by -ChEM-
    How much difference will you see moving up from a 3500+ to an FX-62 with an X1900XT in both? A lot less than you would see moving up from, say, an X800XT to an X1800XT instead, and which upgrade costs a hell of a lot more at current retail prices? Change the FX-62 to a 4600+ and it's still true (or at least it was before the latest price cuts, haven't looked)... maybe when you are talking about high-end dual GPUs you start getting decent returns from CPUs for the cost involved, but that just reinforces what I've said, especially as you have no way of talking about multithreaded games, except Quake 4 I think; go fetch some figures for that and I'll believe you... if you clock your memory up to DDR600 and get a 1% increase in framerate, does that mean that games are bandwidth-limited? Only if you want to argue like an ass
    Ramble, ramble, ramble.. a whole load of rambling and not a lot of thinking. Did you not read the part where I said "especially in Crossfire mode"?.. Multithreaded games don't make a damn bit of difference to performance when you have a uniprocessor, and games with spaghetti threading models share the same performance issues. You sure make a LOT of assumptions: you assume that a home user will run nothing but compiled C/C++ programs (even sloppy C/C++ code can run like crap on modern CPUs), and you assume they'll do nothing more CPU-intensive than play games.. Please.


    Quote Originally Posted by -ChEM-
    Good luck with your IT GCSE; that's a good list of CPU/memory-intensive stuff that most people will never see, let alone use. Try using that quote in the exam, but watch out: only 261 pages on the whole of Google contain text written by people capable of spelling 'inturprited' like that... if you are going to use professional software on a home computer, that's what you get; that's why companies spend millions on workstations and servers... how many people on here do you think actually know wtf JIT compilation is?
    I did my GCSEs when you were still suckling on your mother's nipple; don't make assumptions about people and pretend to be intelligent. A GCSE examiner isn't trained to deal with software development issues; coming across "JIT" in a paper would be the same as writing it in assembly. Oh, and my apologies for the typos in my post; next time I'll type my post out in Word, print it out, run around after people and get them to proof-read it.. On that point, your English is less than perfect, your grammar needs work and you don't know how to express yourself clearly. You can easily correct a typo mentally as you read; it's far more challenging to try to read something without structure or meaning.

    Quote Originally Posted by -ChEM-
    What does that have to do with multithreading? If you don't know what the word means, don't start ranting about it... do you know how much difference an APU really makes compared to modern southbridge sound, except in terms of how much money it makes for Creative? Now, if you want me to care, tell me what's coming up in engine development that is going to challenge our multithreaded AMD X2s, other than simple upscaling over time, as all you've done is agree that PPUs will take load OFF the CPU... honestly, I would like to know, because it changes the viability of CPU upgrades completely, and I will gladly accept I am wrong; I don't pretend to be a games developer
    I'm not a game developer either, but if you were any kind of developer you'd know what CPU ticks are, and you'd know what I'm getting at: by displacing complex equations from a general-purpose processor onto a dedicated processor you save massive amounts of ticks. Whether it be dual core, triple core, quad core, n cores.. it doesn't bloody matter; your 'musings' were based on the assumption that we're GPU-limited, and even if we assume people do nothing more than sit on their hole playing games all day, it's simply untrue. Some people need decently powerful GPUs for more productive purposes, bearing in mind that 2D overlay rendering is being phased out in the near future. And the X-Fi is a true APU; that means you could, in theory, generate a sound in realtime from a specific point in space, calculate its distortions and refractions as it travels towards the player, and have hundreds of such calculations going on at the same time.. that's the difference between it and an onboard ****chip that's been crapped out the back arse of Realtek.
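    For a feel of the kind of per-voice arithmetic involved (the numbers and scene are entirely made up, and a real APU does far more: reflections, occlusion, filtering), here is the back-of-an-envelope core of positional audio, the propagation delay and distance falloff for one source:

    Code:
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    float distance(Vec3 a, Vec3 b) {
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

    int main() {
        Vec3 listener{0.0f, 0.0f, 0.0f};
        Vec3 source{30.0f, 4.0f, 10.0f};        // an explosion ~32m away

        float d = distance(listener, source);
        float delay_ms = d / 343.0f * 1000.0f;  // speed of sound ~343 m/s
        float gain = 1.0f / (1.0f + d * d);     // inverse-square-style falloff

        std::printf("distance %.1f m, delay %.1f ms, gain %.5f\n",
                    d, delay_ms, gain);
    }

    Multiply that by hundreds of moving sources, add the filtering for distortions, and recompute it every few milliseconds: done on the CPU, those are exactly the ticks a dedicated APU gives back.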

    Quote Originally Posted by -ChEM-
    See point 1: they're not underused; that's why ATI wants it on a separate card... maybe the next gen of nVidia cards will be built to handle it, but for now they will slow down, I'm sorry, they just will... and what does MPEG-4 encoding have to do with the GPU limit? It's a spare function that will never be used at the same time as 3D processing... go and try to play a game on a non-SLI nVidia setup while you're encoding through the GPU and you will see how 'underused' the chip is... (btw, I am aware that all you have done is repeat nVidia marketing, in case you were trying to pretend to be clever)
    See point 1 also. Granted, pushing physics instructions onto a GPU when the CPU is already the limit is silly; I merely used it as an example, so rip the arse out of nVidia for doing something that stupid, not me. But it only serves to prove that GPUs do have idle ticks to spare, *ESPECIALLY* in SLI/Crossfire configurations with high-end cards, and when you go X1900 high-end Crossfire the GPUs are sitting around picking their noses waiting for the CPU to give them work, and that's even when the games are marked as 'Crossfire compatible'.



    Quote Originally Posted by -ChEM-
    Yes they do; that's why they made them. B+ and a gold star for trying hard... now you have misunderstood three successive points but still found the self-belief of the kid who's too stupid to realise he's stupid. FYI, temps and power consumption are going to take serious hits after the next gen of graphics cards, or at least they won't increase any more... requiring a £200 1kW PSU to work with your £300 GPU isn't good marketing, and neither is making a graphics card that melts itself whenever you play a game
    Yes, they're hot; yes, they use 130W of juice (an approximation). But bearing in mind they do the same job as several general-purpose processors, which combined would generate more heat and use more energy, that makes them efficient, even compared to the 35W that high-end Conroes use. The GPU heat problem isn't because they're inefficient; it's because the manufacturing process isn't as refined as a CPU's (larger transistors require more energy to 'push' their gates and thus generate more heat), and they sit in a tighter, more confined space, which makes adequate cooling solutions difficult. CPUs have always enjoyed hogging a good 30% of the motherboard's space for cooling; GPUs only get about 8%, and depending on the design of the card and case, hotspots can build up around them.

    Quote Originally Posted by -ChEM-
    God, try reading what I said.

    I WAS trolling for some pro-Conroe discussion as a response, but please, no more fanboi BS.
    I did read what you said; I also said I got bored of reading it. Your little a, b, c, d points were all based on the assumptions that a) Conroes are expensive, which they clearly are not: consider that you will be able to pick up a Core 2 Duo E6600 for about £250 which outperforms an AMD FX-62 priced at £730; it's a bloody bargain, and you'd be insane, and a fanboy, for *not* buying one if you're in the market for a system upgrade; and b) that we don't need any more CPU power, when clearly that is untrue also. And I'm no fanboy: when I purchase hardware, it's on the basis of my criteria, not marketing hacks, not what everyone else goes by, but what I *need*, and my needs are a) compatibility with a wide variety of operating systems (there are more operating systems than Windows, mkay nVidia?), b) stability, and c) performance. All in that order.

    Now, if you feel the need to reply, please make fewer assumptions and condescending remarks about my character. Keep on track with the technology and articulate your points more clearly, if you will. I'm sorry if you felt offended that I disagreed with your points, but there's no need to be rude about it. Mmkay?.. thanks

    *EDIT* Oh yeah, stay on topic also.
    Last edited by aidanjt; 23-06-2006 at 11:30 AM.
    Quote Originally Posted by Agent
    ...every time Creative bring out a new card range their advertising makes it sound like they have discovered a way to insert a thousand Chuck Norris super dwarfs in your ears...

  7. #23 · aidanjt · Gentoo Ricer · Joined: Jan 2005 · Location: Galway · Posts: 11,048 · Thanks: 1,016 · Thanked: 944 times in 704 posts
    Quote Originally Posted by DanceswithUnix
    There are plenty of cases where JIT compilers manage better performance than natively compiled code. The reason is simple: a JIT compiler with hotspot optimisation has runtime information available to it that a static compiler can only dream of (even if you use a profiler). Of course, Java has other problems that normally throw that advantage away again...

    Now, if you can run that optimiser thread on its own core so it effectively becomes free, then even your single-threaded apps will go faster on dual core
    Yeah, well-written Java code vs. piss-poor C++ code. Compilers optimise; they don't perform miracles

    I still think Java is a horrible language, and the GUI the JVM spits back at the user doesn't blend well with the rest of the system; it looks kinda old and tacky... I'm kinda liking C# myself for small or proof-of-concept stuff.

  8. #24 · DanceswithUnix · root Member · Joined: Jan 2006 · Location: In the middle of a core dump · Posts: 13,003 · Thanks: 780 · Thanked: 1,568 times in 1,325 posts
    Quote Originally Posted by -ChEM-
    some musings: ...quad-core chip (Kentsfield) have against a HT-based quad AMD (both are dual-dual-cores, not actual quads, hence linking and master/slave issues)?
    Out of a huge posting that I almost totally agree with, this bit stands out. According to the Inquirer website, AMD are looking at chip-scale integration of two dual cores into a single package. They are not always a totally reliable source, and I haven't seen that anywhere else.

    Given that the current AM2 chips are already shipping with a quad-core-capable crossbar on them, that AMD historically resisted bolting two single-core Opterons together when they went dual core, and their announcement of 4x4 "technology", I just don't see AMD going the multi-chip-package route.

    Other than that, thank you for an excellent, enjoyable and spot-on rant

  9. #25 · Scarlet Infidel · Senior Member · Joined: Mar 2005 · Posts: 1,117 · Thanks: 8 · Thanked: 10 times in 9 posts
    My 'built on a budget' system of a few years ago is grinding to a halt. I'm building a new system over the summer to be powerful and quiet, and to last me through uni with minimal upgrades.

    I have a little money to play around with for once, so I'm not cutting back too much, and am building a system based on a watercooled E6600.

  10. #26 · aidanjt · Gentoo Ricer · Joined: Jan 2005 · Location: Galway · Posts: 11,048 · Thanks: 1,016 · Thanked: 944 times in 704 posts
    Quote Originally Posted by Scarlet Infidel
    My 'built on a budget' system of a few years ago is grinding to a halt. I'm building a new system over the summer to be powerful and quiet, and to last me through uni with minimal upgrades.

    I have a little money to play around with for once, so I'm not cutting back too much, and am building a system based on a watercooled E6600.
    I'd hold onto that money, wait until the end of the year, and see why Intel is releasing a new socket design, unless it's burning a hole in your pocket

  11. #27 · sawyen · Senior Member · Joined: May 2005 · Location: Sheffield University · Posts: 3,658 · Thanks: 7 · Thanked: 22 times in 21 posts
    Quote Originally Posted by aidanjt
    ...I did my GCSEs when you were still suckling on your mother's nipple...
    Yikes.. Isn't that a little wee bit just squibbly tiny bite-size too rude?
    Me want Ultrabook


  12. #28 · DanceswithUnix · root Member · Joined: Jan 2006 · Location: In the middle of a core dump · Posts: 13,003 · Thanks: 780 · Thanked: 1,568 times in 1,325 posts
    Quote Originally Posted by aidanjt
    Yeah, well-written Java code vs. piss-poor C++ code. Compilers optimise; they don't perform miracles
    Sir, you seem in fine trollish form today!

    You seem to speak with inside knowledge not available to those of us who merely read the research material. Perhaps you wrote the C++ code?

  13. #29 · atmadden · Senior Member · Joined: Jan 2006 · Posts: 395 · Thanks: 2 · Thanked: 7 times in 7 posts
    Gonna stick with my Opty 165 for now and wait and see what's on the horizon.
    Core i7 860 @ 4ghz
    MSI P55 GD65 4gb Gskill Ripjaw 2xAsus 5770 1003/5600 Corsair HX620 psu http://trust.hexus.net/user_profile.php?user=10950

  14. #30 · Spud1 · Theoretical Element · Joined: Jul 2003 · Location: North West · Posts: 7,508 · Thanks: 336 · Thanked: 320 times in 255 posts
    Hmm, yes, it is true that *sometimes* JiT code *can* be faster than a natively compiled app, BUT that is mostly for smaller applications, many of which are not cross-platform anyway, negating part of the benefit that a language like Java or C# can offer you. They are suited to different purposes; you wouldn't write Quake 5 in Java or C#, now would you? Sure, you *could*, and it would be interesting... but hell, I wouldn't want to; for something like that I would definitely want my own memory management, for a start..
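    For instance (a deliberately stripped-down sketch, not production code): a fixed-size pool, so that short-lived game objects never touch the general-purpose heap or a garbage collector mid-frame:

    Code:
    #include <cassert>
    #include <cstddef>
    #include <vector>

    // Trivial fixed-size object pool with an intrusive free list.
    // alloc/release are O(1): no system calls, no GC pauses.
    class Pool {
        struct Node { Node* next; };
        std::vector<char> storage;
        Node* free_list = nullptr;
    public:
        // obj_size is assumed to be at least sizeof(void*) and a multiple
        // of the object's alignment; a real pool would enforce both.
        Pool(std::size_t obj_size, std::size_t count)
            : storage(obj_size * count) {
            assert(obj_size >= sizeof(Node));
            // Thread every slot onto the free list up front.
            for (std::size_t i = 0; i < count; ++i) {
                Node* n = reinterpret_cast<Node*>(&storage[i * obj_size]);
                n->next = free_list;
                free_list = n;
            }
        }
        void* alloc() {
            Node* n = free_list;
            if (n != nullptr) free_list = n->next;
            return n;               // nullptr when the pool is exhausted
        }
        void release(void* p) {
            Node* n = static_cast<Node*>(p);
            n->next = free_list;
            free_list = n;
        }
    };

    You decide up front how many particles/projectiles you can afford, allocation cost is constant, and nothing can fragment the heap halfway through a level; that is exactly the control you give up in a managed language.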

    But that's going way OT now =)

  15. #31 · philipbain · Member · Joined: Jun 2006 · Location: Derbyshire · Posts: 115 · Thanks: 0 · Thanked: 0 times in 0 posts
    I'm definitely upgrading within the next few months, as my current PC was thrown together on the (extremely) cheap after my old one was taken out by a power surge! I think I'm going to go with the Intel Core 2 Duo: I definitely want to go dual core, AMD's dual-core chips are overpriced, and Core 2 looks really promising.
    "There's nothing nice about Steve Jobs and there's nothing evil about Bill Gates" - Chuck Peddle, father of the 6502 and the Commodore PET

  16. #32 · DanceswithUnix · root Member · Joined: Jan 2006 · Location: In the middle of a core dump · Posts: 13,003 · Thanks: 780 · Thanked: 1,568 times in 1,325 posts
    I just realised that, as the dual-core variants with 2x1MB cache are disappearing from the AM2 Athlon 64 lineup, we will be at the mercy of the new Socket AM2 Opteron pricing for any upgrade. Hope those prices are good!

    In case you are wondering, I use the GCC compiler for a living, and it loves 1MB-cache CPUs, so yes, it does matter.
