Thread: Using Coolbits as OC Indicator?

  1. #1
    Senior Member
    Join Date
    Feb 2004
    Posts
    1,891
    Thanks
    218
    Thanked
    61 times in 53 posts
    • jonathan_phang's system
      • Motherboard:
      • Asus Rampage III Extreme
      • CPU:
      • i7 930 @ 4.2 ghz (200x21)
      • Memory:
      • 12GB Corsair XMS3 1600
      • Storage:
      • Crucial M4 128GB SSD + Misc Data Drive
      • Graphics card(s):
      • EVGA GTX 1080 FTW
      • PSU:
      • Corsair HX850 Modular
      • Case:
      • Antec 300
      • Operating System:
      • Windows 7 x64
      • Monitor(s):
      • Asus PB278Q (27" 2560x1440)
      • Internet:
      • Virgin Media 100mb

    Using Coolbits as OC Indicator?

    Hi all,

    Recently, my trusty ATI 9800 Pro died on me, so I decided it was time to upgrade. Eventually, I got a Galaxy GeForce 6600GT. I've currently been testing it at stock speeds (which I believe are 500/1000). This seems fine, so I just want to see how far I can push the card, as I heard a long time ago that 6600GTs tend to OC well.

    Usually, I use something like RivaTuner to do this, but since going back to Nvidia, I thought I would drag out Coolbits and try that instead, as it means one less program to use. However, using the Detect Optimal Frequencies option, it gets to about 560/1070. If I try to ramp it up to more than this then it fails the ‘Test new Settings’ option.

    What I would like to know is whether the Coolbits test is usually accurate, or whether I could push the card further with another program?

    Thanks

    JP

  2. #2
    Now with added sobriety Rave's Avatar
    Join Date
    Jul 2003
    Location
    SE London
    Posts
    9,948
    Thanks
    501
    Thanked
    399 times in 255 posts
    I think the auto overclocking programs are usually pretty conservative - you're best off overclocking the core and memory separately to find the limits of each. Start at 560/1000 and raise the core speed five MHz at a time until you get artifacts in 3D programs, then back off to the last stable setting. Then go to 500/1070 and do the same with the memory.

    Core errors usually appear as 'jaggies' and obvious polygon errors on the screen; memory errors cause little sparkly dots to appear on the picture.
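The per-axis procedure above (step one clock up while the other stays at stock, back off at the first failure) can be sketched as a simple linear search. This is only an illustration: `passes_stability_test` is a hypothetical stand-in for whatever check you actually run (eyeballing a 3D scene for artifacts, Coolbits' 'Test new settings', etc.) - the real clock changes happen in the driver tool, not in a script.

```python
STOCK_CORE, STOCK_MEM = 500, 1000  # 6600GT stock clocks from the thread

def passes_stability_test(core_mhz, mem_mhz):
    """Placeholder: return True if no artifacts appear at these clocks."""
    ...

def find_limit(start, step, test):
    """Raise a clock by `step` until `test` fails; return the last good value."""
    clock = start
    while test(clock + step):
        clock += step
    return clock

# Core first, with memory held at stock; then memory, with core at stock.
max_core = find_limit(560, 5, lambda c: passes_stability_test(c, STOCK_MEM))
max_mem = find_limit(1000, 10, lambda m: passes_stability_test(STOCK_CORE, m))
```

With the placeholder test the loops stop immediately; the point is the search shape, not the numbers.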

  3. #3
    Senior Member sawyen's Avatar
    Join Date
    May 2005
    Location
    Sheffield University
    Posts
    3,658
    Thanks
    7
    Thanked
    22 times in 21 posts
    • sawyen's system
      • Motherboard:
      • MSI Laptop motherboard
      • CPU:
      • Intel Core i7 740QM
      • Memory:
      • 8192MB DDR3
      • Storage:
      • 256GB SSD, 1TB WD
      • Graphics card(s):
      • AMD Mobility HD 5870
      • PSU:
      • MSI stuff
      • Case:
      • N/A
      • Operating System:
      • Win 7 64bit
      • Internet:
      • Virgin ADSL rubbish
    I wouldn't use Nvidia's flaky 'Detect Optimal Frequencies' setting. It's been known to produce really random and unreliable overclocks, and it's highly dependent on the ambient temperature - once it told me my card could be driven to 455MHz/1150MHz, then after lunch it gave me only 408/1100. And I've noticed the really 'nice' recommended speeds never work: I tried running DOOM 3 at 455MHz/1150MHz and the framerate began to throttle - white specks, messed-up rendering.

    It's best to try incremental settings and press 'Test new settings' to see if each one passes. Try raising the clocks 10MHz at a time; when you finally fail 'Test new settings', take the last known working frequency and run 3DMark05 for several loops to see if there are any rendering problems. If it works, you've found your card's limit; if it doesn't, lower the clock a little and run 3DMark again.
    Me want Ultrabook


  4. #4
    sneaks quietly away. schmunk's Avatar
    Join Date
    May 2004
    Location
    Wiki Wiki Wild West side... of Sussex
    Posts
    4,424
    Thanks
    40
    Thanked
    163 times in 121 posts
    • schmunk's system
      • Motherboard:
      • Abit NF7-S v2.0
      • CPU:
      • AMD Athlon-M 2500+
      • Memory:
      • 1GB of Corsair BH-5 and 512MB of something else
      • Storage:
      • 160GB Seagate Barracuda
      • Graphics card(s):
      • ATI Radeon X800Pro, flashed to XT
      • PSU:
      • Hiper Type-M ~400W
      • Case:
      • Antec cheapy
      • Monitor(s):
      • AG Neovo F19 LCD
      • Internet:
      • Virgin Media 4MB/s
    With my 6800LE, it would not let me do any overclocking with pipes/shaders unlocked. However, it was possible to find a nice o/c at standard settings and then unlock in Rivatuner, to give a perfectly useable o/c'd, unlocked card with no artifacts.

    I think it is to do with the temperature sensor in the card (hence the dependence on ambient). Therefore, open the case and point a desktop fan at the card whilst you're o/cing it, save those settings and you should find it still perfectly usable when back in your sealed case.
    Last edited by schmunk; 05-08-2005 at 02:44 PM.

  5. #5
    Senior Member
    Join Date
    Feb 2004
    Posts
    1,891
    Thanks
    218
    Thanked
    61 times in 53 posts
    • jonathan_phang's system
      • Motherboard:
      • Asus Rampage III Extreme
      • CPU:
      • i7 930 @ 4.2 ghz (200x21)
      • Memory:
      • 12GB Corsair XMS3 1600
      • Storage:
      • Crucial M4 128GB SSD + Misc Data Drive
      • Graphics card(s):
      • EVGA GTX 1080 FTW
      • PSU:
      • Corsair HX850 Modular
      • Case:
      • Antec 300
      • Operating System:
      • Windows 7 x64
      • Monitor(s):
      • Asus PB278Q (27" 2560x1440)
      • Internet:
      • Virgin Media 100mb
    Cheers for the advice. I've done plenty of OCing - probably why my 9800 Pro died - but I thought the Coolbits auto-detect might be quite conservative. Shame it won't let you set it higher without testing, as the interface is nice.

    I guess I'll just reinstall RivaTuner then.
    Cheers

  6. #6
    Senior Member
    Join Date
    May 2004
    Location
    Rochester, NY
    Posts
    1,041
    Thanks
    4
    Thanked
    8 times in 8 posts
    • oralpain's system
      • Motherboard:
      • DFI "Blood Iron" P35-T2RL
      • CPU:
      • Intel Pentium E2140 @ 400x8 (3.2GHz), 1.375v
      • Memory:
      • Crucial Ballistix DDR2 800 CL4 @ 500MHz (DDR 1000), 4-4-4-12-T2, 2.3v
      • Storage:
      • 2x Seagate ST3250410AS
      • Graphics card(s):
      • NVIDIA 8800GTS (G92) 512 @ 783MHz core, 1836MHz shader, 1053Mhz memory, stock cooling 70% fan speed
      • PSU:
      • Seasonic SS-500GB
      • Case:
      • Antec P182, with some small modifications
      • Monitor(s):
      • ASUS VW222U
      • Internet:
      • Time Warner "Road Runner" Cable - 16 megabit downstream, 1 megabit upstream
    I have never seen a 100% stable OC that was better than the coolbits auto overclock settings.

    Generally I let the card get hot by running a windowed benchmark/tech demo (rthdribl, for example) with the fan speed forced to about 70%, then I close it and immediately run the auto overclock test in Coolbits. I then take the numbers it gives me and subtract a few MHz. That is usually a stable overclock.

    Personally, I would not call the coolbits test conservative.
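The heat-soak approach above boils down to a simple rule: warm the card up, take whatever the auto-detect reports, and knock a few MHz off as a safety margin. A minimal sketch, where `stable_clocks` and the 10MHz margin are my own illustrative assumptions (the post only says "a few MHz"):

```python
SAFETY_MARGIN_MHZ = 10  # assumption: a small buffer below the detected peak

def stable_clocks(detected_core, detected_mem, margin=SAFETY_MARGIN_MHZ):
    """Return clocks a little below what auto-detect reported while hot."""
    return detected_core - margin, detected_mem - margin

# e.g. if Coolbits reports 560/1070 after the card has been heat-soaked:
core, mem = stable_clocks(560, 1070)  # → (550, 1060)
```

Detecting while hot matters because, as noted earlier in the thread, the auto-detect result depends on temperature; a figure taken on a cold card will be optimistic.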

  7. #7
    Senior Member
    Join Date
    Feb 2004
    Posts
    1,891
    Thanks
    218
    Thanked
    61 times in 53 posts
    • jonathan_phang's system
      • Motherboard:
      • Asus Rampage III Extreme
      • CPU:
      • i7 930 @ 4.2 ghz (200x21)
      • Memory:
      • 12GB Corsair XMS3 1600
      • Storage:
      • Crucial M4 128GB SSD + Misc Data Drive
      • Graphics card(s):
      • EVGA GTX 1080 FTW
      • PSU:
      • Corsair HX850 Modular
      • Case:
      • Antec 300
      • Operating System:
      • Windows 7 x64
      • Monitor(s):
      • Asus PB278Q (27" 2560x1440)
      • Internet:
      • Virgin Media 100mb
    Cheers for that - I think the key is, as stated in the last post, to let something run, or just play for a bit, until the GPU is "warmed up" so to speak. TBH, I might not even OC: the card runs quite well already and only gives a few more FPS in things like Doom, which I guess I can live without - for now!
