Page 1 of 3 123 LastLast
Results 1 to 16 of 37

Thread: Heat, power, noise comparison between X1900 & 7900?

  1. #1
    Member
    Join Date
    Aug 2005
    Posts
    113
    Thanks
    0
    Thanked
    0 times in 0 posts

    Heat, power, noise comparison between X1900 & 7900?

My thinking has turned around in the last six months. I now see the ATI X1900GT (512MB) as a better value than either the Nvidia 7900GT (256MB) or 7900GTX (512MB). The ATI has slightly better image quality, better future-proofing (for upcoming games likely to use more Shader Model 3 and dynamic branching), better AVIVO (video in/out) quality, and a substantially lower price. (Am I correct there?)

However, I still don't have much of a feel for the heat, power, and noise differences. I understand the Nvidia 7900 has better specs here, but I haven't seen any official comparisons yet. Some people say the ATI's noise is barely noticeable, except for five seconds when powering up, and when playing games to the hilt. Other people compare the ATI to a blow dryer (in noise and heat).

Have any labs published comparisons yet? Any thoughts? I'm interested in power requirements (the financial cost of running it) at idle and under full load (I presume that will also tell me the heat output). I'm also interested in the noise (at idle and under full load).

  2. #2
    not posting kempez's Avatar
    Join Date
    Aug 2005
    Location
    Basingstoke
    Posts
    3,204
    Thanks
    0
    Thanked
    0 times in 0 posts
    Check my project <<| Black3D |>>
    Quote Originally Posted by hexah
    Games are developed by teams of talented people and sometimes electronic arts

  3. #3
    Banned Smokey21's Avatar
    Join Date
    May 2005
    Location
    Stafford, Midlands
    Posts
    1,752
    Thanks
    0
    Thanked
    0 times in 0 posts
    I find the whole thing amazing.

The best example is that D.P at OCUK. He really pisses me off; he's such a ****ing Nvidia fanboy.

Before the 7900 came out, people were saying how it was going to be faster than the XTX, with 32 pipes and all that crap. When it turned out to be only on par at best, people started saying, well, at least it runs cooler and is quieter?

Surely it's for playing games? I don't quite understand the heat part. They still overclock like hell, and the cooler uses an exhaust system, so it keeps the case temp lower; I know it did when I went 7800 > X1800XT.

Who cares about how much power it uses? Nvidiots clutching at straws, methinks.

  4. #4
    Senior Member
    Join Date
    Feb 2005
    Location
    Liverpool
    Posts
    1,020
    Thanks
    34
    Thanked
    26 times in 20 posts
    • [DW]Cougho's system
      • Motherboard:
      • Asus Crosshair VI Hero
      • CPU:
      • AMD Ryzen 3600 @ 4.3 1900 FLCK
      • Memory:
      • 16GB Team Group DDR4 @ 3800 C16
      • Storage:
      • 512GB Samsung 870 EVO NVME & 1TB Samsung 850 Evo
      • Graphics card(s):
      • Gigabyte GTX1070 G1 Gaming
      • PSU:
      • Corsair AX760
      • Case:
      • Silverstone FT-05B
      • Operating System:
      • Windows 10
      • Monitor(s):
      • BenQ XL2730Z 1440p 144Hz
      • Internet:
      • BT Infinity 1
It's not even worth discussing running costs when comparing graphics cards. I would imagine the difference is about the same as having a power LED on your computer or not... aka zero.

Heat and noise, well, that may matter. For me the old Arctic coolers (for the 9800s etc.) are too loud on full, but most people say they're silent on full? It's all subjective. By the sound of it, both cards would be too noisy for me, but that doesn't bother me as I would likely put a third-party cooler on anyway. However, having read a bit around the net, it seems that the X1900XTs run so hot that practically no third-party HSF can deal with the load (maybe with the exception of the soon-to-be-released Zalman thingy); that, for me, could win the day for a 7900GTX even if it is slightly slower.

So yes, heat and noise do matter to some people. To me a silent system is worth a lot more than the typical 3-5 fps advantage the X1900XT enjoys. Although I must say I have heard neither, and I am no doubt going off other people's biased views.

  5. #5
    not posting kempez's Avatar
    Join Date
    Aug 2005
    Location
    Basingstoke
    Posts
    3,204
    Thanks
    0
    Thanked
    0 times in 0 posts
I will vouch for the quietness of the 7800GTX 512MB/7900GTX cooler. It's very quiet.

I will also say that the back of the PCB gets pretty hot on Nvidia cards, whereas the back of the PCB does not give off heat on the ATI card. Dunno if that helps at all, but there it is.
    Check my project <<| Black3D |>>
    Quote Originally Posted by hexah
    Games are developed by teams of talented people and sometimes electronic arts

  6. #6
    Out of the Loop
    Join Date
    Mar 2005
    Location
    Staffordshire
    Posts
    1,036
    Thanks
    140
    Thanked
    52 times in 42 posts
    • vrykyl's system
      • Motherboard:
      • Asus ROG X570 Strix-E
      • CPU:
      • Ryzen 3900 @ 4.5ghz 1.28v (Noctua DH15)
      • Memory:
      • 32gb (2x16gb) Crucial Ballistix 3200mhz @ 3800mhz 1.35v
      • Storage:
      • 1tb Corsair MP600 NVME, 256gb Samsung Evo, 4tb WD Red
      • Graphics card(s):
      • MSI RTX 3080 Ventus 3X OC 10gb
      • PSU:
      • Corsair AX 860w + White Braided Cables
      • Case:
      • Corsair 600T White Limited Edition (Soundproofed)
      • Operating System:
      • Windows 10 x64
      • Monitor(s):
      • Samsung 49" CRG9 Ultrawide 5120x1440 @ 120hz
      • Internet:
      • Plusnet 80mb fibre (80/20)
I have the advantage of picking based purely on performance... as I watercool my graphics cards, so noise/heat/power isn't an issue.

X1900XTX is the best card out atm imo... and it's cheaper than the Nvidia too.

  7. #7
    Member
    Join Date
    Aug 2005
    Posts
    113
    Thanks
    0
    Thanked
    0 times in 0 posts
Something about this data is weird. On this webpage, they say the X1900XTX draws at least 146 Watts at idle, and 315 Watts under full load. In other words, a Crossfire gaming rig would use at least 2x315=630 Watts just for the two graphics cards! You'd need a kilowatt power supply!

Those same folks tested these cards in a Crossfire system, but their power supply was only 600 Watts. It doesn't add up. What gives?

  8. #8
    Senior Member
    Join Date
    Feb 2006
    Location
    Cornwall
    Posts
    266
    Thanks
    3
    Thanked
    4 times in 4 posts
    • Couger's system
      • Motherboard:
      • Gigabyte GA-X58A-UD3R rev2.0
      • CPU:
      • Intel Core i7 930 @ 4.2 Ghz
      • Memory:
      • Corsair XMS3 6GB DDR3 1600 Mhz CAS 9
      • Storage:
      • OCZ 60gb vertex 2E SSD, WD raptor 150gb, WD Raptor 160gb, WD1001FAES 1TB, WD15EARS 1.5TB
      • Graphics card(s):
      • MSI ATI Radeon HD6970
      • PSU:
      • Corsair TX750
      • Case:
      • LanCool PC-K62
      • Operating System:
      • Windows 7 Ultimate
      • Monitor(s):
      • Dell 2007 FPW
      • Internet:
      • 8mb with plusnet
The heat and noise difference doesn't bother me; the X1900s run quiet and cool enough in a properly, efficiently cooled case.

    ATI all the way

Yeah, I'll agree they do chew monstrous power.
Whilst playing FEAR, ATI Tool displays that my card is using between 20 and 24 amps. That's insane, lol.
    Last edited by Couger; 18-03-2006 at 11:32 AM.

  9. #9
    Senior Member sawyen's Avatar
    Join Date
    May 2005
    Location
    Sheffield University
    Posts
    3,658
    Thanks
    7
    Thanked
    22 times in 21 posts
    • sawyen's system
      • Motherboard:
      • MSI Laptop motherboard
      • CPU:
      • Intel Core i7 740QM
      • Memory:
      • 8192MB DDR3
      • Storage:
      • 256GB SSD, 1TB WD
      • Graphics card(s):
      • AMD Mobility HD 5870
      • PSU:
      • MSI stuff
      • Case:
      • N/A
      • Operating System:
      • Win 7 64bit
      • Internet:
      • Virgin ADSL rubbish
    Quote Originally Posted by Artic_Kid
Something about this data is weird. On this webpage, they say the X1900XTX draws at least 146 Watts at idle, and 315 Watts under full load. In other words, a Crossfire gaming rig would use at least 2x315=630 Watts just for the two graphics cards! You'd need a kilowatt power supply!

Those same folks tested these cards in a Crossfire system, but their power supply was only 600 Watts. It doesn't add up. What gives?
Two cards don't necessarily draw double the power, as both cards aren't fully loaded during games. More often they're still CPU-limited. Unless AMD or Intel comes up with a processor that has 4x the current speed... maybe... perhaps then Crossfire and SLI would be running at full load.
    Me want Ultrabook


  10. #10
    Senior Member
    Join Date
    Mar 2005
    Posts
    4,942
    Thanks
    171
    Thanked
    386 times in 313 posts
    • badass's system
      • Motherboard:
      • ASUS P8Z77-m pro
      • CPU:
      • Core i5 3570K
      • Memory:
      • 32GB
      • Storage:
      • 1TB Samsung 850 EVO, 2TB WD Green
      • Graphics card(s):
      • Radeon RX 580
      • PSU:
      • Corsair HX520W
      • Case:
      • Silverstone SG02-F
      • Operating System:
      • Windows 10 X64
      • Monitor(s):
      • Del U2311, LG226WTQ
      • Internet:
      • 80/20 FTTC
    Quote Originally Posted by Artic_Kid
Something about this data is weird. On this webpage, they say the X1900XTX draws at least 146 Watts at idle, and 315 Watts under full load. In other words, a Crossfire gaming rig would use at least 2x315=630 Watts just for the two graphics cards! You'd need a kilowatt power supply!

Those same folks tested these cards in a Crossfire system, but their power supply was only 600 Watts. It doesn't add up. What gives?
    Slight flaw there. They are measuring the power draw of the entire system. The GFX card is using a fraction of the total power.
    "In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship."

  11. #11
    Member
    Join Date
    Aug 2005
    Posts
    113
    Thanks
    0
    Thanked
    0 times in 0 posts
    Quote Originally Posted by badass
    Slight flaw there. They are measuring the power draw of the entire system. The GFX card is using a fraction of the total power.
Eek! You're right. (I just checked their fine print.) Their entire system (including the X1900XTX, CPU, mobo, HDDs, and all the rest) uses about 146 Watts at idle and 315 Watts under full load. That makes much more sense. It suggests that the X1900XTX itself uses roughly 30 Watts at idle and 140 Watts under full load??? (That's a very rough estimate on my part.)
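
As a sanity check, here's the back-of-envelope arithmetic behind that estimate. The baseline figures for the rest of the system (CPU, mobo, drives) are my guesses, not measurements:

```python
# Estimate the graphics card's share of whole-system power draw
# by subtracting an assumed rest-of-system baseline.

SYSTEM_IDLE_W = 146   # whole-system draw at idle (from the review)
SYSTEM_LOAD_W = 315   # whole-system draw under full load (from the review)

# Assumed draw of everything *except* the graphics card.
# These are rough guesses for a 2006-era rig; the load baseline is
# higher because the CPU etc. also draw more while gaming.
BASELINE_IDLE_W = 116
BASELINE_LOAD_W = 175

card_idle = SYSTEM_IDLE_W - BASELINE_IDLE_W   # ~30 W
card_load = SYSTEM_LOAD_W - BASELINE_LOAD_W   # ~140 W

print(f"Estimated card draw: {card_idle} W idle, {card_load} W load")
```

With those assumed baselines the card comes out at about 30 W idle and 140 W load, which is what the rough estimate above lands on.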

    Does anyone have accurate power specs for both the X1900s and the 7900s? I'd like to compare the two. For example, under full load, if the 7900s use, say, 30 Watts less than the X1900s, then that's probably not enough of a difference to justify the substantially higher cost of the 7900s.

  12. #12
    Member
    Join Date
    Aug 2005
    Posts
    113
    Thanks
    0
    Thanked
    0 times in 0 posts
    Quote Originally Posted by Couger
Yeah, I'll agree they do chew monstrous power.
Whilst playing FEAR, ATI Tool displays that my card is using between 20 and 24 amps. That's insane, lol.
That indicates the ATI X1900XTX is using between 240 and 288 Watts! Yikes! (That's double what I estimated in my previous post.) With power figures like that, it's hard to explain two of these in a Crossfire system running on a mere 600 Watt power supply (as the reviewers did here). Their PSU must have been operating on the hairy edge of shutdown.
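
For reference, here's the amps-to-watts conversion behind that figure. It assumes the reading is current drawn from the 12 V rail; if ATI Tool is actually reporting current at the GPU core voltage (around 1.4 V), the wattage would be far lower, which might explain the discrepancy:

```python
# Convert ATI Tool's current reading to watts: P = I * V.
# The 12 V rail voltage is an assumption about what the tool reports.

RAIL_VOLTAGE = 12.0  # assumed

def watts(amps, volts=RAIL_VOLTAGE):
    return amps * volts

low, high = watts(20), watts(24)
print(f"{low:.0f}-{high:.0f} W")  # 240-288 W if the 12 V assumption holds
```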

  13. #13
    Senior Member sawyen's Avatar
    Join Date
    May 2005
    Location
    Sheffield University
    Posts
    3,658
    Thanks
    7
    Thanked
    22 times in 21 posts
    • sawyen's system
      • Motherboard:
      • MSI Laptop motherboard
      • CPU:
      • Intel Core i7 740QM
      • Memory:
      • 8192MB DDR3
      • Storage:
      • 256GB SSD, 1TB WD
      • Graphics card(s):
      • AMD Mobility HD 5870
      • PSU:
      • MSI stuff
      • Case:
      • N/A
      • Operating System:
      • Win 7 64bit
      • Internet:
      • Virgin ADSL rubbish
Like I said in my previous post... it's unlikely that both cards are drawing 100% load power...

Crossfire would have distributed a lot of the maths across both GPUs, so instead of one XTX running at 100%, you now get two cards running at 50%. Given the extra settings you throw at it, probably 75% of each card, with CPU limiting in most situations. If one card at 100% load chews 290-300W, 150% would only be around 450W.
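
That scaling argument, sketched numerically. Both the single-card figure and the assumption that power scales linearly with utilisation are illustrative, not measured:

```python
# Sketch of the argument: two cards at partial load draw less
# than 2x one card at full load, assuming power scales roughly
# linearly with utilisation (an approximation).

SINGLE_CARD_FULL_W = 300  # assumed full-load draw of one XTX

def pair_draw(util_per_card):
    """Combined draw of two cards, each at the given utilisation (0-1)."""
    return 2 * util_per_card * SINGLE_CARD_FULL_W

print(pair_draw(0.75))  # 450.0 -- the "150% of one card" figure above
```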

    Enuf said..
    Me want Ultrabook


  14. #14
    Member
    Join Date
    Aug 2005
    Posts
    113
    Thanks
    0
    Thanked
    0 times in 0 posts
    Quote Originally Posted by sawyen
... it's unlikely that both cards are drawing 100% load power... Crossfire would have distributed a lot of the maths across both GPUs, so instead of one XTX running at 100%, you now get two cards running at 50%. Given the extra settings you throw at it, probably 75% of each card, with CPU limiting in most situations. If one card at 100% load chews 290-300W, 150% would only be around 450W.
I've been told (on a previous thread here at Hexus) that both of the dual graphics cards perform a fair amount of the same processing in duplicate. Each card uploads 100% of the textures, and each card does 100% of the vertex processing (so I'm told). Moreover, when operating in any non-alternating-frame mode (for example, split-screen mode or checkerboard mode), there is an overlap of the areas to be computed, so here again both cards do 100% of the processing. In other words, substantial processing power is wasted when doing SLI/Crossfire, and that will show up as extra heat. The change-over to SLI/Crossfire is not 100% efficient.

Also, concerning CPU limiting: people typically do not use SLI/Crossfire to get around CPU limiting. Rather, they use it to get better image quality (higher resolution and better eye-candy effects) while keeping approximately the same frame rate (and CPU limiting) as before. People crank up the image quality until their graphics system can barely keep up with the CPU. So I believe it's a mistake to assume the two graphics cards are under-utilized (and under-powered).

Given all that, it seems speculative, even doubtful, that the two graphics cards would run at only 75% of their available processing power. Given the power figures reported so far, it still seems that a 600 Watt power supply would be on the hairy edge of shutting down.
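
A rough headroom check with the numbers floated in this thread. Every figure here is an estimate from earlier posts, not a measurement:

```python
# PSU headroom check: two cards at the claimed per-card draw,
# plus an assumed rest-of-system figure, against a 600 W supply.

PSU_RATING_W = 600
CARD_LOAD_W = 240        # low end of the 240-288 W per-card claim above
REST_OF_SYSTEM_W = 175   # assumed CPU/mobo/drives under load

total = 2 * CARD_LOAD_W + REST_OF_SYSTEM_W
print(f"{total} W demanded of a {PSU_RATING_W} W PSU")  # 655 W -- over rating
```

Even taking the low end of the per-card claim, the total comes out above the PSU's rating, which is why those figures are hard to reconcile with the reviewers' 600 W supply.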

P.S. We still have no power figures for the 7900GTX. Has anyone posted those?

  15. #15
    Senior Member
    Join Date
    Dec 2005
    Posts
    247
    Thanks
    1
    Thanked
    8 times in 8 posts
    • Barkotron's system
      • Motherboard:
      • Asus X99-A
      • CPU:
      • i7 5820K
      • Memory:
      • 32GB Crucial Ballistix Sport
      • Storage:
      • 128GB Samsung 850 Pro (OS), 512GB Samsung 840 Evo (Games), various other SSD/HDD (storage, work)
      • Graphics card(s):
      • Inno3d GeForce GTX 980
      • PSU:
      • Hiper Something Something 800W-ish
      • Case:
      • Define XL
      • Operating System:
      • Win 8.1 x64 (fun!), Win2012 R2 (work)
      • Monitor(s):
      • Hazro HR27WD
      • Internet:
      • Virgin 120Mb
    Quote Originally Posted by [DW]Cougho
Heat and noise, well, that may matter. For me the old Arctic coolers (for the 9800s etc.) are too loud on full, but most people say they're silent on full? It's all subjective. By the sound of it, both cards would be too noisy for me, but that doesn't bother me as I would likely put a third-party cooler on anyway. However, having read a bit around the net, it seems that the X1900XTs run so hot that practically no third-party HSF can deal with the load (maybe with the exception of the soon-to-be-released Zalman thingy); that, for me, could win the day for a 7900GTX even if it is slightly slower.
They matter a lot to me as well. I was very reluctant to get the X1900-whatever due to the stories about the noise; however, in the end I decided to just get one (X1900XT) anyway - much cheaper than the equivalent NV cards, and apparently much better image quality, so a win-win really.

The noise was bad, so I replaced the cooler with the Thermalright V1-Ultra, which keeps core temperatures down in the mid-60s on full (instead of the high 80s I was seeing with the stock cooler), and mid-to-high 70s if I turn the fan down to near-inaudible levels (using the Zalman multi-controller thing). You do have to be happy to take pliers/hacksaws to the RAM heatsinks that ship with the cooler (two of them need to be cut down to roughly half their height due to the positioning of the heatsink/heatpipe assembly on the front of the card), or you could wait until they release compatible ramsinks, which apparently they are going to do at some point.

The only things that get really hot are the power regulators, as they're no longer being cooled by the fan. The Thermalright fan sits on top of the card and sucks air through the heatsink/heatpipe assembly, and as the card sits directly underneath my exhaust fan, the hot air gets extracted immediately. For the power regulators I've slapped on some of the Zalman ramsinks, which has given me somewhat lower temps (high 70s after 2 hours of load instead of the mid-90s (!!!) it was before). I need to clear up the airflow a bit from the front fan though, or maybe put a very low-rpm exhaust in underneath the graphics card (am I really ever going to use that PCI-E x1 slot?), as it looks as if hot air may be pooling under there a bit. Turning up the front fan helps though - I guess plenty of cool air gets added under there, and air can escape via the space on the back panel where the exhaust of the X1900's heatsink used to be.

Now, even if I turn all the fans on full, it's not remotely as noisy as the X1900 stock cooler is when it spins up. Bring on Oblivion!

  16. #16
    Registered User
    Join Date
    Jul 2005
    Location
    Somewhere I Belong
    Posts
    445
    Thanks
    0
    Thanked
    0 times in 0 posts
I think I read somewhere that the 7900GTX will use approx 120W. I don't remember where I read that; I think I googled '7900GTX' and found it a few days before they were released.
