
Thread: News - NVIDIA G300 frequencies revealed?

  1. #1
    HEXUS.admin
    Join Date
    Apr 2005
    Posts
    29,365
    Thanks
    0
    Thanked
    1,924 times in 668 posts

    News - NVIDIA G300 frequencies revealed?

    NVIDIA's 40nm GPU isn't expected until at least later this year, but we're already hearing rumoured specifications.
    Read more.

  2. #2
    WEEEEEEEEEEEEE! MadduckUK's Avatar
    Join Date
    May 2006
    Location
    Lytham St. Annes
    Posts
    17,293
    Thanks
    649
    Thanked
    1,580 times in 1,006 posts
    • MadduckUK's system
      • Motherboard:
      • Asus A88XM-PLUS
      • CPU:
      • AMD 860K @4.45
      • Memory:
      • 16GB (4x4GB) PC3-12800
      • Storage:
      • 1x240GB Sandisk Extreme / 3x500GB RAID0 / 3GB Backup
      • Graphics card(s):
      • Radeon 7870XT
      • PSU:
      • Corsair TX750w
      • Case:
      • Cooler Master Gladiator 600
      • Operating System:
      • Windows 10 x64
      • Monitor(s):
      • DELL S2409W
      • Internet:
      • 3 One Plan

    Re: News - NVIDIA G300 frequencies revealed?

    That looks like nothing more than expected... anyone want to compare it with ATI's future card specs?
    Quote Originally Posted by Ephesians
    Do not be drunk with wine, which will ruin you, but be filled with the Spirit
    Vodka

  3. #3
    Not a good person scaryjim's Avatar
    Join Date
    Jan 2009
    Location
    Manchester
    Posts
    15,102
    Thanks
    1,209
    Thanked
    2,254 times in 1,854 posts
    • scaryjim's system
      • Motherboard:
      • Dell Inspiron
      • CPU:
      • Core i5 8250U
      • Memory:
      • 1x 8GB DDR4 2400
      • Storage:
      • 128GB M.2 SSD + 1TB HDD
      • Graphics card(s):
      • Radeon R5 230
      • PSU:
      • Battery/Dell brick
      • Case:
      • Dell Inspiron 5570
      • Operating System:
      • Windows 10
      • Monitor(s):
      • 15" 1080p laptop panel

    Re: News - NVIDIA G300 frequencies revealed?

    Surely if it's running GDDR5 the effective memory clock should be in the region of 4400MHz, not 2200?

  4. #4
    Gentoo Ricer
    Join Date
    Jan 2005
    Location
    Galway
    Posts
    11,041
    Thanks
    1,014
    Thanked
    944 times in 704 posts
    • aidanjt's system
      • Motherboard:
      • Asus Strix Z370-G
      • CPU:
      • Intel i7-8700K
      • Memory:
      • 2x8GB Corsair LPX 3000C15
      • Storage:
      • 500GB Samsung 960 EVO
      • Graphics card(s):
      • EVGA GTX 970 SC ACX 2.0
      • PSU:
      • EVGA G3 750W
      • Case:
      • Fractal Design Define C Mini
      • Operating System:
      • Windows 10 Pro
      • Monitor(s):
      • Asus MG279Q
      • Internet:
      • 240mbps Virgin Cable

    Re: News - NVIDIA G300 frequencies revealed?

    Quote Originally Posted by scaryjim View Post
    Surely if it's running GDDR5 the effective memory clock should be in the region of 4400MHz, not 2200?
    Why would it?

    My ATi card has GDDR5 and its design limit is 2200MHz effective. It's DDR, not QDR.
    Quote Originally Posted by Agent View Post
    ...every time Creative bring out a new card range their advertising makes it sound like they have discovered a way to insert a thousand Chuck Norris super dwarfs in your ears...

  5. #5
    Senior Member Hicks12's Avatar
    Join Date
    Jan 2008
    Location
    Plymouth-SouthWest
    Posts
    6,558
    Thanks
    1,066
    Thanked
    332 times in 288 posts
    • Hicks12's system
      • Motherboard:
      • Asus P8Z68-V
      • CPU:
      • Intel i5 2500k@4ghz, cooled by EK Supreme HF
      • Memory:
      • 8GB Kingston hyperX ddr3 PC3-12800 1600mhz
      • Storage:
      • 64GB M4/128GB M4 / WD 640GB AAKS / 1TB Samsung F3
      • Graphics card(s):
      • Palit GTX460 @ 900Mhz Core
      • PSU:
      • 675W Thermaltake ToughPower XT
      • Case:
      • Lian Li PC-A70 with modded top for 360mm rad
      • Operating System:
      • Windows 7 Professional 64bit
      • Monitor(s):
      • Dell U2311H IPS
      • Internet:
      • 10mb/s cable from virgin media

    Re: News - NVIDIA G300 frequencies revealed?

    One word... expensive! If that came out it would cost a crapload; look at the buffer! 512-bit and with GDDR5, that must cost an arm to make and sell for an arm and a leg, lol.

    Won't really worry about the specs until it's even announced, lol.
    Quote Originally Posted by snootyjim View Post
    Trust me, go into any local club and shout "I've got dual Nehalem Xeons" and all of the girls will practically collapse on the spot at the thought of your e-penis

  6. #6
    Ninja Noxvayl's Avatar
    Join Date
    May 2007
    Location
    In the shadows
    Posts
    2,451
    Thanks
    748
    Thanked
    215 times in 173 posts
    • Noxvayl's system
      • Motherboard:
      • GigabyteZ87X-UD4H-CF
      • CPU:
      • Intel i7 4770K
      • Memory:
      • 16GB Corsair Vengeance LPX + 8GB Kingston HyperX Beast
      • Storage:
      • 120GB SanDisk + 256GB Crucial SSDs
      • Graphics card(s):
      • 4GB Sapphire R9 380
      • PSU:
      • Enermax Platimax 750W
      • Case:
      • Fractal Design Define S
      • Operating System:
      • Windows 10 64bit
      • Monitor(s):
      • ATMT + Dell 1024x1280
      • Internet:
      • Sky Fibre

    Re: News - NVIDIA G300 frequencies revealed?

    Quote Originally Posted by aidanjt View Post
    Why would it?

    My ATi card has GDDR5 and its design limit is 2200Mhz effective. It's DDR, not QDR.
    Because the HD4870 has 3600MHz GDDR5 memory and the HD4890 has 3900MHz GDDR5 memory: http://www.hexus.net/content/item.php?item=18359&page=3

    Not sure what card you have, but if you are looking at frequencies in an overclocking tool such as RivaTuner or AMD Overdrive you are seeing the base clock, which is then multiplied by 2 to get the effective clock. So a max of 2200MHz shown in an overclocking tool is in fact a 4400MHz effective frequency. GPU-Z also shows the base clock of the card, not the effective clock with multipliers in place.
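    For anyone wanting to double-check the arithmetic, here's a quick sketch (the clock figures are the ones quoted in this thread, not official specs):

    ```python
    def effective_clock(base_mhz, transfers_per_cycle):
        """'Effective' (marketing) frequency = real clock x data transfers per cycle."""
        return base_mhz * transfers_per_cycle

    # A 2200MHz reading in a tool that already applied a x2 DDR multiplier:
    print(effective_clock(2200, 2))  # 4400 MHz effective

    # HD4870: 900MHz real memory clock, GDDR5 moving 4 bits per pin per cycle:
    print(effective_clock(900, 4))   # 3600 MHz, matching the figure quoted above
    ```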

  7. #7
    Gentoo Ricer
    Join Date
    Jan 2005
    Location
    Galway
    Posts
    11,041
    Thanks
    1,014
    Thanked
    944 times in 704 posts
    • aidanjt's system (specs as in post #4 above)

    Re: News - NVIDIA G300 frequencies revealed?

    Quote Originally Posted by ExHail View Post
    Because the HD4870 has 3600MHz GDDR5 memory and the HD4890 has 3900MHz GDDR5 memory: http://www.hexus.net/content/item.php?item=18359&page=3

    Not sure what card you have, but if you are looking at frequencies in an overclocking tool such as RivaTuner or AMD Overdrive you are seeing the base clock, which is then multiplied by 2 to get the effective clock. So a max of 2200MHz shown in an overclocking tool is in fact a 4400MHz effective frequency. GPU-Z also shows the base clock of the card, not the effective clock with multipliers in place.
    Code:
    aidan@aidan-i7 ~ $ aticonfig --odgc
    
    Default Adapter - ATI Radeon HD 4800 Series
                                Core (MHz)    Memory (MHz)
               Current Clocks :    500           900
                 Current Peak :    750           900
      Configurable Peak Range : [500-790]     [900-1100]
                     GPU load :    0%
    I already accounted for 'effective' frequency. Understandably, people get confused between real frequency and 'effective' frequency, and often double up the 'effective' figure because they didn't realise the number they had was already the 'effective' one.

  8. #8
    Senior Member
    Join Date
    Apr 2009
    Location
    Oxford
    Posts
    263
    Thanks
    5
    Thanked
    7 times in 6 posts
    • borandi's system
      • Motherboard:
      • Gigabyte EX58-UD3R
      • CPU:
      • Core i7 920 D0 (2.66Ghz) @ 4.1Ghz
      • Memory:
      • G.Skill 3x1GB DDR3-1333Mhz
      • Storage:
      • Samsung PB22-J 64GB
      • Graphics card(s):
      • 2x5850 in CF
      • PSU:
      • 800W
      • Case:
      • Verre V770
      • Operating System:
      • Windoze XP Pro
      • Monitor(s):
      • 19"WS
      • Internet:
      • 8MB/448kbps up

    Re: News - NVIDIA G300 frequencies revealed?

    I'm happy as long as it brings down the price of the current GPU range. These things are beasts for computational simulations.

  9. #9
    Not a good person scaryjim's Avatar
    Join Date
    Jan 2009
    Location
    Manchester
    Posts
    15,102
    Thanks
    1,209
    Thanked
    2,254 times in 1,854 posts
    • scaryjim's system (specs as in post #3 above)

    Re: News - NVIDIA G300 frequencies revealed?

    I thought GDDR5 was *actually* QDR and just misnamed? Or does the 4890 have an actual memory clock of 1950MHz?!?

    EDIT: I did my homework (i.e. looked at Wikipedia) and it says "GDDR5 is the successor to GDDR4 and unlike its predecessors has two parallel DQ links which provide doubled I/O throughput when compared to GDDR4." So does that mean it effectively doubles the frequency of transfers, or doubles the bitpath throughput of transfers? I'm very confused now...

  10. #10
    Senior Member
    Join Date
    Jul 2003
    Posts
    290
    Thanks
    1
    Thanked
    13 times in 12 posts

    Re: News - NVIDIA G300 frequencies revealed?

    If it's mostly just an extension of the current design, the GT300 will of course be powerful, but because the shaders aren't really grouped into clusters/packs like the ATI design, you get a LOT more transistors and die size for every extra SP: essentially the logic/transistor overhead per shader is FAR higher on the NVIDIA design than the ATI design at the moment. A bump from 800 to 1200 SPs on the 5870 (if it's to be believed) won't increase the core size anywhere even close to 50% despite a 50% bump in shaders. The NVIDIA core, assuming over a 100% shader increase from 240 to 512, wouldn't be 100% bigger, but its core size increase would be much more linear. Meaning either the GT300 has switched to a shader setup similar to AMD's, a group of shaders that can do "up to X instructions per clock" rather than 1 shader = 1 instruction per clock, or the GT300 will be HUGE. Assuming an all but identical design to their current models, with simply increased shaders and core logic to go with it, the gap in yields/size between AMD and NVIDIA will continue to grow, meaning this new part would be even more expensive than now, would be even further away from AMD in bang for buck, and NVIDIA would be losing money.

    In all likelihood NVIDIA HAVE to move to a more efficient, AMD-style smaller-core design because, frankly, it will cost them far too much not to be competitive this round. If they do, then their peak gigaflops numbers get a lot less impressive, like AMD's are massive now but in reality you can't leverage all that power all the time in games, while NVIDIA's brute-force design is far simpler to extract the power from.

  11. #11
    Gentoo Ricer
    Join Date
    Jan 2005
    Location
    Galway
    Posts
    11,041
    Thanks
    1,014
    Thanked
    944 times in 704 posts
    • aidanjt's system (specs as in post #4 above)

    Re: News - NVIDIA G300 frequencies revealed?

    Quote Originally Posted by scaryjim View Post
    I thought GDDR5 was *actually* QDR and just misnamed? Or does the 4890 have an actual memory clock of 1950MHz?!?

    EDIT: I did my homework (i.e. looked at Wikipedia) and it says "GDDR5 is the successor to GDDR4 and unlike its predecessors has two parallel DQ links which provide doubled I/O throughput when compared to GDDR4." So does that means that it effectively doubles the frequency of transfers, or doubles the bitpath throughput of transfers? I'm very confused now...


    It's a common source of confusion. It just means they've essentially slapped extra channels onto the memory controller ICs, thereby effectively doubling the bit rate. But the clock frequency is still 900-1150MHz actual / 1800-2300MHz effective.

    Personally I think all this DDR/QDR business is pretty misleading, since the clock frequency never actually goes above SDR frequencies. They should stick to bit rate metrics if they want to communicate how much data the bus can push out, IMHO.
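    To put that bit-rate point in numbers, here's a rough sketch using the HD4870 figures quoted in this thread (back-of-the-envelope, not official specs):

    ```python
    def bandwidth_gb_s(real_clock_mhz, bits_per_pin_per_cycle, bus_width_bits):
        """Peak memory bandwidth in GB/s from real clock, data rate and bus width."""
        return real_clock_mhz * 1e6 * bits_per_pin_per_cycle * bus_width_bits / 8 / 1e9

    # HD4870: 900MHz real clock, GDDR5 moving 4 bits per pin per cycle, 256-bit bus:
    print(bandwidth_gb_s(900, 4, 256))  # 115.2 GB/s
    ```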

  12. #12
    Not a good person scaryjim's Avatar
    Join Date
    Jan 2009
    Location
    Manchester
    Posts
    15,102
    Thanks
    1,209
    Thanked
    2,254 times in 1,854 posts
    • scaryjim's system (specs as in post #3 above)

    Re: News - NVIDIA G300 frequencies revealed?

    Quote Originally Posted by aidanjt View Post
    It just means they've essentially slapped extra channels onto the memory controller ICs, thereby effectively doubling the bit rate.
    So GDDR5 with a 128-bit path actually shifts 256 bits per transfer, and does this twice per clock cycle, right? As opposed to shifting 128 bits four times per clock cycle, which would be QDR?

    Now I know why I didn't become an electronics engineer...

    As you might have guessed, I'm determined to understand this before the end of the working day
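    For what it's worth, both readings move the same amount of data per cycle, so the distinction is only in the mental model. A quick sketch with a hypothetical 128-bit interface at a 1000MHz real clock:

    ```python
    clock_hz = 1000e6            # hypothetical 1000MHz real clock
    per_cycle_wide = 256 * 2     # 256 bits per transfer, twice per cycle
    per_cycle_qdr = 128 * 4      # 128 bits per transfer, four times per cycle
    assert per_cycle_wide == per_cycle_qdr == 512  # identical bits per cycle

    print(clock_hz * per_cycle_wide / 8 / 1e9)  # 64.0 GB/s either way
    ```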
    Last edited by scaryjim; 19-05-2009 at 04:37 PM.

  13. #13
    Long Time Lurker
    Join Date
    Sep 2006
    Location
    Newcastle
    Posts
    385
    Thanks
    28
    Thanked
    19 times in 17 posts
    • mercyground's system
      • Motherboard:
      • Gigabyte GA-MA790X-UDP4
      • CPU:
      • AMD Phenom II X4 955 (3.2ghz)
      • Memory:
      • 8GB
      • Storage:
      • 160gb intel SSD + 3TB of HDDs
      • Graphics card(s):
      • 4770 512mb
      • PSU:
      • Akasa 550W
      • Operating System:
      • Windows 7
      • Monitor(s):
      • LG M1917TM
      • Internet:
      • 50mb Fibre

    Re: News - NVIDIA G300 frequencies revealed?

    I'll believe it when I see it. Also, given that NV is rebranding like mad as they can't get the shrink right... I doubt you'll see this card before Xmas. More likely Q1 2010.

    My next cards are ATI for sure. Got a 4770 that is seriously kicking ass in the other half's machine. I want a new card
