
Thread: 4Gb vs 8GB

  1. #1
    Registered+
    Join Date
    Nov 2013
    Posts
    40
    Thanks
    14
    Thanked
    1 time in 1 post

    4Gb vs 8GB

    Hi everyone

    I'm looking at upgrading my GPU in the next couple of weeks, and while researching I've noticed that it only seems to be the AMD R9 390 that has 8GB, while the Nvidia cards all seem to be around 4GB.

    Does the extra 4GB make that much of a difference nowadays, or will it in the next couple of years?

    I don't game at 4k resolution and probably won't be for the foreseeable future.

  2. #2
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,039
    Thanks
    1,880
    Thanked
    3,379 times in 2,716 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: 4Gb vs 8GB

    Quote Originally Posted by markieboy View Post

    Does the extra 4GB make that much of a difference nowadays, or will it in the next couple of years?

    I don't game at 4k resolution and probably won't be for the foreseeable future.
    No, it doesn't make a difference. As for the foreseeable future... who knows? It's very unlikely to be a limitation before your GPU's oomph runs out, if you're using a single card.

  3. Received thanks from:

    markieboy (21-12-2015)

  4. #3
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,042
    Thanks
    3,909
    Thanked
    5,213 times in 4,005 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: 4Gb vs 8GB

    People are concentrating on the wrong thing in the 4GB against 8GB debate with the GTX970 and the R9 390. 4GB is more than enough at 1080p - even 3GB is probably enough IMHO. However, VRAM requirements for games are increasing, so I would expect that within the next 12 to 18 months, 3GB to 4GB will probably be what is needed.

    More important is the way the RAM is addressed on the GTX970, as the last 512MB is very slow. It's the same kind of thing which affected my GTX660, which means that as time progresses, unless the drivers make sure that last 512MB is not addressed, you will probably start to see the gap in usable performance versus the GTX980 grow. It's probably the same reason why the HD7870 started to increase its performance lead over the GTX660 too.
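    As a rough illustration (a toy sketch only, not Nvidia's actual driver logic), the driver effectively has to act like a segmented allocator, filling the fast 3.5GB pool before anything spills into the slow 0.5GB:

    Code:
    # Toy model of a 3.5GB + 0.5GB split: allocations prefer the fast pool
    # and only spill into the slow pool once the fast one is full.
    GIB = 1024 ** 3

    class SegmentedVram:
        def __init__(self, fast=int(3.5 * GIB), slow=int(0.5 * GIB)):
            self.fast_free, self.slow_free = fast, slow

        def alloc(self, size):
            """Return which segment a buffer of `size` bytes lands in."""
            if size <= self.fast_free:
                self.fast_free -= size
                return "fast"                 # ~196 GB/s segment on a 970, per the usual figures
            if size <= self.slow_free:
                self.slow_free -= size
                return "slow"                 # ~28 GB/s - the problem segment
            raise MemoryError("out of VRAM")

    vram = SegmentedVram()
    print(vram.alloc(int(3.3 * GIB)))         # "fast"
    print(vram.alloc(int(0.3 * GIB)))         # "slow" - the fast pool is nearly full

    If the driver stops being careful about that split, buffers land in the slow segment and frametimes suffer, even though total VRAM use still reads as "under 4GB".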

    My main worry with the GTX970 is that once Nvidia gets the midrange Pascal cards out, there will be less concentration on tailoring the memory management for the GTX970 in drivers, as it won't be the latest and greatest anymore.

    The issue is that many on forums will probably try to sidestep it by saying it's "not important currently", but if you are spending £200 to £300 on a card NOW, you probably want to see at least two years out of the card, so that is essentially the end of 2017. We will probably be verging on the successor to Pascal by then.

    Bagnaj97 and I were both considering buying the GTX970 for our PCs - for me it's a better fit in my SFF PC, and for him, Nvidia has better Linux gaming drivers - but the 3.5GB+0.5GB issue meant we decided to hold off until the next-gen cards are out. While I can see the drivers being OK for the next year or so, it does concern me for the 12 to 18 months after that, as I don't think it is unreasonable to keep a £250 card for two and a half years.

    I only did an upgrade to a cheap GTX960 4GB (it was just over £100) so I could run Fallout 4 better, and that will do me for the time being.
    Last edited by CAT-THE-FIFTH; 20-12-2015 at 11:53 PM.

  5. Received thanks from:

    markieboy (21-12-2015)

  6. #4
    Registered+
    Join Date
    Nov 2013
    Posts
    40
    Thanks
    14
    Thanked
    1 time in 1 post

    Re: 4Gb vs 8GB

    Thanks for the replies, guys.

    Currently leaning slightly towards the R9 390, but still open to the GTX 970 or 980 (a 980 Ti would be nice but way out of my price range, unless I win the lottery).

    Final decision will probably come down to what kind of bargain I can pick up in the sales.

  7. #5
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    13,009
    Thanks
    781
    Thanked
    1,568 times in 1,325 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: 4Gb vs 8GB

    Quote Originally Posted by CAT-THE-FIFTH View Post
    I only did an upgrade to a cheap GTX960 4GB (it was just over £100) so I could run Fallout 4 better, and that will do me for the time being.
    I wondered why you got a 960, Cat; at that price I'm not surprised you changed.


    Back on topic...

    Games get optimised for the hardware that is out there, and if you click on the VRAM line on http://store.steampowered.com/hwsurvey you see that most games players (one third of Steam users) have 1GB of VRAM, followed by a quarter on 2GB.

    The 970 is a very, very popular card, possibly because the 960 was so overpriced for ages that it seemed worth the extra £50 to get a 970. Back on those Steam stats, there are twice as many 970 users as 960 users, when I would expect it to be the other way around, as cheaper cards normally sell better even when they shouldn't. I don't see Nvidia hacking off such a large number of users; they may not have the morals of a saint but they aren't stupid.

    So these 3.5GB/4GB cards are all playable, and I expect them to stay that way for some time. There have been some slight stuttering issues which 8GB cards smooth over if you have the money, but I haven't seen any measurements lately, and both AMD and Nvidia have done driver updates recently that should mean things are now better than any graphs I have seen.

    Some slightly out of date numbers from http://techreport.com/blog/28800/how...mory-is-enough :



    That graph is with an HD texture pack to increase memory usage as much as possible, and a VRAM hog of a game was chosen to test. The 970 is *scaling* fine at 4K as it follows the same curve as its bigger brothers, just a bit slower, as you would expect from a lower-clocked card. The 970 goes wonky at 6K resolution, but it doesn't have the grunt for that sort of resolution anyway, so that is academic. The 980 and Titan graphs are a very similar shape, so the scaling is fine; this is just down to clock rates, not hitting a memory wall, and that is 4GB vs 12GB.

    I am playing at 1440p on just a 2GB R9 285 (though FreeSync no doubt helps), and although I'm sure there are games out there that would be a bit sluggish, I'm still playing Elite for now, which is fine. From the way some people talk you would think that with 2GB I should just give up, rather than just turn the settings down a little bit.

  8. Received thanks from:

    markieboy (22-12-2015)

  9. #6
    Registered+
    Join Date
    Dec 2015
    Posts
    27
    Thanks
    1
    Thanked
    2 times in 2 posts
    • Sean473's system
      • Motherboard:
      • Clevo P771ZM
      • CPU:
      • Intel Core i7 4790K
      • Memory:
      • 32GB Kingston Hyper X Impact DDR3 2133MHz
      • Storage:
      • 240GB Sandisk Extreme II + 1TB Hitachi 7K1000 7200rpm HDD
      • Graphics card(s):
      • NVIDIA GTX 970M
      • PSU:
      • 330W PSU
      • Case:
      • Clevo P771ZM
      • Operating System:
      • Windows 8.1
      • Monitor(s):
      • LG 17.3" IPS 1080p Display

    Re: 4Gb vs 8GB

    At 4K, the extra VRAM is really helpful. For 1080p/1440p, you'll be fine with 4GB of VRAM... However, on my laptop with a 6GB 970M, I've seen around 5.5GB of VRAM used when gaming at 1080p in COD AW and a few other games!
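    If you want to check this yourself, a quick way (assuming an Nvidia card, since nvidia-smi ships with the driver) is to poll nvidia-smi while the game is running, e.g.:

    Code:
    import subprocess, time

    # Log VRAM usage every few seconds while a game is running (Ctrl+C to stop).
    while True:
        used_total = subprocess.check_output([
            "nvidia-smi",
            "--query-gpu=memory.used,memory.total",
            "--format=csv,noheader",
        ]).decode().strip()
        print(used_total)          # e.g. "5632 MiB, 6144 MiB"
        time.sleep(5)

    Bear in mind that this reports allocation rather than what the game strictly needs - some engines will happily grab most of whatever VRAM is available.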

  10. Received thanks from:

    markieboy (28-12-2015)

  11. #7
    Senior Member
    Join Date
    Mar 2015
    Location
    Lurking over a keyboard!
    Posts
    438
    Thanks
    216
    Thanked
    35 times in 33 posts
    • gupsterg's system
      • Motherboard:
      • Asus Maximus VII Ranger
      • CPU:
      • i5 4690K @ 4.9GHz
      • Memory:
      • Kingston HyperX Savage 2400Mhz 16GB
      • Storage:
      • Samsung 840 EVO 250GB + HGST 2TB
      • Graphics card(s):
      • Sapphire R9 Fury X
      • PSU:
      • Cooler Master V850
      • Case:
      • Silverstone Temjin 06 plus mods ;)
      • Operating System:
      • Win 7 Pro x64 / Win 10 Pro x64
      • Monitor(s):
      • Asus MG279Q
      • Internet:
      • TalkTalk VDSL

    Re: 4Gb vs 8GB

    Quote Originally Posted by markieboy View Post
    Does the extra 4GB make that much of a difference nowadays, or will it in the next couple of years?
    I don't think it makes much difference currently, from what I've seen. Whatever you buy now, I reckon the GPU will become the limiting factor for good FPS before the RAM quantity does.

    Quote Originally Posted by markieboy View Post
    I've noticed that it only seems to be the AMD R9 390 that has 8GB
    Look at the middle two columns, as the 290X is clocked the same as a 390X. The 390 will not be hugely behind on FPS if clocked the same, IMO.



    Source Link:- http://www.babeltechreviews.com/the-...-980/view-all/
    Last edited by gupsterg; 28-12-2015 at 11:42 AM.
    i5 4690K @ 4.9GHz CPU@1.255v 4.4GHz Cache@1.10v - Archon SB-E X2 - Asus Maximus VII Ranger
    Kingston HyperX Savage 16GB@2400MHz 1T - Sapphire R9 Fury X (1145/545 Custom ROM, ~17.7K 3DM FS)
    Samsung 840 Evo 250GB - Cooler Master V850

    R7 1700@3.8GHz - Archon IB-E X2 - Asus Crosshair VI Hero - G.Skill Trident Z 3200MHz C14 - Sapphire Fury X (1145/545 Custom ROM, ~17.2K 3DM FS)
    Samsung 840 Evo 250GB - Cooler Master V850


  12. Received thanks from:

    markieboy (28-12-2015)

  13. #8
    Registered+
    Join Date
    Nov 2013
    Posts
    40
    Thanks
    14
    Thanked
    1 time in 1 post

    Re: 4Gb vs 8GB

    Thanks for the replies, they've been really helpful.

    Still not sure what card to get though. After finding out that the next generation of cards is due out next year, I'm tempted to wait and see how they perform and how they're priced; even if the next-gen cards are too expensive, they should help to lower the price of the 980 Ti. Whether to wait or not, though, will depend on how my R9 270X copes with Battlefront and Fallout 4.

    Was very tempted by the R9 390x, but am a bit concerned about the temperatures that card generates.

  14. #9
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,042
    Thanks
    3,909
    Thanked
    5,213 times in 4,005 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: 4Gb vs 8GB

    Quote Originally Posted by DanceswithUnix View Post
    I wondered why you got a 960, Cat; at that price I'm not surprised you changed.
    Well it was like when I got my GTX660 - it was cheaper than many HD7850 2GB cards.

    But even on midrange cards, 4GB is advantageous too:

    http://www.computerbase.de/2015-12/2...nity-1920-1080

    Quote Originally Posted by DanceswithUnix View Post
    Back on topic...

    Games get optimised for the hardware that is out there, and if you click on the VRAM line on http://store.steampowered.com/hwsurvey you see that most games players (one third of Steam users) have 1GB of VRAM, followed by a quarter on 2GB.

    The 970 is a very, very popular card, possibly because the 960 was so overpriced for ages that it seemed worth the extra £50 to get a 970. Back on those Steam stats, there are twice as many 970 users as 960 users, when I would expect it to be the other way around, as cheaper cards normally sell better even when they shouldn't. I don't see Nvidia hacking off such a large number of users; they may not have the morals of a saint but they aren't stupid.

    So these 3.5GB/4GB cards are all playable, and I expect them to stay that way for some time. There have been some slight stuttering issues which 8GB cards smooth over if you have the money, but I haven't seen any measurements lately, and both AMD and Nvidia have done driver updates recently that should mean things are now better than any graphs I have seen.

    Some slightly out of date numbers from http://techreport.com/blog/28800/how...mory-is-enough :



    That graph is with an HD texture pack to increase memory usage as much as possible, and a VRAM hog of a game was chosen to test. The 970 is *scaling* fine at 4K as it follows the same curve as its bigger brothers, just a bit slower, as you would expect from a lower-clocked card. The 970 goes wonky at 6K resolution, but it doesn't have the grunt for that sort of resolution anyway, so that is academic. The 980 and Titan graphs are a very similar shape, so the scaling is fine; this is just down to clock rates, not hitting a memory wall, and that is 4GB vs 12GB.

    I am playing at 1440p on just a 2GB R9 285 (though FreeSync no doubt helps), and although I'm sure there are games out there that would be a bit sluggish, I'm still playing Elite for now, which is fine. From the way some people talk you would think that with 2GB I should just give up, rather than just turn the settings down a little bit.
    But the problem, as I said before, is that for NOW the driver will limit stuff to the fast 3.5GB of RAM. What happens when the next generation of Nvidia midrange cards starts shipping with 4GB of RAM and Maxwell is no longer the latest? That's when I suspect the problems will start. It's different from simply having 3.5GB of RAM: with segments of different speeds, the driver needs to recognise which segment is the fast one and prioritise it over the slower part.

    Remember, for someone buying a GTX970 it's not unreasonable to want to use it for at least two years IMHO, i.e. until the end of 2017, so what happens as time progresses? Will the optimisations keep coming for games?

    Nvidia did the same thing with my GTX660, and you could see that as time progressed the HD7870 and its derivatives did start to pull ahead. It also had its VRAM segmented into fast and slow parts.

    Why should Nvidia care in December 2016 about optimising the drivers' RAM usage for a two-year-old card, when they have new cards to sell? It's also my view of the Fury range for use at higher resolutions like 4K.


    FFS, look at how all of a sudden a GTX960 was matching or even beating previous-generation higher-tier cards which used to be faster, in games like W3. There is no reason why Nvidia or AMD would have any issue pushing performance updates onto later-generation hardware. They can always make the excuse that the latest generation supports "features" or "has an updated architecture" which is better for games.


    Plus, I would be a bit wary of the Steam Hardware Survey - at times it is not as accurate as it seems. There have been cases where actual sales (or JPR figures) have not matched up that well.

    Steam conducts a monthly survey to collect data about what kinds of computer hardware and software our customers are using. Participation in the survey is optional, and anonymous. The information gathered is incredibly helpful to us as we make decisions about what kinds of technology investments to make and products to offer.
    The problem is, what's the bet that hardware enthusiasts are more likely to do the survey than an average gamer who couldn't care less?

    Heck, I think I have only been asked once in three years too.


    Quote Originally Posted by markieboy View Post
    Thanks for the replies, they've been really helpful.

    Still not sure what card to get though. After finding out that the next generation of cards is due out next year, I'm tempted to wait and see how they perform and how they're priced; even if the next-gen cards are too expensive, they should help to lower the price of the 980 Ti. Whether to wait or not, though, will depend on how my R9 270X copes with Battlefront and Fallout 4.

    Was very tempted by the R9 390x, but am a bit concerned about the temperatures that card generates.
    As long as you don't have some cramped case, you will be fine. The temperature thing is a bit of silly arguing amongst enthusiasts on forums - plenty of AMD/ATI and Nvidia cards, like the 9800GT, were rated past 100°C. Even the R9 290X with its average stock cooler seemed to work perfectly fine for months when people were mining on them, and those were compute workloads which stress the GPU more than simpler gaming tasks AFAIK.
    Last edited by CAT-THE-FIFTH; 30-12-2015 at 01:44 AM.

  15. #10
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    13,009
    Thanks
    781
    Thanked
    1,568 times in 1,325 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: 4Gb vs 8GB

    As someone who hands down rather than sells on video cards and has to keep supporting them, I am very aware of how old cards get treated.

    Thanks to the financial pressure of moving house a few years ago, I had the same GTX460 for about 5 years. There was no way it was being optimised for in the latest games, but it was fine for my use. Those optimisations are for getting sales when people use crazy language like "destroyed" for a 5% victory over a competing card. So I expect an old card to be a few percent down on where it could have been, not something I will lose sleep over. It has been that way for as long as there have been competing cards; I would call that more a case of benchmark shenanigans on the latest cards than a failure to optimise older ones.

    Basic failure to work properly or performance tanking, well that is very different from the sleight of hand shader code tricks they call optimisation. No, I don't expect them to break support though sometimes you have to roll back a driver occasionally because their testing isn't going to be as good.

    Witcher 3 was an oddity, and ways were found around the performance problems by the community. I noticed my R9 285 fared way better than most of the other AMD cards thanks to its tessellation support, so sometimes the excuse of newer hardware features has some truth behind it. AIUI, eventually a driver fix did get that game working OK on older cards (not that I have tried it, I still have the original Witcher games in my unplayed Steam catalogue so getting the latest installment seemed silly ).

    TBH I would be more interested in variable sync technology. My wife's monitor is playing up, her replacement should turn up today, and having used FreeSync for a while and having seen a 144Hz gaming monitor on Scan for £190, that is what she is getting. It is a step-change technology, and one that I think Nvidia are playing really badly. We are lucky here: she has an R7 260X, which I think is the first card to properly handle FreeSync.

    And yes the Steam hardware survey is full of selection bias as most surveys are. The sales figures I have seen have just been AMD vs Nvidia though with no breakdown by model. A sale of a GTX640 is not equivalent to a sale of a 970, and a lot of cards go into PCs that won't make any real use of them (like the ones at the office I work at). The breakdown from Valve is a bit iffy, but it is about cards in use rather than cards sold which to me makes it more interesting.

  16. #11
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,042
    Thanks
    3,909
    Thanked
    5,213 times in 4,005 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: 4Gb vs 8GB

    Quote Originally Posted by DanceswithUnix View Post
    As someone who hands down rather than sells on video cards and has to keep supporting them, I am very aware of how old cards get treated.

    Thanks to the financial pressure of moving house a few years ago, I had the same GTX460 for about 5 years. There was no way it was being optimised for in the latest games, but it was fine for my use. Those optimisations are for getting sales when people use crazy language like "destroyed" for a 5% victory over a competing card. So I expect an old card to be a few percent down on where it could have been, not something I will lose sleep over. It has been that way for as long as there have been competing cards; I would call that more a case of benchmark shenanigans on the latest cards than a failure to optimise older ones.

    Basic failure to work properly or performance tanking, well that is very different from the sleight of hand shader code tricks they call optimisation. No, I don't expect them to break support though sometimes you have to roll back a driver occasionally because their testing isn't going to be as good.

    Witcher 3 was an oddity, and ways were found around the performance problems by the community. I noticed my R9 285 fared way better than most of the other AMD cards thanks to its tessellation support, so sometimes the excuse of newer hardware features has some truth behind it. AIUI, eventually a driver fix did get that game working OK on older cards (not that I have tried it, I still have the original Witcher games in my unplayed Steam catalogue so getting the latest installment seemed silly ).

    TBH I would be more interested in variable sync technology. My wife's monitor is playing up, her replacement should turn up today, and having used FreeSync for a while and having seen a 144Hz gaming monitor on Scan for £190, that is what she is getting. It is a step-change technology, and one that I think Nvidia are playing really badly. We are lucky here: she has an R7 260X, which I think is the first card to properly handle FreeSync.

    But the problem is that I actually owned a card which had the same weird memory controller design, or a similar one - the performance gap against the HD7870 has got worse and worse as time progressed. People doubted me over the HD7970/GTX680 and R9 280X/GTX770 too, and look where the Nvidia cards are now. I said the same of the GTX660, and the only reason I got one was that it was CHEAPER than most HD7850 cards.

    The problem is not framerates but things like frametimes and minimums which reviews tend to ignore with the "older" cards as time progresses. Seen any of those comparisons of the GTX680/GTX770 and the HD7970/R9 280X recently?

    All were highish-end cards like the GTX970. People buy £250 to £300 cards to turn settings up, not down, so it's going to compound the problem.

    I have been a hardware enthusiast long enough to know how these things pan out - the memory controller design of the GTX970 will cause issues in 12 to 18 months, and it's probably why Nvidia quietly shut up about it, even though they had done something similar with the GTX660 and GTX660 Ti.

    Why should Nvidia bother mucking around with optimising the drivers for an old card like that? You forget that none of the other Maxwell GPUs, like the GTX950, GTX960, GTX980, GTX980 Ti or Titan X, have an unusual design like that. The GTX970 probably takes the lion's share of memory management optimisations of all of them.

    At least with the GTX660 the memory controller was the same for all cards, i.e. a 192-bit one, and the last chunk of RAM had much higher bandwidth than the equivalent on the GTX970 - 48GB/s as opposed to 28GB/s. The former is well above system memory bandwidth; the latter is SLOWER than the main system memory in Skylake.
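    Rough numbers (my own back-of-envelope, assuming the slow chunk sits behind a single 64-bit partition on the GTX660 and a 32-bit one on the GTX970, at the usual effective memory clocks):

    Code:
    # Back-of-envelope: bandwidth (GB/s) = bus width in bytes x effective rate (GT/s)
    def bandwidth_gbs(bus_bits, rate_gtps):
        return bus_bits / 8 * rate_gtps

    print(bandwidth_gbs(64, 6.0))      # GTX660 slow chunk:  64-bit @ 6 GT/s  = 48 GB/s
    print(bandwidth_gbs(32, 7.0))      # GTX970 slow chunk:  32-bit @ 7 GT/s  = 28 GB/s
    print(bandwidth_gbs(128, 2.133))   # Skylake dual-channel DDR4-2133      ~= 34 GB/s

    So the GTX970's slow segment really does come in under what a typical Skylake system's main memory manages.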


    You need to consider that from card to card there will probably be different parts of the memory controller disabled, and the driver needs to take that into account and make sure the fast 3.5GB is used first.

    Remember, by then the GTX970 will be two to two and a half years old, which makes it an older product for Nvidia.

    I also expect the GTX980 to increasingly pull ahead in playability over that time.

    You also need to consider that with the consoles having a decent amount of RAM, devs are going to keep pushing up RAM requirements more and more.

    I was genuinely surprised seeing this:

    http://www.computerbase.de/2015-12/2...nity-1920-1080

    That is only with midrange GPUs. Now start including higher end cards where you can actually start upping the effects.

    Plus the GTX960 is only going to be available in 4GB versions, and I suspect the next-gen Nvidia midrange cards will have 4GB of VRAM. So that means you will see Nvidia optimising games for 4GB of memory, not 3.5GB+0.5GB. This is why the 4GB vs 8GB debate is less important; it's more what Nvidia has in the pipeline which will affect what is going to happen.


    It's also why I am dubious about the use of the Fury and Fury X for 4K, since it requires AMD to keep pushing out memory optimisations. But once the new AMD cards replace them, I am not sure how much AMD will care. That's why I would consider a GTX980 Ti the better 4K GPU.

    4GB is enough for most games, but it is probably going to be an issue at the resolutions AMD is touting, and it's the same with the GTX970 as time progresses.



    Quote Originally Posted by DanceswithUnix View Post
    And yes the Steam hardware survey is full of selection bias as most surveys are. The sales figures I have seen have just been AMD vs Nvidia though with no breakdown by model. A sale of a GTX640 is not equivalent to a sale of a 970, and a lot of cards go into PCs that won't make any real use of them (like the ones at the office I work at). The breakdown from Valve is a bit iffy, but it is about cards in use rather than cards sold which to me makes it more interesting.
    But it's very unreliable - I have only been asked once in THREE years, and I had three different cards in that period. Plus, since it is optional, the vast majority of GAMERS are not going to care. It biases towards hardware enthusiasts - if you don't believe me, then look at how common the GTX750TI and GTX960, and the GPUs in them, are in prebuilt laptops and desktops worldwide.

    The GPU in the GTX970 is hardly represented in most laptops - the GPUs in the GTX750/GTX750TI and GTX960 are.

    In fact I know FAR more GTX750TI and GTX960 users than GTX970 users, and that's among quite a reasonable number of gamers. FFS, there are more HD7900 series users than people with HD7700 or HD7800 series cards... right.

    Even going back to the HD5000 series days, I remember a discussion on Anandtech where someone showed that the survey was not picking up the change in market share caused by Nvidia basically not having DX11 cards for something like six to nine months - and there were hard sales and shipping figures to back that up too.

    Plus, what are the actual numbers of users surveyed?

    It's like those adverts which say 95% of people use our brand, and then the small print says the sample was 50 people - and I do know people who work in the market analytics field too.
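    For a sense of scale, the standard margin-of-error arithmetic (nothing specific to Steam's methodology) looks like this:

    Code:
    # Margin of error for a proportion at ~95% confidence: 1.96 * sqrt(p*(1-p)/n)
    from math import sqrt

    def margin_pct(p, n):
        return 100 * 1.96 * sqrt(p * (1 - p) / n)

    print(round(margin_pct(0.5, 50), 1))       # ~13.9 points for a 50-person sample
    print(round(margin_pct(0.5, 100_000), 2))  # ~0.31 points for a 100,000-person sample

    A large sample shrinks the random error, but it does nothing about opt-in bias - which is the real complaint about the Steam survey here.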

    The whole point is, I don't think it is a reliable way to say anything.

    The only way for it to mean anything is to make it compulsory and have a large sample number.
    Last edited by CAT-THE-FIFTH; 30-12-2015 at 02:22 PM.

  17. #12
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    13,009
    Thanks
    781
    Thanked
    1,568 times in 1,325 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: 4Gb vs 8GB

    OK Cat, I hear and understand that your old card didn't fare well compared to its AMD competitor of the time. By what percentage though? If you started off at parity and ended up 10% lower, then can you really feel 10%? If you are 25% down then feel free to rage and I will even join you.

    It also makes me wonder if that is why AMD still haven't unlocked the unused data width on the R9 380; perhaps they have a similar problem.

    Having almost hit the buy button myself on the 750 Ti, 960 and 970 at some point, I actually only know 970/980 owners and one person who is probably about to buy a 970. But then I am sure I am not statistically significant. The whole point of the Steam survey is to give games developers a heads up on what level to pitch and test their games at. Valve don't do it for fun, so I would expect, given there is profit margin involved, that they would make the survey fit for purpose for informing developers, and I would expect at least some developers to take notice of it, else Valve would stop bothering.

    Quote Originally Posted by CAT-THE-FIFTH View Post
    I was genuinely surprised seeing this:

    http://www.computerbase.de/2015-12/2...nity-1920-1080
    Well, I'm not surprised at all that 2GB is getting iffy. I was quite annoyed at buying a 2GB card this time around when my 5-year-old GTX460 was also a 2GB card, but at the time a 4GB version was 50% more expensive and that was more than I was prepared to pay. 4GB is what I would consider the sane amount right now. I'm not convinced that 8GB is worth it if HBM is going to obsolete your expensive 8GB card long before the memory wall becomes an issue.

  18. #13
    OilSheikh
    Guest

    Re: 4Gb vs 8GB

    Playing Rainbow Six Siege currently, and as soon as you want MSAA, 2GB isn't enough and causes lag!
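    Rough arithmetic on why MSAA eats VRAM so quickly (toy numbers only - a real engine keeps several render targets plus all its textures on top of this):

    Code:
    # Very rough: MSAA stores N colour + depth samples per pixel, so the main
    # render targets alone scale with the sample count.
    def target_mib(width, height, samples, bytes_per_sample=8):  # 4B colour + 4B depth
        return width * height * samples * bytes_per_sample / 2**20

    for samples in (1, 4, 8):
        print(samples, "x MSAA at 1080p:", round(target_mib(1920, 1080, samples)), "MiB")
    # 1x ~16 MiB, 4x ~63 MiB, 8x ~127 MiB - before textures, shadow maps, etc.

    On a 2GB card that headroom disappears fast once the game's textures and streaming pool are loaded on top.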

  19. #14
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,042
    Thanks
    3,909
    Thanked
    5,213 times in 4,005 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: 4Gb vs 8GB

    Quote Originally Posted by DanceswithUnix View Post
    OK Cat, I hear and understand that your old card didn't fare well compared the its AMD competitor of the time. By what percentage though? If you started off at parity and ended up 10% lower, then can you really feel 10%? If you are 25% down then feel free to rage and I will even join you

    Also makes me wonder if that is why AMD still haven't unlocked the unused data width on the R380, perhaps they have a similar problem.

    The problem is that review websites quietly dropped the GTX660 from reviews from 2014 onwards, even though it was available for quite a while after that. But you can see that in anything apart from Nvidia-sponsored games, instead of staying close to an HD7870, its performance is starting to go downhill:

    http://gamegpu.ru/images/remote/http...a_v_1920_2.jpg
    http://gamegpu.ru/images/remote/http...ew-bh_1920.jpg
    http://gamegpu.ru/images/remote/http...ew-dl_1920.jpg
    http://gamegpu.ru/images/remote/http...st-r7_1920.jpg
    http://gamegpu.ru/images/remote/http...t-jc3_1920.jpg

    That's just a few recent games, and Dying Light is Nvidia-sponsored. The GTX660 was significantly faster than a GTX750 Ti. See where it is now!

    Plus no frametime measurements either since the GTX660 is now generally not included in reviews.

    Quote Originally Posted by DanceswithUnix View Post
    Having almost hit they buy button myself on the 750ti, 960 and 970 at some point, I actually only know 970/980 owners and one person who is probably about to buy a 970. But then I am sure I am not statistically significant The whole point of the Steam survey is to give a heads up for games developers to know what level to pitch & test their games. Valve don't do it for fun, so I would expect given there is profit margin involved that they would make the survey fit for purpose of informing developers, and I would expect at least some developers to take notice of it else Valve would stop bothering.
    The problem is that people have analysed figures from different sources and the numbers don't add up - there have been multiple situations where the Nvidia or AMD card breakdown figures contradict actual retail figures! I remember when ATI/AMD launched the HD5000 series, they released the ACTUAL numbers of HD5700 and HD5800 series cards sold in one period. Interestingly enough, the Steam figures were entirely warped in favour of the higher-end cards... which is hilarious, since there were simply more OEM PCs with HD5700 series cards, and I knew FAR more people who owned something like an HD5770. But most HD5770 owners I knew were gamers rather than hardware enthusiasts, and you were more likely to be a hardware enthusiast if you had something like an HD5850. The same went for the HD6850 and HD6870, where the Steam figures were really weird - or the period where, going by the survey, AMD apparently never sold any HD6850 cards at all.

    It's simply not reliable enough to say anything IMHO, especially when it is biased towards hardware enthusiasts, since they are more likely to answer it and it's an opt-in service.

    A much more accurate measurement would be from the Windows System manager and I am sure Microsoft has that data!

    Quote Originally Posted by DanceswithUnix View Post
    Well I'm not surprised at all that 2GB is getting iffy. I was quite annoyed at buying a 2GB card this time around when my 5 year old GTX460 was also a 2GB card, but then at the time a 4GB version was 50% more expensive and that was more than I was prepared to pay. 4GB is what I would consider the sane amount right now. Not convinced that 8GB is worth it if HBM is going to obsolete your expensive 8GB card long before the memory wall becomes an issue.
    The main problem is that AMD is positioning the HBM-enabled cards as 4K capable, so in the end there are going to be the same issues with memory management.

  20. #15
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    13,009
    Thanks
    781
    Thanked
    1,568 times in 1,325 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: 4Gb vs 8GB

    Quote Originally Posted by CAT-THE-FIFTH View Post

    The main problem is that AMD is positioning the HBM-enabled cards as 4K capable, so in the end there are going to be the same issues with memory management.
    I don't think I understood that bit, though I was sloppy and should have said "HBM2" rather than just generically HBM, which makes it sound like I meant the first-gen stuff you can already buy - so perhaps you were responding to something where I didn't make sense.

  21. #16
    Registered+
    Join Date
    Jan 2016
    Location
    Ipswich
    Posts
    26
    Thanks
    0
    Thanked
    1 time in 1 post
    • SlayerMatt's system
      • Motherboard:
      • Asus P8H61-MX
      • CPU:
      • i5 2500k @3.2Ghz
      • Memory:
      • 8gb DDR3 HyperX Kingston Blue
      • Storage:
      • 240gb OCZ SSD + 1tb HGST DeskStar 7200RPM
      • Graphics card(s):
      • Gigabyte R9 285 2GB
      • PSU:
      • BeQuiet Pure Power 530w
      • Case:
      • Antec 302
      • Operating System:
      • Windows 8.1
      • Monitor(s):
      • BenQ GL2450HM
      • Internet:
      • 100mb Virgin Media Fiber

    Re: 4Gb vs 8GB

    Memory argument aside (which has been more than debated above in some detail), the VRAM is relatively meaningless if the card cannot fully utilise it. Based on current prices and benchmarks, an R9 390 and a GTX 970 look about on par regardless - so if that's your price bracket, think about which one's features you're going to use: ShadowPlay/G-Sync/PhysX or FreeSync/Mantle/etc.? Both cards, I'd argue, will stand the test of time at 1080p and probably 1440p.

    Either way, you're left with a strong card.
