
Thread: First build in... well, basically first build.

  1. #17
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: First build in... well, basically first build.

    Quote Originally Posted by Ulti View Post
    I can't speak for the OP, but for myself the problem is pricing. The 3800/5800 just isn't great value at the moment, and I'm patient enough to wait for it to become better value. I can live with taking a 3-4% hit to my FPS 1% of the time and keeping the ~£100 or so difference for something else.
    It's far more than a few percent. The RTX 3070 and RTX 3060 Ti are not as badly affected; the RTX 3080/RTX 3090 seem to be affected somewhat more. It actually manifests in scenarios where a game which runs at a similar speed on an AMD RX 6900 and an RTX 3090 has the latter fall behind.

    Look at the frametime consistency when you step down to a slower CPU: AMD is fine, but Nvidia shows a huge drop. In some of those instances an RTX 3070 with a faster CPU is equal to an RTX 3080 with a slower CPU.

    Testing SOTTR (Shadow of the Tomb Raider), with the FPS capped to 60.

    The RTX 3090 and RTX 3080 are very close together in raw TFLOPs. The RX 6900 XT seems to not care as much.

    The reason I suggested the Ryzen 7 5800X is that the Ryzen 5 5600X is too expensive at over £300 street price. If it were around £280 it might be worth considering.

    Also, with my GTX 1080 I saw large improvements every time I changed the CPU: Xeon E3-1230 V2, to Ryzen 5 2600, to Ryzen 7 3700X. Some of this was in games which were single-core dependent, and others where the extra cores made a difference. This is with a GTX 1080 at QHD, where you would think I am GPU limited; it manifested itself in the minimums. I only got the Ryzen 7 3700X because it was a good deal, and I was intending to buy a new dGPU.
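    To make the point about minimums concrete, here is a minimal sketch (in Python) of how an average FPS figure can look fine while the 1% lows tell a different story. The frame times are invented for illustration; real numbers would come from a capture tool such as CapFrameX or OCAT.

    Code:
        # Sketch: average FPS vs 1% lows from a frame-time log.
        # The frame times below are invented for illustration only.
        frame_times_ms = [16.7] * 95 + [40.0] * 5  # mostly smooth, a few long frames

        def average_fps(times_ms):
            return 1000 * len(times_ms) / sum(times_ms)

        def one_percent_low_fps(times_ms):
            # Common approximation: the FPS implied by the slowest 1% of frames.
            worst = sorted(times_ms, reverse=True)
            slice_1pct = worst[:max(1, len(worst) // 100)]
            return 1000 / (sum(slice_1pct) / len(slice_1pct))

        print(round(average_fps(frame_times_ms), 1))          # 56.0 - looks fine
        print(round(one_percent_low_fps(frame_times_ms), 1))  # 25.0 - visible stutter

    The average barely moves, but the handful of long frames is exactly what you feel as stutter, which is why the frametime charts matter more than the headline FPS.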

    But since you can't get dGPUs easily now, I stuck with what I have, with surprising results.
    Last edited by CAT-THE-FIFTH; 15-03-2021 at 07:05 PM.

  2. #18
    Senior Member Ulti's Avatar
    Join Date
    Feb 2009
    Posts
    2,054
    Thanks
    769
    Thanked
    230 times in 195 posts
    • Ulti's system
      • Motherboard:
      • MSI B550I Gaming Edge
      • CPU:
      • AMD Ryzen 7 3700X
      • Memory:
      • Kingston 32GB HyperX 3200Mhz
      • Storage:
      • Corsair MP510 1920GB
      • Graphics card(s):
      • Nvidia RTX 3060 Ti FE
      • PSU:
      • SilverStone SX500-LG V2.0
      • Case:
      • SSUPD Meshlicious
      • Operating System:
      • Windows 10 Pro
      • Monitor(s):
      • AOC Agon AG322QC4 31.5"
      • Internet:
      • TalkTalk Fibre 150Mb

    Re: First build in... well, basically first build.

    Quote Originally Posted by CAT-THE-FIFTH View Post
    It's far more than a few percent. The RTX 3070 and RTX 3060 Ti are not as badly affected; the RTX 3080/RTX 3090 seem to be affected somewhat more. It actually manifests in scenarios where a game which runs at a similar speed on an AMD RX 6900 and an RTX 3090 has the latter fall behind.

    Look at the frametime consistency when you step down to a slower CPU: AMD is fine, but Nvidia shows a huge drop. In some of those instances an RTX 3070 with a faster CPU is equal to an RTX 3080 with a slower CPU.

    Testing SOTTR (Shadow of the Tomb Raider), with the FPS capped to 60.

    The RTX 3090 and RTX 3080 are very close together in raw TFLOPs. The RX 6900 XT seems to not care as much.

    The reason I suggested the Ryzen 7 5800X is that the Ryzen 5 5600X is too expensive at over £300 street price. If it were around £280 it might be worth considering.

    Also, with my GTX 1080 I saw large improvements every time I changed the CPU: Xeon E3-1230 V2, to Ryzen 5 2600, to Ryzen 7 3700X. Some of this was in games which were single-core dependent, and others where the extra cores made a difference. This is with a GTX 1080 at QHD, where you would think I am GPU limited; it manifested itself in the minimums. I only got the Ryzen 7 3700X because it was a good deal, and I was intending to buy a new dGPU.

    But since you can't get dGPUs easily now, I stuck with what I have, with surprising results.
    I could be interpreting the graphs wrong, but I was looking at the 6-core versus 8-core results from Igor'sLAB that you linked earlier, and I only saw small hits. I thought the comparison was the 3600 versus the 3700X/3800X, not the 3600 versus the 5800X.

    I get your point that with consoles all being on 8 cores/16 threads we should maybe look at the difference going from 2 to 4 cores, and 4 to 6 cores, etc., but I think that's just too extreme, and those tests aren't representative of a 3600 versus a 3700X/3800X paired with an RTX 3080.

    I think if the OP was not planning on upgrading and it was a buy-once-and-forget-for-5-years thing, I would agree with getting a 3700X/5800X, but if they're happy to get the 3600 for now, see how it lasts in a year or two, and upgrade if needed, that would be the more price-efficient approach. Obviously that doesn't factor in the effort required to sell the 3600, look for a 5800X, install it, etc.

  3. #19
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: First build in... well, basically first build.

    Quote Originally Posted by Ulti View Post
    I could be interpreting the graphs wrong, but I was looking at the 6-core versus 8-core results from Igor'sLAB that you linked earlier, and I only saw small hits. I thought the comparison was the 3600 versus the 3700X/3800X, not the 3600 versus the 5800X.

    I get your point that with consoles all being on 8 cores/16 threads we should maybe look at the difference going from 2 to 4 cores, and 4 to 6 cores, etc., but I think that's just too extreme, and those tests aren't representative of a 3600 versus a 3700X/3800X paired with an RTX 3080.

    I think if the OP was not planning on upgrading and it was a buy-once-and-forget-for-5-years thing, I would agree with getting a 3700X/5800X, but if they're happy to get the 3600 for now, see how it lasts in a year or two, and upgrade if needed, that would be the more price-efficient approach. Obviously that doesn't factor in the effort required to sell the 3600, look for a 5800X, install it, etc.
    Yes, I did suggest that they could see how it goes with the Ryzen 5 3600, but I do think it will probably need upgrading at some point even with the same GPU. I am just trying to point out that they can hit a bad CPU bottleneck even now in certain titles with Ampere GPUs, and AMD actually seems not to have this issue. I am going to follow this some more. I was roughly 50:50 on getting AMD or Nvidia, for different reasons (more VRAM and lower power consumption with AMD; better RT performance and more SFF GPUs with Nvidia). However, if Nvidia is more CPU limited in modern titles, I might have to take that into consideration.

    The shift in less than 3 years between the Core i5 7600K and the Ryzen 5 1600 is really interesting. Essentially we have gone from 4C/4T being fine to 6C/12T being the sweet spot. The titles tested were console-oriented, which is why you are seeing this. The issue, though, is that you really want a PC to out-spec a console. The Ryzen 5 3600 and the console SoCs share the same Zen 2 cores, but the consoles have more of them. The Ryzen 5 has more L3 cache, but I suspect the consoles, which have access to huge amounts of GDDR6 memory bandwidth, don't really need it. At the moment we are going through the intergenerational period, but maybe when AMD releases Zen 4 and Intel releases Alder Lake we will see more cores on desktop, which should push Zen 3 pricing down a bit.

  4. Received thanks from:

    Ulti (15-03-2021)

  5. #20
    Registered+
    Join Date
    Mar 2021
    Posts
    32
    Thanks
    33
    Thanked
    4 times in 4 posts

    Re: First build in... well, basically first build.

    Thanks again, all, for your contributions here.

    Quote Originally Posted by cptwhite_uk View Post
    In terms of FPS it really depends on the type of games you're playing. Personally I can't really imagine playing at less than 60fps in any game these days. I think 90fps is where the payoff starts to diminish; the law of diminishing returns comes into play above that. By that I mean that to "feel" the same difference as between 60 and 90, you'd need to get to 140-150fps to perceive a similar improvement in fluidity - going from 90 to 120 doesn't feel as dramatic as 60 to 90, even though you're adding 30fps with each jump.
    Yeah, this makes sense: from 90 to 120 the 30fps jump is no longer a 50% increase, so it's not going to have the same impact.
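    The same thing shows up in frame times rather than percentages. Here is a quick sketch; the thresholds people can actually "feel" vary from person to person, so treat the numbers as illustration only.

    Code:
        # Sketch: the time saved per frame shrinks as the frame rate rises,
        # one common explanation for diminishing returns at high FPS.
        def frame_time_ms(fps):
            return 1000 / fps

        for low, high in [(60, 90), (90, 120), (120, 150)]:
            saved = frame_time_ms(low) - frame_time_ms(high)
            print(f"{low} -> {high} fps saves {saved:.2f} ms per frame")

        # 60 -> 90 fps saves 5.56 ms per frame
        # 90 -> 120 fps saves 2.78 ms per frame
        # 120 -> 150 fps saves 1.67 ms per frame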

    Quote Originally Posted by Ulti View Post
    My situation is different, though, in that my GPU is a lot weaker than yours, but if I had an RTX 3080 FE and I was on a budget, I would go for the 3600 too and worry about upgrading in the future if needed, as it should be a simple drop-in process.

    The 5800 looks really good performance-wise, so maybe in the future you can grab one for cheap, and not having to change the motherboard or anything else should make it easy (depending on your cooler choice it may be a bit of a hassle to uninstall the cooler, though).
    That is the only part I'd really worry about with the plan to upgrade the CPU at some point: taking off the cooler and presumably having to mess with cleaning up and reapplying thermal paste.

    Quote Originally Posted by CAT-THE-FIFTH View Post
    I don't disagree at all that the Ryzen 5 3600 is excellent value!

    However, if you are running older games, which are more single-core dependent, a Ryzen 5 3600 is easily beaten by a sub-£200 Core i5 10600K. Stuff like Fallout, Skyrim, etc. will generally run much better on the latter, especially if you're modding games. I say this as someone on Zen 2 with a "lowly" GTX 1080; it can be seen even with such a GPU.
    I had a looksee at the Intel stuff. I couldn't find the 10600K under £200, only the standard 10600 and the KF version, and I have no idea about the differences. I did also find the 8-core i7 10700F at £240; I don't know how that stacks up against the AMD 8-cores.

    Quote Originally Posted by CAT-THE-FIFTH View Post
    Try your best to get the Crucial kit, as it has a very high chance of being Micron E-die. I would do some research on the PNY XLR8 kit and see what memory ICs it uses. Basically, Micron E-die and Samsung B-die seem to be the safest choices with Zen 2 and Zen 3. Hynix CJR is not as good, but you can tune it a bit. Ideally you want at least 3200MHz DDR4 tuned to CL14, or 3600MHz DDR4 at CL16 or lower.
    Managed to get the 16GB Crucial Ballistix Red 3200MHz kit for £65 off Amazon the other day, so that's sorted now. I think it's still CL16 at 3200MHz, though.
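    For what it's worth, the usual back-of-the-envelope comparison of those kits is first-word latency. This is only a rough sketch: it ignores subtimings and the Infinity Fabric clock, which also matter on Ryzen.

    Code:
        # Sketch: first-word latency of a DDR4 kit in nanoseconds.
        # latency_ns = CAS latency / memory clock, ignoring subtimings.
        def first_word_latency_ns(mt_per_s, cas):
            clock_mhz = mt_per_s / 2   # DDR makes two transfers per clock
            return cas / clock_mhz * 1000

        for mt, cl in [(3200, 16), (3200, 14), (3600, 16)]:
            print(f"DDR4-{mt} CL{cl}: {first_word_latency_ns(mt, cl):.2f} ns")

        # DDR4-3200 CL16: 10.00 ns
        # DDR4-3200 CL14: 8.75 ns
        # DDR4-3600 CL16: 8.89 ns

    So a 3200MHz CL16 kit is only around 1ns behind the tuned targets mentioned above, and Micron E-die can often be tightened further with manual tuning.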

    Quote Originally Posted by CAT-THE-FIFTH View Post
    Igor'sLAB tested HZD (Horizon Zero Dawn):
    https://www.igorslab.de/en/driver-ov...own-drivers/5/

    Under DX12/Vulkan, AMD GPUs have fewer CPU issues; Nvidia seems to have more CPU issues now. AMD is more bottlenecked under DX11, but most games implementing RT use DX12/Vulkan, and Nvidia is showing problems already.

    It seems Igor thinks Nvidia's async compute is not working properly.

    It's not about FPS, but FPS variance, especially minimums. It's very easy to see CPU bottlenecks at 1440p and 4K because they manifest in poor frametimes and minimums, so the FPS looks OK but it's not smooth. This is not going to improve as time progresses - the consoles have a CPU with 8 Zen 2 cores running at 3.6~3.8GHz, which is a big upgrade over the previous generation.
    Quote Originally Posted by CAT-THE-FIFTH View Post
    Evidence which shows issues with higher-end Nvidia dGPUs and lower-end CPUs. Some of those frametime variances seem to be more of an issue with Ampere than with an RX 6800 XT.

    This is the same issue GCN had under DX11 against Maxwell and Pascal. Nvidia enabled better DX11 multi-threaded driver support, so until Vega/RDNA1, Nvidia simply scaled better than AMD in DX11 games with a slower CPU.

    Now it seems that with DX12/Vulkan the same is happening to Ampere, which has gone for a very GCN-like approach with a ton of shaders. Remember, RDNA1/RDNA2 use hardware scheduling, and AFAIK Ampere is still using a software scheduler.

    For the last 10 years I have always matched my CPU and GPU upgrades together, i.e. if I have a slower GPU I can get away with a slower CPU. Getting hold of this GTX 1080 broke that typical mould: my Xeon E3-1230 V2 was definitely a bottleneck even at QHD. When I upgraded to a Ryzen 5 2600 it was noticeably faster, and the Ryzen 7 3700X shows CPU gains at QHD. I was really surprised, and in some ways I should have gotten a Core i7 8700 at the start.
    Quote Originally Posted by CAT-THE-FIFTH View Post
    It's far more than a few percent. The RTX 3070 and RTX 3060 Ti are not as badly affected; the RTX 3080/RTX 3090 seem to be affected somewhat more. It actually manifests in scenarios where a game which runs at a similar speed on an AMD RX 6900 and an RTX 3090 has the latter fall behind.

    Look at the frametime consistency when you step down to a slower CPU: AMD is fine, but Nvidia shows a huge drop. In some of those instances an RTX 3070 with a faster CPU is equal to an RTX 3080 with a slower CPU.

    Testing SOTTR (Shadow of the Tomb Raider), with the FPS capped to 60.

    The RTX 3090 and RTX 3080 are very close together in raw TFLOPs. The RX 6900 XT seems to not care as much.
    I'm afraid all this stuff is a bit over my head. The images in the last post perhaps make it a bit clearer: there are drops of around 10-20 frames between generations of 600-level processors, but it's also not showing a 3080 or a 3600, so I guess that would sit somewhere between the 2600 with a 3070 and the 5600 with a 3090. At any rate it will be monumentally better than the pushing-ten-year-old PC I'm using with a 650 Ti, whatever problems it may have.

    Quote Originally Posted by CAT-THE-FIFTH View Post
    Remember, we are at an intergenerational time in games. We went from a 4C/4T Core i5 7600K thrashing a 6C/12T Ryzen 5 1600 in 2017 to the latter now showing increasingly consistent gameplay despite losing in single-core performance. That happened in 3 years, in an era of consoles which had 8 Atom-class cores. Now we have a console era with double the thread count and a few times the single-core performance. The consoles basically have the CPU performance of a downclocked Ryzen 7 3700X (especially the Xbox Series X). Games which you see now are using the older console generation as the baseline, but over the next 12~24 months it's going to shift increasingly to the next generation.

    Hardware Unboxed were one of the few to point out this would happen with the Ryzen 5 1600, and it happened.

    So as long as you realise the CPU is a stop-gap then it's OK, but I wouldn't promise you won't have issues over time. The AM4 platform does have good options, so if the Zen 3 CPUs drop in price it should be an easy upgrade.
    Quote Originally Posted by CAT-THE-FIFTH View Post
    I just think that if you can find £600 for a dGPU, a £150 spend on the CPU is a bit low. To put it in context, it would have made for a more balanced system to get an RTX 3070 FE for £470 and a better CPU. A good CPU will last more than one GPU upgrade.
    Quote Originally Posted by CAT-THE-FIFTH View Post
    Yes, I did suggest that they could see how it goes with the Ryzen 5 3600, but I do think it will probably need upgrading at some point even with the same GPU. I am just trying to point out that they can hit a bad CPU bottleneck even now in certain titles with Ampere GPUs, and AMD actually seems not to have this issue. I am going to follow this some more. I was roughly 50:50 on getting AMD or Nvidia, for different reasons (more VRAM and lower power consumption with AMD; better RT performance and more SFF GPUs with Nvidia). However, if Nvidia is more CPU limited in modern titles, I might have to take that into consideration.

    The shift in less than 3 years between the Core i5 7600K and the Ryzen 5 1600 is really interesting. Essentially we have gone from 4C/4T being fine to 6C/12T being the sweet spot. The titles tested were console-oriented, which is why you are seeing this. The issue, though, is that you really want a PC to out-spec a console. The Ryzen 5 3600 and the console SoCs share the same Zen 2 cores, but the consoles have more of them. The Ryzen 5 has more L3 cache, but I suspect the consoles, which have access to huge amounts of GDDR6 memory bandwidth, don't really need it. At the moment we are going through the intergenerational period, but maybe when AMD releases Zen 4 and Intel releases Alder Lake we will see more cores on desktop, which should push Zen 3 pricing down a bit.
    Quote Originally Posted by CAT-THE-FIFTH View Post
    But since you can't get dGPUs easily now, I stuck with what I have, with surprising results.
    And this is the crux of the problem, really: it's just a terrible time to need to upgrade. Being choosy about your GPU at the moment potentially means spending either a long time or a lot of money. I took the 3080 because I had the chance to just get a decent GPU and be done with it. If I'd had my choice I would have gone for one of the lower cards for less money, but I was willing to accept the 3080 for £660, the same price as (at best) the 3070 AIB cards I'd seen popping up.

    So I think I'd rather stick with the plan to start with the 3600 for now, accepting that there's a good chance I'll want to upgrade it before long. I admit, though, that I am still a little torn, because it's obviously more convenient to just plop the 8-core in now and be done with it; it's just a hefty price tag for that 5800, which would last the longest.

  6. #21
    Registered+
    Join Date
    Mar 2021
    Posts
    32
    Thanks
    33
    Thanked
    4 times in 4 posts

    Re: First build in... well, basically first build.

    It occurs to me that it may also be pertinent that my 4K screen is an LG OLED TV with HDMI 2.1. Does that limit my maximum frame rate to 120? The display stuff is something I haven't really looked into much yet.

  7. #22
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,025
    Thanks
    1,871
    Thanked
    3,383 times in 2,720 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: First build in... well, basically first build.

    Quote Originally Posted by Drago MkII View Post
    It occurs to me that it may also be pertinent that my 4K screen is an LG OLED TV with HDMI 2.1. Does that limit my maximum frame rate to 120? The display stuff is something I haven't really looked into much yet.
    HDMI 2.1 is fine for 4K at 144Hz. Whether your specific monitor supports it I don't know - there have been several OLED LG TV models.
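    As a rough sanity check on the bandwidth side, here is a sketch. The total-timing figures (active pixels plus blanking) are approximations for illustration, not LG's exact values.

    Code:
        # Sketch: uncompressed video bandwidth vs HDMI 2.1's 48 Gbps FRL link.
        # 3840x2160 active pixels; ~4400x2250 total timing assumed with blanking.
        def gbps(h_total, v_total, refresh_hz, bits_per_pixel):
            return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

        for hz, bpp, label in [(120, 24, "4K120 8-bit RGB"),
                               (120, 30, "4K120 10-bit RGB"),
                               (144, 24, "4K144 8-bit RGB")]:
            print(f"{label}: {gbps(4400, 2250, hz, bpp):.1f} Gbps")

        # 4K120 8-bit RGB: 28.5 Gbps
        # 4K120 10-bit RGB: 35.6 Gbps
        # 4K144 8-bit RGB: 34.2 Gbps

    All of those fit within HDMI 2.1's 48Gbps link (roughly 42.7Gbps of usable payload after encoding overhead), which is why uncompressed 4K120 works over a single cable.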

  8. Received thanks from:

    Drago MkII (20-03-2021)

  9. #23
    Registered+
    Join Date
    Mar 2021
    Posts
    32
    Thanks
    33
    Thanked
    4 times in 4 posts

    Re: First build in... well, basically first build.

    Quote Originally Posted by kalniel View Post
    HDMI 2.1 is fine for 4K at 144Hz. Whether your specific monitor supports it I don't know - there have been several OLED LG TV models.
    Thanks. It's an LG E9 OLED. I missed it when I tried to search before, but it looks like it has a native 120Hz panel. Not that I'm expecting to get near that at 4K given my specs and all that's been discussed; I just wasn't sure if it might have been lower than that, or affected some other way by it being a TV rather than a monitor.

