Page 2 of 3 FirstFirst 123 LastLast
Results 17 to 32 of 36

Thread: id Software CTO talks about a Ryzen optimised game engine

  1. #17
    Senior Member
    Join Date
    Jul 2016
    Location
    My happy place
    Posts
    230
    Thanks
    75
    Thanked
    16 times in 14 posts
    • afiretruck's system
      • Motherboard:
      • Gigabyte X399 Designare Ex
      • CPU:
      • AMD Threadripper 1900X
      • Memory:
      • Corsair 32GB 3200MHz
      • Storage:
      • 2x 250GB NVMe + 2x 1TB SATA
      • Graphics card(s):
      • RX Vega 64 + GTX 970
      • PSU:
      • Corsair RMi 850
      • Case:
      • Fractal Design Define R6
      • Operating System:
      • Linux Mint 19
      • Monitor(s):
      • Screeny

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by nobodyspecial View Post
    I'm hoping to kick my windows habit at some point if this really takes off.
    I'm with you on that one! Really looking forward to seeing how far Linux desktops develop into gaming-friendly platforms over the next few years. Hopefully Vulkan, and more sensible games developers, will speed things along. There's just no good reason, in today's world, not to build cross-platform games.

  2. #18
    Not a good person scaryjim's Avatar
    Join Date
    Jan 2009
    Location
    Gateshead
    Posts
    15,196
    Thanks
    1,232
    Thanked
    2,290 times in 1,873 posts
    • scaryjim's system
      • Motherboard:
      • Dell Inspiron
      • CPU:
      • Core i5 8250U
      • Memory:
      • 2x 4GB DDR4 2666
      • Storage:
      • 128GB M.2 SSD + 1TB HDD
      • Graphics card(s):
      • Radeon R5 230
      • PSU:
      • Battery/Dell brick
      • Case:
      • Dell Inspiron 5570
      • Operating System:
      • Windows 10
      • Monitor(s):
      • 15" 1080p laptop panel

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by nobodyspecial View Post
    ... This was NOT the case the last time AMD took the lead for 3yrs where they won almost EVERYTHING out of the box ...
    You know, this always amuses me. There's this rose-tinted view of the Athlon 64 as simply hammering the various Pentium 4s, but if you go back through the historic reviews for the FX 57 / FX 60 that's not borne out - there are plenty of benchmarks that still favoured Intel, even back then, when AMD were widely acknowledged to have the better processors. Gaming was the Athlon's forte at the time, but even there the P4 wasn't that far behind, comparatively.

    Quote Originally Posted by nobodyspecial View Post
    ...I think AMD will sell to a lot of workstation users and hopefully that paves the way for a greater GEN2 gamer chip. It sounds like they were already working on it knowing they had a problem across the board for gamers (as reviews have shown). ..
    I think one of the clever things AMD have done this time is talk up the workstation credentials of Ryzen - hopefully that will encourage the big OEMs to put them in their enterprise offerings, and that in turn might help them get into the consumer lines (as the OEMs will already be familiar with AMD's designs & motherboards).

    As to Zen 2, I don't think AMD would agree they have a "problem" with gaming, but they have publicly stated that they understand which parts of the architecture have the most potential for optimisation* and are already working on revisions to improve those. It'll be interesting to see what they come up with, given just how competitive the first iteration is against a very mature and well-optimised Intel architecture.

    *i.e. they know where the bottlenecks are (*ahem*cross-CCX communication*ahem*)

  3. #19
    Senior Member
    Join Date
    Jul 2016
    Location
    My happy place
    Posts
    230
    Thanks
    75
    Thanked
    16 times in 14 posts
    • afiretruck's system
      • Motherboard:
      • Gigabyte X399 Designare Ex
      • CPU:
      • AMD Threadripper 1900X
      • Memory:
      • Corsair 32GB 3200MHz
      • Storage:
      • 2x 250GB NVMe + 2x 1TB SATA
      • Graphics card(s):
      • RX Vega 64 + GTX 970
      • PSU:
      • Corsair RMi 850
      • Case:
      • Fractal Design Define R6
      • Operating System:
      • Linux Mint 19
      • Monitor(s):
      • Screeny

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by scaryjim View Post
    *i.e. they know where the bottlenecks are (*ahem*cross-CCX communication*ahem*)
    Couldn't that be resolved, at least partially, by pinning the application threads to single CPUs? I know Windows likes to juggle them around when the system isn't under 100% load...

  4. #20
    Not a good person scaryjim's Avatar
    Join Date
    Jan 2009
    Location
    Gateshead
    Posts
    15,196
    Thanks
    1,232
    Thanked
    2,290 times in 1,873 posts
    • scaryjim's system
      • Motherboard:
      • Dell Inspiron
      • CPU:
      • Core i5 8250U
      • Memory:
      • 2x 4GB DDR4 2666
      • Storage:
      • 128GB M.2 SSD + 1TB HDD
      • Graphics card(s):
      • Radeon R5 230
      • PSU:
      • Battery/Dell brick
      • Case:
      • Dell Inspiron 5570
      • Operating System:
      • Windows 10
      • Monitor(s):
      • 15" 1080p laptop panel

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by afiretruck View Post
    Couldn't that be resolved, at least partially, by pinning the application threads to single CPUs? I know Windows likes to juggle them around when the system isn't under 100% load...
    From what I've read over the last couple of months, when applications don't try to set their own affinity, Windows 10 prefers to assign threads to the same CCX anyway (i.e. the Windows 10 scheduler is Ryzen-aware); it just doesn't always seem to happen, for some reason. Not sure if there'll ever be anything that can be done about that...

    One of the problems with games specifically is that they try to optimise thread placement themselves by assigning affinity, and if they get that wrong (and a number of games incorrectly interpret the cores/threads/cache layout of Ryzen) it's going to hurt performance. There was also an issue with core affinity and the way Windows 10 handles core parking that was making a mess of things. So there are a few things already being addressed (e.g. the modified power profiles AMD released recently) that will boost performance on Ryzen 1; but the best way of dealing with it is simply to speed up communication between the CCXes, which I assume will be a target for Ryzen 2.
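    On Linux the process-level version of that trick is a one-liner with os.sched_setaffinity. A minimal sketch, assuming a hypothetical layout where logical CPUs 0-7 share the first CCX (FIRST_CCX and confine_to_ccx are made-up names):

```python
import os

# Hypothetical topology: on an 8-core, 2-CCX Ryzen, logical CPUs 0-7
# commonly sit on the first CCX. This mapping is an assumption; the
# real layout is visible under /sys/devices/system/cpu/cpu*/cache.
FIRST_CCX = set(range(8))

def confine_to_ccx(pid=0):
    """Restrict a process (0 = the caller) to CPUs on one CCX, so its
    threads can't be migrated across the CCX boundary. Intersects with
    the CPUs the machine actually has before applying."""
    available = os.sched_getaffinity(pid)
    target = FIRST_CCX & available
    if target:
        os.sched_setaffinity(pid, target)
    return os.sched_getaffinity(pid)

print(confine_to_ccx())
```

    Games doing this per-thread, with a mis-detected topology, is exactly where it backfires.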

  5. #21
    Senior Member Xlucine's Avatar
    Join Date
    May 2014
    Posts
    2,162
    Thanks
    298
    Thanked
    188 times in 147 posts
    • Xlucine's system
      • Motherboard:
      • Asus prime B650M-A II
      • CPU:
      • 7900
      • Memory:
      • 32GB @ 4.8 Gt/s (don't want to wait for memory training)
      • Storage:
      • Crucial P5+ 2TB (boot), Crucial P5 1TB, Crucial MX500 1TB, Crucial MX100 512GB
      • Graphics card(s):
      • Asus Dual 4070 w/ shroud mod
      • PSU:
      • Fractal Design ION+ 560P
      • Case:
      • Silverstone TJ08-E
      • Operating System:
      • W10 pro
      • Monitor(s):
      • Viewsonic vx3211-2k-mhd, Dell P2414H
      • Internet:
      • Gigabit symmetrical

    Re: id Software CTO talks about a Ryzen optimised game engine

    Easiest solution is to wait for the APUs (as they ought to be based on a single CCX)

  6. #22
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    13,009
    Thanks
    781
    Thanked
    1,568 times in 1,325 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by kalniel View Post
    I bought a bike
    Nice! I hope the lead time is better than the 7 weeks I will have to wait for the printer


    Quote Originally Posted by Xlucine View Post
    Easiest solution is wait for the APU's (as they ought to be based on a single CCX)
    I fully expect one of the CCXs to be replaced with a block of shaders, making the overall silicon size about the same. That would mean production cost would be pretty similar to the current CPU-only part, with the only difference being whether one of them gets made in larger quantities. For laptop use it could be awesome, but on desktop you lose half the L3 cache and gain some shaders you don't care one jot about, because you already have a graphics card.

    Ryzen 2 will probably improve lots of things, but that would be a year away at best.

  7. #23
    Senior Member Xlucine's Avatar
    Join Date
    May 2014
    Posts
    2,162
    Thanks
    298
    Thanked
    188 times in 147 posts
    • Xlucine's system
      • Motherboard:
      • Asus prime B650M-A II
      • CPU:
      • 7900
      • Memory:
      • 32GB @ 4.8 Gt/s (don't want to wait for memory training)
      • Storage:
      • Crucial P5+ 2TB (boot), Crucial P5 1TB, Crucial MX500 1TB, Crucial MX100 512GB
      • Graphics card(s):
      • Asus Dual 4070 w/ shroud mod
      • PSU:
      • Fractal Design ION+ 560P
      • Case:
      • Silverstone TJ08-E
      • Operating System:
      • W10 pro
      • Monitor(s):
      • Viewsonic vx3211-2k-mhd, Dell P2414H
      • Internet:
      • Gigabit symmetrical

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by DanceswithUnix View Post
    I fully expect one of the CCXs to be replaced with a block of shaders making the overall size of the silicon about the same. That would mean the production cost would be pretty similar to the current CPU only part, with the only difference down to whether one of them gets made in larger quantities. For laptop use it could be awesome, but for desktop you lose half the L3 cache and gain some shaders that you don't care one jot for because you already have a graphics card.

    Ryzen 2 will probably improve lots of things, but that would be a year away at best.
    Ryzen is 4.8bn transistors (here), so if one CCX is about half that - 2.4bn transistors - that's a hell of a lot for a block of shaders. A 560 is 3bn (here), and a 550 is 2.2bn (here). There'll be some I/O that you won't lose, of course, but the GDDR5 memory controller on the block of shaders won't be necessary either. A system with 4 SMT-enabled cores and a 550, probably all fitting in a 95W TDP (half a 1700 + 50W for the 550), would be awesome - there'd be no point in getting a graphics card for 1080p.
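    Spelling out that transistor-budget sketch, with the caveat that treating one CCX as half the die's transistors is a rough assumption:

```python
# Transistor counts quoted in the post, in billions (not official figures).
RYZEN = 4.8   # full two-CCX Ryzen die
RX560 = 3.0
RX550 = 2.2

# The post's rough assumption: one CCX accounts for about half the die.
# (In practice much of the die is uncore: DDR controllers, PCIe lanes,
# etc., so this overstates the CCX share.)
one_ccx_budget = RYZEN / 2

print(one_ccx_budget)          # 2.4
print(one_ccx_budget > RX550)  # True: on that assumption, more budget
                               # than an entire RX 550
```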

  8. #24
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    13,009
    Thanks
    781
    Thanked
    1,568 times in 1,325 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by Xlucine View Post
    Ryzen is 4.8bn transistors (here), so if half a CCX is about half that - 2.4bn transistors is a hell of a lot for a block of shaders. A 560 is 3bn (here), and a 550 is 2.2bn (here). There'll be some of the I/O that you won't lose, of course, but the GDDR5 memory controller on the block of shaders won't be necessary either. A system with 4 SMT enabled cores and a 550, probably all fitting in a 95W TDP (half a 1700 + 50W for the 550), would be awesome - there'll be no point in getting a graphics card for 1080p.
    Go look at the die photo: the pair of CCXs are little rectangles in the middle. One CCX is a pretty small area of the die; a huge amount is taken up with DDR controllers, PCIe lanes etc. to keep the cores fed. Now, the APU could drop some of the PCIe lanes (leaving discrete CrossFire to the CPU-only parts) and maybe drop the L3 cache, but otherwise I don't see much room for shaders.

    From the other direction, the RX550 is some 80% of the size of the RX560 despite having half of the shaders, because AMD kept the same number of RAM channels, video decode etc in the two parts: http://www.anandtech.com/show/11280/...ries-polaris/2

  9. #25
    Senior Member Xlucine's Avatar
    Join Date
    May 2014
    Posts
    2,162
    Thanks
    298
    Thanked
    188 times in 147 posts
    • Xlucine's system
      • Motherboard:
      • Asus prime B650M-A II
      • CPU:
      • 7900
      • Memory:
      • 32GB @ 4.8 Gt/s (don't want to wait for memory training)
      • Storage:
      • Crucial P5+ 2TB (boot), Crucial P5 1TB, Crucial MX500 1TB, Crucial MX100 512GB
      • Graphics card(s):
      • Asus Dual 4070 w/ shroud mod
      • PSU:
      • Fractal Design ION+ 560P
      • Case:
      • Silverstone TJ08-E
      • Operating System:
      • W10 pro
      • Monitor(s):
      • Viewsonic vx3211-2k-mhd, Dell P2414H
      • Internet:
      • Gigabit symmetrical

    Re: id Software CTO talks about a Ryzen optimised game engine

    Tom's Hardware is reporting that the 550 is not a fully enabled Polaris 12: http://www.tomshardware.co.uk/amd-ra...iew-33875.html
    There's also a videocardz leak from before the 550 launched suggesting 10 CUs: https://videocardz.com/67503/amd-pol...eam-processors
    I'd also completely forgotten that the A10-78X0 APUs had 8 graphics compute units, and weren't actually all that great: http://hexus.net/tech/reviews/cpu/92...-7860k/?page=5
    DDR4 will help, as will the normal tweaks to the architecture, but it won't set the world on fire. So given that AMD have a history of A) the X50 card being ideally suited to pairing with an IGP and B) having previously fit 512 shaders in an IGP, I would be very surprised if we didn't see a Ryzen-based APU with 8 CUs

  10. #26
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    13,009
    Thanks
    781
    Thanked
    1,568 times in 1,325 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by Xlucine View Post
    So given that AMD have a history of A) the X50 card being ideally suited to pairing with an IGP and B) having previously fit 512 shaders in an IGP, I would be very surprised if we didn't see a Ryzen-based APU with 8 CUs
    We should be capable of a very educated guess here. I'll start off assuming the APU die size remains the same as Ryzen, with the same I/O.

    A single CCX is 44mm^2, less than a quarter of the Ryzen die size: http://www.realworldtech.com/forum/?...rpostid=165748
    That's a really nice figure, because according to http://www.anandtech.com/show/11280/...ries-polaris/2 the difference between the RX 560 and RX 550 is 22mm^2. If Tom's is right and there are two disabled CUs in there for a total of 10, then 22mm^2 gets you 16 - 10 = 6 CUs.

    So, 6 CUs in 22mm^2 (nice when I don't have to reach for the calculator) gives 12 CUs in the 44mm^2 area of a single CCX.

    That also lines up quite nicely with your RX 550 comment: 12 CUs fighting the CPUs for DDR4 accesses could well be on par with an 8 CU discrete GPU for performance.

    If Tom's is wrong and the 550 really only has 8 CUs, then 22mm^2 corresponds to 16 - 8 = 8 CUs, so 44mm^2 would allow for 16 CUs in total. That seems a lot.

    One thing I have skipped over is whether the APU needs extra cache for the GPU section; a couple of CUs could get knocked out in favour of cache to take pressure off DDR4, which would help bandwidth and, in laptop use, power consumption.

    So I think that's the high end; now it's just a case of what AMD feel like cutting out to save cost.
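    The back-of-envelope above, written out with the figures quoted in the post (44mm^2 per CCX, 22mm^2 between RX 560 and RX 550; cus_in_ccx_area is a made-up name):

```python
# Figures quoted in the post (forum back-of-envelope, not AMD numbers).
CCX_AREA_MM2 = 44     # one Zen CCX
DELTA_AREA_MM2 = 22   # RX 560 die area minus RX 550 die area
RX560_CUS = 16

def cus_in_ccx_area(rx550_cus):
    """CUs that would fit in one CCX's area, scaling linearly from the
    CU count difference between the RX 560 and RX 550 dies."""
    extra_cus = RX560_CUS - rx550_cus   # CUs accounting for the 22mm^2
    return CCX_AREA_MM2 * extra_cus / DELTA_AREA_MM2

print(cus_in_ccx_area(10))  # Tom's figure (10 physical CUs) -> 12.0
print(cus_in_ccx_area(8))   # leaked figure (8 CUs) -> 16.0
```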

  11. #27
    Not a good person scaryjim's Avatar
    Join Date
    Jan 2009
    Location
    Gateshead
    Posts
    15,196
    Thanks
    1,232
    Thanked
    2,290 times in 1,873 posts
    • scaryjim's system
      • Motherboard:
      • Dell Inspiron
      • CPU:
      • Core i5 8250U
      • Memory:
      • 2x 4GB DDR4 2666
      • Storage:
      • 128GB M.2 SSD + 1TB HDD
      • Graphics card(s):
      • Radeon R5 230
      • PSU:
      • Battery/Dell brick
      • Case:
      • Dell Inspiron 5570
      • Operating System:
      • Windows 10
      • Monitor(s):
      • 15" 1080p laptop panel

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by DanceswithUnix View Post
    ... that gives 12CUs in the same area 44mm^2 of a single CCX. ...
    Interestingly 12 CUs (768 shaders) matches the last rumours I heard for Zen APUs.

    Based on the performance of the 512 shaders in Kaveri (which roughly matched a 384-shader discrete R7 250 DDR3), a 768-shader APU might well be a reasonable performance match for the 512-shader RX 550 (it'll all depend on clocks and memory bandwidth, of course)...

  12. #28
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    13,009
    Thanks
    781
    Thanked
    1,568 times in 1,325 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by scaryjim View Post
    Interestingly 12 CUs (768 shaders) matches the last rumours I heard for Zen APUs.

    Based on the performance of the 512 shaders in Kaveri (which roughly matched a 384-shader discrete R7 250 DDR3), a 768-shader APU might well be a reasonable performance match for the 512-shader RX 550 (it'll all depend on clocks and memory bandwidth, of course)...
    It did sound like the fabric is already designed to have shaders plugged into it, so it would make sense if AMD can just swap out that rectangle that contains the CCX.

    What I'm wondering, though, is how they can make a low-cost AM4 option. They could do with a genuine 2-core part, rather than 4 cores with a couple disabled, and a lower CU count to compete at the low end. That would involve a new CCX design. But even if they drop to 2 cores, 6 CUs, no L3 cache and only 8 PCIe lanes, you're probably still looking at 140mm^2. Perhaps with an integrated south bridge that's acceptable, but if Skylake dual core is just under 100mm^2 (https://en.wikichip.org/wiki/intel/m...ctures/skylake) then it feels like AMD need to chop a fair bit off to be price competitive.

    Mind you, Intel's square mm are more expensive than everyone else's.

  13. #29
    Senior Member Xlucine's Avatar
    Join Date
    May 2014
    Posts
    2,162
    Thanks
    298
    Thanked
    188 times in 147 posts
    • Xlucine's system
      • Motherboard:
      • Asus prime B650M-A II
      • CPU:
      • 7900
      • Memory:
      • 32GB @ 4.8 Gt/s (don't want to wait for memory training)
      • Storage:
      • Crucial P5+ 2TB (boot), Crucial P5 1TB, Crucial MX500 1TB, Crucial MX100 512GB
      • Graphics card(s):
      • Asus Dual 4070 w/ shroud mod
      • PSU:
      • Fractal Design ION+ 560P
      • Case:
      • Silverstone TJ08-E
      • Operating System:
      • W10 pro
      • Monitor(s):
      • Viewsonic vx3211-2k-mhd, Dell P2414H
      • Internet:
      • Gigabit symmetrical

    Re: id Software CTO talks about a Ryzen optimised game engine

    Ryzen is already expensive on a per-square-mm basis: compare any Ryzen 7 to the 580. Both are fully enabled, but the 580 is ~20% bigger and comes with VRAM and a board.

  14. #30
    Banned - repeated insults to other members
    Join Date
    Feb 2015
    Posts
    146
    Thanks
    0
    Thanked
    4 times in 3 posts

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by shaithis View Post
    The problem is, you need every developer to do this.....I'm still at the point where I'd prefer to buy something that everything is optimised for out-of-the-box......surely AMD could have kept certain architectural features in-line with Intel so that every engine and piece of code didn't need specific Ryzen optimisations?

    The code used to make most games is based on previous versions of that code, dating back years. EA, for example, is well known to re-use code built in previous years, such as in its various EA Sports titles. Some games even advertise the use of engines from previous popular titles, such as the Quake engine that was used in multiple successful titles including Half-Life. And today's Unreal engine, developed 17 years ago, is still used in titles as recent as Gears of War and Bioshock, among others.

    Intel is complicit in this conspiracy and has been peddling you crap CPU architectures optimised for single-threaded work: 5 to 10% IPC improvements every year. Still happy with dual-core laptops and PCs, are you?

    Parallelization in software is still not taught early enough in most schools. It's often a chicken-and-egg issue where the software industry waits for hardware it can harness, and vice versa. Parallelization is inevitable; the likes of cloud services, VR, AI, autonomous cars etc. simply dictate that massively parallel processing will become the norm.

    AMD deliberately ditched the old paradigm and went for an architecture that delivers this parallel future at as little as half the price of Intel's legacy crap. It's amazing how many of you Intel fanboy types crap on about wanting the old stuff just because it runs legacy single-threaded 1080p games 5 to 8% faster.

    Go on, spend twice the amount of cash on Intel and see it become slower and obsolete in the next year or two. Or spend half that on the AMD Ryzen and see it become way faster over the next 3 to 5 years. No brainer to me, but you'd be surprised how many ass clowns (juvenile-gamers-spending-their-parents'-money types) are completely unable to comprehend this.

  15. #31
    Banned - repeated insults to other members
    Join Date
    Feb 2015
    Posts
    146
    Thanks
    0
    Thanked
    4 times in 3 posts

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by Primey0 View Post
    Sorry but I'm not going to invest in a CPU that promises future support from developers. Publishers/developers can be extremely lazy when it comes to pc games, you really think they all are going to go out of their way to specifically optimise for Ryzen? Ryzen might be best for id software games but I want a CPU that is good on all games not certain games. That's why Intel will be my future CPU.
    See what I mean by the types of clowns who do not understand what is happening yet?

    "The code used to make most games are based on previous versions of that code that date back years. EA Games, for example, is well known to re-use code they built from previous years, such as those used in various EA Sports titles. Some games even advertise the use of game engines used in previous popular titles such as the Quake engine that was used in multiple successful titles including Half-Life. And today's Unreal engine, which was developed 17 years ago, is still used in titles as recent as Gears of War and Bioshock among others.

    Intel is complicit in this conspiracy and has been peddling you crap CPU architectures optimised for single-threaded crap. 5 to 10% IPC improvements every year. Still happy with dual core laptops and PCs are you?

    Parallelization in software is still not taught early enough in most schools. Its often a chicken & egg issue where the software industry waits for hardware to come which they can harness and vice-versa. Parallelization is inevitable, the likes of cloud services, VR, AI, autonomous cars etc. simply dictates that massively parrallel processing will become the norm.

    AMD deliberately ditched the old paradigm and went for an architecture to deliver this parallelization future at as much as half the price of the Intel legacy crap. Its amazing how so many of you and your Intel fanboy types crap about wanting the old crap just because it runs legacy single threaded 1080p games 5 to 8% faster.

    Go on, spend twice the amount of cash on Intel and see it become slower and obsolete in the next year or two. Or spend half the amount of cash on the AMD Ryzen (which beats Intel already in many areas out of the box) and see it become way faster over the next 3 to 5 years. No brainer to me but you will be surprised how many ass clowns (juvenile-gamer-spending-their-parents’-money-types) are completely unable to comprehend this."

  16. #32
    Banned - repeated insults to other members
    Join Date
    Feb 2015
    Posts
    146
    Thanks
    0
    Thanked
    4 times in 3 posts

    Re: id Software CTO talks about a Ryzen optimised game engine

    Quote Originally Posted by Primey0 View Post
    Sorry but I'm not going to invest in a CPU that promises future support from developers. Publishers/developers can be extremely lazy when it comes to pc games, you really think they all are going to go out of their way to specifically optimise for Ryzen? Ryzen might be best for id software games but I want a CPU that is good on all games not certain games. That's why Intel will be my future CPU.
    Also, just to delight in bursting your bubble: any computer component is an expense that depreciates out of the box. It's never an investment, unless it becomes worth more.

    In that sense, AMD Ryzen is far more likely to appreciate in value, since it will become faster and faster as software developers re-code to increase parallelization (see my other post in this thread).

    Of course, not that I expect you to understand this, since the schools didn't teach parallelization early enough.
