Page 3 of 6
Results 33 to 48 of 90

Thread: Normal Witcher 3 performance is possible on AMD GPUs

  1. #33
    Comfortably Numb directhex's Avatar
    Join Date
    Jul 2003
    Location
    /dev/urandom
    Posts
    17,074
    Thanks
    228
    Thanked
    1,027 times in 678 posts
    • directhex's system
      • Motherboard:
      • Asus ROG Strix B550-I Gaming
      • CPU:
      • Ryzen 5900x
      • Memory:
      • 64GB G.Skill Trident Z RGB
      • Storage:
      • 2TB Seagate Firecuda 520
      • Graphics card(s):
      • EVGA GeForce RTX 3080 XC3 Ultra
      • PSU:
      • EVGA SuperNOVA 850W G3
      • Case:
      • NZXT H210i
      • Operating System:
      • Ubuntu 20.04, Windows 10
      • Monitor(s):
      • LG 34GN850
      • Internet:
      • FIOS

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by CAT-THE-FIFTH View Post
    but when has GTX960 been just behind a GTX780??
    http://www.3dmark.com/compare/fs/4740806/fs/2757339

    When compared directly to each other, I guess? Note especially how much better the 960 is on the physics benches than the 780, which is relevant given this discussion is all about GameWorks.

  2. #34
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by directhex View Post
    http://www.3dmark.com/compare/fs/4740806/fs/2757339

    When compared directly to each other, I guess? Note especially how much better the 960 is on the physics benches than the 780
    Fail - the graphs are with GameWorks off, so with reduced tessellation and a much lower physics load.

    And yet the AMD cards are doing better, even though they generally had worse tessellation in synthetic benchmarks overall.

    R9 280X = GeForce Titan, right??

    GeForce Titan only 20% faster than a GTX960??

    Yeah, a compute-enabled GK110 card.

    People have run older drivers on Kepler cards as a check - zero change.

    Nvidia is basically not bothering to optimise for Kepler any more, within 5 to 9 months.

    Awesome driver support, right!

    Even looking at this post on Reddit, it increasingly looks like Kepler is being left by the wayside:

    https://www.reddit.com/r/buildapc/co...ler_cards_has/

    Yet no one seems to be commenting on this.

    Built-in obsolescence within a few months.

    Yet only one company gets the flak for drivers and lack of performance from drivers.

    At this rate, I think I am just going to play Indie games.

    Just getting fed up with the STUPID E-PEEN enabling marketing from BOTH companies.

    For what??

    Games which only look like HD versions of their console counterparts??

    I want Crytek, circa 2006, back.
    Last edited by CAT-THE-FIFTH; 22-05-2015 at 12:01 PM.

  3. #35
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by CAT-THE-FIFTH View Post
    R9 280X = Geforce Titan,right??
    Perhaps more surprisingly, 290X = 780Ti SLI too!

  4. #36
    Anthropomorphic Personification shaithis's Avatar
    Join Date
    Apr 2004
    Location
    The Last Aerie
    Posts
    10,857
    Thanks
    645
    Thanked
    872 times in 736 posts
    • shaithis's system
      • Motherboard:
      • Asus P8Z77 WS
      • CPU:
      • i7 3770k @ 4.5GHz
      • Memory:
      • 32GB HyperX 1866
      • Storage:
      • Lots!
      • Graphics card(s):
      • Sapphire Fury X
      • PSU:
      • Corsair HX850
      • Case:
      • Corsair 600T (White)
      • Operating System:
      • Windows 10 x64
      • Monitor(s):
      • 2 x Dell 3007
      • Internet:
      • Zen 80Mb Fibre

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    PhysX is used in a lot of titles, just not running on the GPU for some crazy reason. It is even used in a ton of console games.

    I guess it is now more a selling point of GameWorks than a selling point for GPUs.......which probably makes it more profitable for nVidia in the long run.

    The Kepler optimisations are a tad worrying, especially as I own 3 Kepler cards, and even more so because the graphs I have seen showing Kepler cards under-performing have been in GameWorks titles (PC and W3)....and while that is only two titles, they are 2 extremely new titles.....so I am not going to get too stressed about it yet (especially when PC still runs really well on a 780 @ 1080p).

    I think I need to see more "form" before deciding that the sky is falling.......and I am not seeing much on the GeForce forums yet; normally performance issues generate quite an amount of traffic there.

    As for voting with your wallet, it's becoming increasingly difficult to do that as the number of GameWorks titles increases........you cannot stop developers using the tools they feel they need, and buying an AMD card now comes with a bit of a GameWorks gamble. What we really need is the same thing I have been saying for over a year: AMD to work more closely with devs. They spent so much money trying to get Mantle out there, when all that did was make their APUs more acceptable in very particular circumstances and help alleviate some Crossfire CPU bottlenecks (so the extreme low and high ends). I can't help but feel that they could have spent that money better (or just spent more) on hiring people to go and work with software houses on optimising titles. I have heard time and time again devs saying they have worked with nVidia, but the only time I've really heard it about AMD was during the Mantle implementations on the Frostbite engine.

    At the end of the day, I see the biggest problem-makers as the devs. They choose the tools, they write the code....it is ultimately their code that dictates features and performance.....and at the moment it seems they are lapping up the GameWorks libraries.
    Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
    HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
    HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
    Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
    NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
    Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive

  5. #37
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by watercooled View Post
    And the reason for equally terrible Kepler performance?

    Edit: Ninja'd

    A lot of people are treating this as just an AMD vs Nvidia thing. It really isn't.
    No it isn't, it's a tessellation issue.
    Nvidia cards before the GTX 680 sucked at tessellation, and all AMD cards suck at tessellation.

    Quote Originally Posted by watercooled View Post
    A lot of it is, but that's kinda the point isn't it?
    No. The point is for a company to make a profit, and you don't do that by giving away any advantage you have over your competitors for free. OTOH, if the market share is evenly distributed then it makes more sense to work with your competitors.

    Quote Originally Posted by CAT-THE-FIFTH View Post
    Thats the point - if Nvidia had just opened PhysX a bit it would probably be more widely used. Its hilarious that despite all the physics demos for the last decade,its just a random feature doing not much in games.
    Totally agree, but what we're seeing is what happens when one company has a near monopoly.
    When Microsoft had the lion's share of the OS market, the last thing they would do is give away the source code for the Win32 binaries, as that's one of their main competitive advantages over the likes of Linux and Apple. If they open-sourced the Win32 binaries, people would have no reason to stick with Windows; they could run their Windows software on any OS they wanted. That's great for consumers but terrible for Microsoft's bottom line.

  6. #38
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by watercooled View Post
    Perhaps more surprisingly, 290X = 780Ti SLI too!
    Well, if you only really care about the latest cards, which a fraction of your userbase own, I suppose it makes sense, even though you only stopped really selling the older ones in volume a scant few months ago.

    But, pfft, I am going to stick to playing Pillars of Eternity, a bit of PS2, some D3 and some Ingress.

    This whole gen of cards is a bore, and no games released this year appear to be that interesting now. Borderlands: The Pre-Sequel! was the only game released last year I really had any interest in. W3 has all this politics and still looks downgraded compared with the initial previews.

    Watch Dogs I was all excited about, and it ended up a damp squib.

    I have the money to buy a GTX970/GTX980/R9 290X for these games. What's the point now??

    You might as well buy a PS4 for them. So they can do one too.

    Going to just spend it on a holiday or one of my other hobbies now.

    All this hype for these games' graphics, yet when released they need excessive hardware compared to the console versions, don't look anywhere near as great as the previews, and serve as hype trains for hardware companies.

    At least Crysis delivered in the visual department and still does even now. At least I could see the point of spending money on upgrades for it.

    It's over 8 years!!

    For 8+ years of graphics evolution, the newer games look rubbish for the amount of hardware they need.

    If Fallout 4 and Cyberpunk 2077 (by CDPR too) end up as damp squibs, then I might as well stick to Indie games, which tend not to care what they run on or need uber hardware. They might look crapper, but at least it's in line with the hardware you need.

    Quote Originally Posted by shaithis View Post
    Physx is used in a lot of titles, just not running on the GPU for some crazy reason. It is even used in a ton of console games.

    I guess it is now more a selling point of GamesWork rather than a selling point for GPUs.......which probably makes it more profitable to nVidia in the long run.

    The Kepler optimisations are a tad worrying, especially as I own 3 kepler cards even more so when the graphs I have seen that show kepler cards under-performing have been GamesWork titles (PC and W3)....and while that is only two titles, they are 2 extremely new titles.....so I am not going to get too stressed about it yet (especially when PC still runs really well on a 780@1080p)

    I think I need to see more "form" before deciding that the sky is falling.......and I am not seeing much on the geforce forums yet, normally performance issues generate quite an amount of traffic there.

    As for voting with your wallet, it's becoming increasingly more difficult to do that as the amount of GamesWork titles increases........you cannot stop developers using the tools they feel they need and buying an AMD card now comes with a bit of a GamesWork gamble. What we really need, is the same thing I have been saying for over a year: AMD to work closer with devs. They spent so much money trying to get Mantle out there, when all that did was make their APUs more acceptable in very particular circumstances and help alleviate some Crossfire CPU-bottlenecks (so extreme low and high ends). I can't help but feel that they could have spent that money better (or just spent more) on hiring people to go and work with software houses on optimising titles. I have heard time and time again devs saying they have worked with nVidia but the only time I've really heard it about AMD was during the Mantle implementations on the Frostbite engine.

    At the end of the day, I see the biggest problem makers to be the devs. They choose the tools, they write the code....it is ultimately their code that dictates features and performance.....and at the moment it seems they are lapping up the GamesWork libraries.
    Well, AMD is not helping itself either - I get the impression that in the last few months their XFire profiles have become more erratic, and it appears that, from a high in 2013, they are not involved in nearly as many games as before. I just think all the issues with the CPU side of their business are starting to drag down the graphics side now.

    The thing is, I also get the impression it might be ease and cost that are the reason some of the GameWorks features are seeing increased uptake??

    Having said that, it's been mostly in Ubisoft titles, and I believe that CDPR has tended to be closer to Nvidia anyway?? Could be wrong on the latter though.
    Last edited by CAT-THE-FIFTH; 22-05-2015 at 12:35 PM.

  7. #39
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by shaithis View Post
    Physx is used in a lot of titles, just not running on the GPU for some crazy reason. It is even used in a ton of console games.
    GPU PhysX effects != PhysX. The PhysX physics engine is used by a relatively large number of games; the GPU part is something different.


    Quote Originally Posted by Corky34 View Post
    No it isn't, it's a tessellation issue.
    Nvidia cards before the GTX 680 sucked at tessellation, and all AMD cards suck at tessellation.
    Blatantly untrue, as I explained earlier. And again, why then do cards including the 680 and newer also run very badly?

    Quote Originally Posted by Corky34 View Post
    No. The point is for a company to make a profit and you don't do that by giving away any advantage you have over your competitors for free, OTH if the market share is evenly distributed then it makes more sense to work with your competitors.
    By restricting PhysX access all they've managed is to destroy its potential. I don't see how that's an advantage.

    Quote Originally Posted by Corky34 View Post
    Total agree, but what we're seeing is what happens when one company has a near monopoly.
    The GPU market isn't even close to a near-monopoly. If anything, Intel is in the lead by far in terms of market share.

  8. #40
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by watercooled View Post
    Blatantly untrue as I explained earlier. And again, why then do cards including the 680 and newer also run very badly?
    Perhaps you can point me in the direction of the post where you explained that's untrue; I can't find the benchmarks you posted showing the differences between something like the 680 and 670 with HairWorks on.

    Quote Originally Posted by watercooled View Post
    By restricting PhysX access all they've managed is to destroy its potential. I don't see how that's an advantage.
    Because arguably they've sold more cards than their competitor; whether that's entirely down to PhysX is neither here nor there.
    What is certain is that if they didn't keep PhysX as a proprietary system, customers would have less reason to choose their cards over their competitors'.

    Just like Microsoft keeping the Win32 binaries proprietary has caused customers to choose Windows over any other OS.

    Quote Originally Posted by watercooled View Post
    The GPU market isn't even close to a near-monopoly. If anything, Intel is in the lead by far in terms of market share.
    Yeah, because Intel's IGPUs are really great at running games on anything above the lowest settings, ain't they.
    Like it or not, the market share of discrete GPUs, about the only type of GPU that can run modern games at reasonably high settings, is divided between two manufacturers, and (arguably) that's a 20-80% split at best.

  9. #41
    Anthropomorphic Personification shaithis's Avatar
    Join Date
    Apr 2004
    Location
    The Last Aerie
    Posts
    10,857
    Thanks
    645
    Thanked
    872 times in 736 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by watercooled View Post
    GPU PhysX effects != PhysX. The PhysX physics engine is used by a relatively large amount of games, the GPU part is something different.
    That's not my understanding; IMO, PhysX is PhysX. It's just a matter of whether the developer allows it to run on the GPU or not. Surely they just make API calls and the PhysX engine decides how it is going to run them.

    I cannot see any other way the "Run on CPU or GPU" option on nVidia hardware would work, or how the same code could run on both GPU-accelerated and CPU-only configurations.

  10. #42
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by Corky34 View Post
    Perhaps you can point me in the direction of the post where you explained that's untrue, I can't find the benchmarks you posted showing the differences between something like the 680 and 670 with HairWorks on.
    Kepler performance is indisputably bad regardless of HairWorks. There's nothing at all wrong with tessellation performance on GCN - if devs choose to use idiosyncrasies of the architecture to favour one over the other, that's their problem. If a game ran Bitcoin-like code in the background, would you blame Nvidia for 'sucking' because of their lower compute throughput? That's pretty much exactly what excessive tessellation achieves, as seen in the likes of Batman/Crysis 2, as I did explain earlier - it provably did nothing to improve image quality, just harmed performance on AMD more than Nvidia at the time.
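
    The over-tessellation point can be sketched with some rough arithmetic (hypothetical numbers, not taken from any benchmark): under a simple uniform-subdivision model, triangle count grows with the square of the tessellation factor, so at high factors triangles shrink well below a pixel and the extra geometry work buys no visible detail.

```python
# Back-of-envelope model of over-tessellation (illustrative only).
# Assumptions: uniform subdivision (triangles ~ factor^2) and a mesh
# covering about half of a 1080p frame - both made up for this sketch.

def triangles_after_tessellation(base_triangles: int, factor: int) -> int:
    # Uniform subdivision: triangle count scales with factor squared.
    return base_triangles * factor ** 2

def avg_pixels_per_triangle(screen_pixels: int, triangles: int,
                            coverage: float = 0.5) -> float:
    # Average on-screen area each triangle ends up covering.
    return (screen_pixels * coverage) / triangles

screen = 1920 * 1080   # 1080p render target
base = 10_000          # hypothetical hair/cape mesh before tessellation

for factor in (8, 16, 64):
    tris = triangles_after_tessellation(base, factor)
    px = avg_pixels_per_triangle(screen, tris)
    print(f"factor {factor:>2}: {tris:>10,} triangles, {px:.3f} px/triangle")
```

    With these made-up numbers, factor 8 already leaves only a pixel or two per triangle; by factor 64 each triangle averages a small fraction of a pixel, which is pure vertex/setup overhead with no image-quality return - and it hits whichever architecture has the weaker geometry throughput hardest.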

    Quote Originally Posted by Corky34 View Post
    Because arguably they've sold more cards than their competitor, whether that's entirely down to PhysX is neither hear nor there.
    What is certain is that if they didn't keep PhysX as a proprietary system customers would have less reason to choose their cards over their competitors.
    Besides a couple of rabid fanbois, who is really likely to make a purchase decision based on a very minor bit of visual effects used in about two games?

    Quote Originally Posted by Corky34 View Post
    Just like Microsoft keeping the win32 binaries proprietary has caused customers to choose Windows over any other OS.
    Apples and oranges. iOS is a very closed ecosystem yet the more open Android has the greater market share.

  11. #43
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by shaithis View Post
    That's not my understanding, IMO, Physx is Physx. It's just a matter of whether the developers allows it to run on the GPU or not. Surely, they just make API calls and the Physx engine decides how it is going to run it.

    I cannot see any other way for the "Run on CPU or GPU" option on nVidia hardware would work or for the same code to run on GPU-accelerated and CPU-only configurations.
    PhysX is a physics engine like many others. The base engine runs on the CPU, always. Only certain parts, like particle and cloth effects, can be computed on the GPU.

    What a lot of gamers know as 'PhysX' is actually that GPU part, not the base physics engine. They are two separate things.

  12. #44
    Anthropomorphic Personification shaithis's Avatar
    Join Date
    Apr 2004
    Location
    The Last Aerie
    Posts
    10,857
    Thanks
    645
    Thanked
    872 times in 736 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by watercooled View Post
    PhysX is a physics engine like many others. The base engine runs on the CPU, always. Only certain parts like particle and cloth effects are able to be computed on the GPU.

    What a lot of gamers know as 'PhysX' is actually that GPU part, not the base physics engine. They are two separate things.
    You're making it sound like all PhysX API calls are either CPU-based or GPU-based, and the GPU-based ones will not run on the CPU and vice versa.

    If that was the case, how did the old 3DMark PhysX benchmark work? It ran the same thing on the CPU or the GPU depending on whether you had hardware PhysX available.

  13. #45
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    The CPU part always runs on the CPU. The 'GPU' part, for want of a better term, can also run on the CPU - just generally very badly, as not many games use the more recent and better-optimised versions of the SDK.

    The 'CPU part' is never run on the GPU - some code just doesn't lend itself to being run on a GPU.
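
    That split can be sketched in code (a toy model with invented names, not the actual PhysX API): the base rigid-body step only ever runs on the CPU, while the effects work is dispatched to the GPU when one is available and otherwise falls back to a CPU path that works but is much slower.

```python
# Toy model of a physics engine with a CPU-only core step and
# optionally GPU-accelerated effects. All names here are invented
# for illustration - this is not the real PhysX API.

class ToyPhysicsEngine:
    def __init__(self, gpu_available: bool):
        self.gpu_available = gpu_available

    def step_rigid_bodies(self) -> str:
        # The base simulation runs on the CPU in every configuration.
        return "rigid bodies: CPU"

    def step_effects(self) -> str:
        # Cloth/particle effects prefer the GPU but have a CPU
        # fallback, which is typically far slower in practice.
        if self.gpu_available:
            return "effects: GPU"
        return "effects: CPU (slow fallback)"

for has_gpu in (True, False):
    engine = ToyPhysicsEngine(gpu_available=has_gpu)
    print(engine.step_rigid_bodies(), "|", engine.step_effects())
```

    This is also consistent with the 3DMark behaviour mentioned above: the same effects workload can be submitted either way, it just runs much slower down the CPU path.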

    Edit: @CAT: True about Crysis. Despite throwing computing power at it, I've yet to see many games convincingly better than Crysis in terms of image quality! However, sadly, I've come across plenty of games which both look and run far worse...

  14. #46
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by watercooled View Post
    Kepler performance is indisputably bad regardless of Hairworks. There's nothing at all wrong with tesselation performance on GCN
    Citation needed please.

    Quote Originally Posted by watercooled View Post
    Besides a couple of rabid fanbois, who is really likely to make a purchase decision based on a very minor bit of visual effects used in about two games?
    If you looked at it objectively, you would see that given two options, one with special secret sauce and one without, people are going to choose the one with the special secret sauce.

    Quote Originally Posted by watercooled View Post
    Apples and oranges. iOS is a very closed ecosystem yet the more open Android has the greater market share.
    iOS is a desktop OS, Android is not.

    Microsoft made many attempts in the past to break into the mobile OS market; it's what they have been trying to do for over a decade.
    Windows 8.x and 10 are just the latest attempts to dominate the mobile OS landscape as they did on desktops: to have a proprietary system that makes people choose them over anyone else. It's common business practice.

  15. #47
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Emm, I have not met anyone who is a gamer in real life who actually cares about PhysX or TressFX or any of these effects, apart from people like us on forums.

    The biggest-selling PC games are not graphical powerhouses. They are games like LoL, DOTA2 and Minecraft. Even Blizzard games don't use any of that tech and are more CPU-limited anyway.

  16. #48
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by Corky34 View Post
    Citation needed please.
    http://forums.hexus.net/hexus-news/3...ml#post3472141

    Quote Originally Posted by Corky34 View Post
    If you looked at it objectively you would see that given two options, one with special secret sauce and one without, people are going to choose the one with the special secret sauce.
    Looking at it objectively, I really don't come to that conclusion? AMD have their own exclusive features, but I wouldn't choose them over Nvidia because of them; it works both ways, before you accuse me of being biased.

    Quote Originally Posted by Corky34 View Post
    iOS is a desktop OS, Android is not.
    And that matters how exactly?
    Edit: Didn't notice when I replied, but yeah CAT is correct. I was referring to iOS, the OS which runs on the iPhone/iPad and the competitor to Android, not OSX. For some reason I read it as something along the lines of 'Windows is desktop, Android is not'.

    Quote Originally Posted by Corky34 View Post
    Microsoft made many attempts in the past to break into the mobile OS market, it's what they have been trying to do for over a decade.
    Windows 8.x and 10 are just the latest attempts to dominate the mobile OS landscape as they did on desktops, to have a proprietary system that makes people choose them over anyone else, it's common business practice.
    That same proprietary system which is promising to run iOS and Android apps since there are so few on Windows?

    Edit2: Some more links to benchmarks:
    http://www.gamersnexus.net/game-benc...-fps-benchmark
    http://www.pcgameshardware.de/The-Wi...marks-1159196/
    Last edited by watercooled; 22-05-2015 at 01:53 PM.
