
Thread: Normal Witcher 3 performance is possible on AMD GPUs

  1. #65
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by watercooled View Post
    Would you care to explain where I've backtracked? Because I completely stand by what I said - performance is poor on Nvidia cards previous to Maxwell.
    Forgive me, maybe I misinterpreted the following...
    Quote Originally Posted by watercooled View Post
    And the reason for equally terrible Kepler performance?

    Edit: Ninja'd

    A lot of people are treating this as just an AMD vs Nvidia thing. It really isn't.
    Like I said, maybe I've misinterpreted the point you're trying to make, but if you're suggesting that new video cards outperform old ones, then that's kind of the point, isn't it?

    Quote Originally Posted by watercooled View Post
    However expecting a 750Ti to beat everything Kepler just because it's Maxwell is silly, and not what I implied at any point. Kepler performs comparatively far worse than Maxwell.
    Yeah, I still can't see it myself, so I guess we're going to have to agree to disagree.

  2. #66
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Kepler cards are performing very poorly considering their performance relative to both Maxwell and GCN in other games, as has been explained by myself and others.

  3. #67
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by GuidoLS View Post
    Is it irony that what's left of Mantle ended up at a consortium led by an executive from Nvidia?
    TBH I think that is just a product of it being a market driven by a small number of people. Who else was going to lead the consortium - someone from Intel or someone from AMD? I suppose it could be Qualcomm these days or one of the other mobile device people, but even including them it is a pretty small pool of interested parties.

    Still, AMD got the whole driver game reset to the start line which I think is a win for them. Let's hope they don't mess it up.

  4. #68
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by Corky34 View Post
    Like I said, maybe I've misinterpreted the point you're trying to make, but if you're suggesting that new video cards outperform old ones, then that's kind of the point, isn't it?
    I think the worry is that people won't buy an upgrade unless it really is faster. So, why not do something that the old cards can't manage and then force it into games and that way the new cards look really fast in benchmarks compared to the old ones? That gives customers of old cards a worse experience than they could have had, but who cares about them eh? I really hope I am wrong, but I seem to get that feeling about Nvidia enough recently for me to buy an AMD card for the first time in probably a couple of decades. Nvidia used to make me feel like I was a customer that needed supporting, not like cattle to be milked.

  5. #69
    Token 'murican GuidoLS's Avatar
    Join Date
    Apr 2013
    Location
    North Carolina
    Posts
    806
    Thanks
    54
    Thanked
    110 times in 78 posts
    • GuidoLS's system
      • Motherboard:
      • Asus P5Q Pro
      • CPU:
      • C2Q 9550 stock
      • Memory:
      • 8gb Corsair
      • Storage:
      • 2x1tb Hitachi 7200's, WD Velociraptor 320gb primary
      • Graphics card(s):
      • nVidia 9800GT
      • PSU:
      • Corsair 750w
      • Case:
      • Antec 900
      • Operating System:
      • Win10/Slackware Linux dual box
      • Monitor(s):
      • Viewsonic 24" 1920x1080
      • Internet:
      • AT&T U-Verse 12mb

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by DanceswithUnix View Post
    I think the worry is that people won't buy an upgrade unless it really is faster. So, why not do something that the old cards can't manage and then force it into games and that way the new cards look really fast in benchmarks compared to the old ones? That gives customers of old cards a worse experience than they could have had, but who cares about them eh? I really hope I am wrong, but I seem to get that feeling about Nvidia enough recently for me to buy an AMD card for the first time in probably a couple of decades. Nvidia used to make me feel like I was a customer that needed supporting, not like cattle to be milked.
    Interesting thought. There's a reason I haven't upgraded from my 9800GT yet. That reason is, other than W3 and Shadows of Mordor, any game I've really wanted to play, the 9800GT can handle. DA:I was the first one that I had to cut back on in any meaningful way, and even then, it was still completely playable and enjoyable. No, I'm not getting 4k resolutions, or 120FPS, but those aren't needed to make a game fun. And quite frankly, the race to the top by both the card makers and the developers is leaving a huge gap between what a regular player needs and what should truly be necessary. In a world where everything is more and more disposable, it's a darned shame that GPU makers are trying to force everyone else on board, and sloppy programmers are the fuel that's making that engine run. Even if I were inclined to, I can't justify a new card every 2 years, or every year, or however quickly they want to speed up the cycle, and I'd be willing to bet the majority of the world is the same way.

    At this point, as much as I'd like to support CDPR, I'll not be buying W3 for at least a few years, because it's insulting to me that I have to buy the most current generation (and most expensive) video card, regardless of team color, to be able to play at anything above 30fps. This isn't Crysis, and every review that isn't busy making sure that Nvidia is evil has pretty much said just that - this game isn't Crysis, and shouldn't come close to requiring this much horsepower to play.
    Esse Quam Videri
    Out on the road today I saw a Black Flag Sticker on a Cadillac...


  6. #70
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,025
    Thanks
    1,871
    Thanked
    3,383 times in 2,720 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by GuidoLS View Post
    has pretty much said just that - this game isn't Crysis, and shouldn't come close to requiring this much horsepower to play.
    I agree with much of what you said but I disagree with this bit. Graphical fidelity isn't the domain of shooters alone, and when you compare what this game is showing with a game like Crysis, it should be more demanding. CDPR already said they had to drop several rendering features from their earlier preview because they didn't work in the open world setting, so there's scope for even more demanding things in the future!

    The main problem I have, graphically, with games like TW3, at least in its current version, is that it's inconsistent - you have alpha texture based effects for foliage and NPC hair, as we've had in games for years, then the main character suddenly shows up with his mop head. There's not a consistent level of technological increase, just a sort of base-level with odd highlights which can sometimes make it feel less good as a whole. Low fidelity games can end up looking a lot 'nicer' as a package because they use a lower, but consistent, technology level and use art to get around limitations - see Blizzard for supreme expertise at this kind of thing.

  7. #71
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by watercooled View Post
    Kepler cards are performing very poorly considering their performance relative to both Maxwell and GCN in other games, as has been explained by myself and others.
    It seems Nvidia agrees with regards to The Witcher 3 performance on Kepler.

  8. #72
    Senior Member
    Join Date
    Mar 2010
    Posts
    2,567
    Thanks
    39
    Thanked
    179 times in 134 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Rumour mill says the original Titan is faring worse than a GTX 960...

  9. #73
    Registered User
    Join Date
    Apr 2014
    Posts
    4
    Thanks
    0
    Thanked
    0 times in 0 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    I don't like this game, but I'm going to install it just because of the graphics... I just want to see if it's really that good.

  10. #74
    ZaO
    Guest

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by kalniel View Post
    I agree with much of what you said but I disagree with this bit. Graphical fidelity isn't the domain of shooters alone, and when you compare what this game is showing with a game like Crysis, it should be more demanding. CDPR already said they had to drop several rendering features from their earlier preview because they didn't work in the open world setting, so there's scope for even more demanding things in the future!

    The main problem I have, graphically, with games like TW3, at least in its current version, is that it's inconsistent - you have alpha texture based effects for foliage and NPC hair, as we've had in games for years, then the main character suddenly shows up with his mop head. There's not a consistent level of technological increase, just a sort of base-level with odd highlights which can sometimes make it feel less good as a whole. Low fidelity games can end up looking a lot 'nicer' as a package because they use a lower, but consistent, technology level and use art to get around limitations - see Blizzard for supreme expertise at this kind of thing.
    Man I was thinking this too. The game looks great, but some things do feel kinda odd and out of place at times. It's no big deal though. The last time I really enjoyed just travelling around a game world this much was Skyrim

  11. #75
    Token 'murican GuidoLS's Avatar
    Join Date
    Apr 2013
    Location
    North Carolina
    Posts
    806
    Thanks
    54
    Thanked
    110 times in 78 posts
    • GuidoLS's system
      • Motherboard:
      • Asus P5Q Pro
      • CPU:
      • C2Q 9550 stock
      • Memory:
      • 8gb Corsair
      • Storage:
      • 2x1tb Hitachi 7200's, WD Velociraptor 320gb primary
      • Graphics card(s):
      • nVidia 9800GT
      • PSU:
      • Corsair 750w
      • Case:
      • Antec 900
      • Operating System:
      • Win10/Slackware Linux dual box
      • Monitor(s):
      • Viewsonic 24" 1920x1080
      • Internet:
      • AT&T U-Verse 12mb

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by kalniel View Post
    I agree with much of what you said but I disagree with this bit. Graphical fidelity isn't the domain of shooters alone, and when you compare what this game is showing with a game like Crysis, it should be more demanding. CDPR already said they had to drop several rendering features from their earlier preview because they didn't work in the open world setting, so there's scope for even more demanding things in the future!

    The main problem I have, graphically, with games like TW3, at least in its current version, is that it's inconsistent - you have alpha texture based effects for foliage and NPC hair, as we've had in games for years, then the main character suddenly shows up with his mop head. There's not a consistent level of technological increase, just a sort of base-level with odd highlights which can sometimes make it feel less good as a whole. Low fidelity games can end up looking a lot 'nicer' as a package because they use a lower, but consistent, technology level and use art to get around limitations - see Blizzard for supreme expertise at this kind of thing.
    The comment wasn't based on the generic shooter - it was aimed straight at the title I mentioned. All 3 iterations of Crysis were made to be ball-breakers for PC performance as much as they were video games - for that matter, the game part may well have been the tertiary intent. In that vein, W3 most certainly isn't Crysis - or if it is, I missed the marketing where they advertised it as being as much a tech demo as an open world RPG. Patch 1.03 alone indicates sloppy programming on an already delayed game.
    Esse Quam Videri
    Out on the road today I saw a Black Flag Sticker on a Cadillac...


  12. #76
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,025
    Thanks
    1,871
    Thanked
    3,383 times in 2,720 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by GuidoLS View Post
    Patch 1.03 alone indicates sloppy programming on an already delayed game.
    I don't think you can call the programming sloppy when you see what they've achieved. The game is still fine without patching, but they've been quick to respond to real world usage which is always of a vastly bigger scale than internal and beta testing. Absolutely nothing unusual there, and really quite incredible given the scope of the game.

  13. #77
    Token 'murican GuidoLS's Avatar
    Join Date
    Apr 2013
    Location
    North Carolina
    Posts
    806
    Thanks
    54
    Thanked
    110 times in 78 posts
    • GuidoLS's system
      • Motherboard:
      • Asus P5Q Pro
      • CPU:
      • C2Q 9550 stock
      • Memory:
      • 8gb Corsair
      • Storage:
      • 2x1tb Hitachi 7200's, WD Velociraptor 320gb primary
      • Graphics card(s):
      • nVidia 9800GT
      • PSU:
      • Corsair 750w
      • Case:
      • Antec 900
      • Operating System:
      • Win10/Slackware Linux dual box
      • Monitor(s):
      • Viewsonic 24" 1920x1080
      • Internet:
      • AT&T U-Verse 12mb

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Would you prefer 'not optimally optimized'? And the internet says that not only is it not fine without patching, but that Nvidia is the anti-Christ and CDPR is the steed known as pestilence, come to unleash havoc on our unsuspecting PCs, totally ruining hours of our lives. When a single patch (reportedly) gives performance boosts of up to 25%? Even 5-10% says something wasn't quite right...

    It's just a repeat of history - W2 dragged butt like a 3-legged dog when it first came out, and a couple of patches later, it was just fine. The internet, for being the collective soul of the world and receptacle of the sum knowledge of mankind, is inhabited by a population that has a memory somewhat shorter than the time it takes to type in google.com (or duckduckgo.com for the counterculture). It's just that I'm old, and remember the world pre-WWW. Companies didn't get a 2nd chance on a gold-stamped product. It worked or they went out of business. And while I appreciate the products that CDPR has released (at least those which I have been able to use), I'm in no way obligated to forgive something that is released in an improper state, and I'm not inclined to forget tomorrow what's happened today, or yesterday. It's not like CDPR had someone like EA breathing down their neck to get it out the door...

  14. #78
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,025
    Thanks
    1,871
    Thanked
    3,383 times in 2,720 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    I'd agree it's not optimised to its fullest potential, but how could it be? Afraid I will just have to disagree with 'the internet' on most of those points. I've not seen any significant improvement in performance with the patches so far, but it was fine out of the box (not amazing, but I can accept 30fps with my old card).

    I also didn't have such a bad experience with TW2 out of the box - it was a bit slow (slower than TW3 was for me), but playable. TW3 comes in a better state out of the box, while also being a huge amount bigger and better in just about every way (scale, scope, style and story). Certainly it's nowhere close to an improper state, as most of the reviews are acknowledging. It's absolutely no Daggerfall or TotSC.

  15. #79
    Senior Member Hicks12's Avatar
    Join Date
    Jan 2008
    Location
    Plymouth-SouthWest
    Posts
    6,586
    Thanks
    1,070
    Thanked
    340 times in 293 posts
    • Hicks12's system
      • Motherboard:
      • Asus P8Z68-V
      • CPU:
      • Intel i5 2500k@4ghz, cooled by EK Supreme HF
      • Memory:
      • 8GB Kingston hyperX ddr3 PC3-12800 1600mhz
      • Storage:
      • 64GB M4/128GB M4 / WD 640GB AAKS / 1TB Samsung F3
      • Graphics card(s):
      • Palit GTX460 @ 900Mhz Core
      • PSU:
      • 675W ThermalTake ThoughPower XT
      • Case:
      • Lian Li PC-A70 with modded top for 360mm rad
      • Operating System:
      • Windows 7 Professional 64bit
      • Monitor(s):
      • Dell U2311H IPS
      • Internet:
      • 10mb/s cable from virgin media

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by GuidoLS View Post
    Interesting thought. There's a reason I haven't upgraded from my 9800GT yet. That reason is, other than W3 and Shadows of Mordor, any game I've really wanted to play, the 9800GT can handle. DA:I was the first one that I had to cut back on in any meaningful way, and even then, it was still completely playable and enjoyable. No, I'm not getting 4k resolutions, or 120FPS, but those aren't needed to make a game fun. And quite frankly, the race to the top by both the card makers and the developers is leaving a huge gap between what a regular player needs and what should truly be necessary. In a world where everything is more and more disposable, it's a darned shame that GPU makers are trying to force everyone else on board, and sloppy programmers are the fuel that's making that engine run. Even if I were inclined to, I can't justify a new card every 2 years, or every year, or however quickly they want to speed up the cycle, and I'd be willing to bet the majority of the world is the same way.

    At this point, as much as I'd like to support CDPR, I'll not be buying W3 for at least a few years, because it's insulting to me that I have to buy the most current generation (and most expensive) video card, regardless of team color, to be able to play at anything above 30fps. This isn't Crysis, and every review that isn't busy making sure that Nvidia is evil has pretty much said just that - this game isn't Crysis, and shouldn't come close to requiring this much horsepower to play.
    Sorry, I just don't believe this is a good mindset. Why do you think a MID RANGE GPU from 7 YEARS AGO should be able to play the latest AAA games at above 30fps? This is a real console mentality. Sorry to break it to you, but even the Xbox 360 and PS3 - old-gen consoles only a few years older, which with optimization punched above the 9800GT - don't get Witcher 3 at all.

    The reason graphics didn't really change much after the Crysis era is that nobody but Crytek seemed to have the balls to avoid console financing, so everyone else had to limit fidelity and functionality to fit within the consoles' limited power budget. With everyone focusing on the console market, the PC versions didn't get much effort, so the 'need' to upgrade was very limited - the required performance rarely jumped. Now that the consoles have finally moved on to this generation, with a much larger power budget, companies are able to push games more. No more DX9 crap; DX11 is used everywhere now, though it's not very efficient - extra graphical fidelity without much optimization. That should come with DX12, as thankfully someone was thinking correctly. AMD proved it was very beneficial to sort out graphics APIs, because they're quite poor right now: Mantle was the foundation, and it showed that a 'small' company (compared to Microsoft and Nvidia) could produce a solid API that can be vendor agnostic and improve performance. That's why DX12 is coming - Microsoft realised it was needed (I highly doubt Microsoft did this alone; AMD influenced this massively).
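    To put some toy numbers on the API overhead point, here's a rough sketch. The per-call costs and draw call count are made-up assumptions, not measured DX11/DX12 figures; it's just to show why shaving CPU cost per draw call matters when you're CPU limited.
    Code:
        # Toy model of CPU-side draw submission cost. The per-call overheads
        # below are assumptions for illustration, not measured DX11/DX12 numbers.

        DRAW_CALLS_PER_FRAME = 5_000    # assumed busy open-world scene
        OTHER_CPU_WORK_MS = 8.0         # assumed game logic, AI, physics per frame

        def frame_cpu_ms(per_call_us: float) -> float:
            """Total CPU milliseconds per frame in this toy model."""
            return OTHER_CPU_WORK_MS + DRAW_CALLS_PER_FRAME * per_call_us / 1000.0

        for label, per_call_us in [("thick driver path (DX11-ish)", 10.0),
                                   ("thin driver path (DX12/Mantle-ish)", 2.0)]:
            ms = frame_cpu_ms(per_call_us)
            print(f"{label}: {ms:.0f} ms CPU per frame -> capped at ~{1000.0 / ms:.0f} fps")

    Quartering the per-call cost doesn't make the GPU any faster; it just stops the CPU being the bottleneck, which is exactly the case DX12 and Mantle are aimed at.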

    Basically, when DX12 is out (in a couple of months) it will mean much better performance for anyone who's CPU limited, which covers most next-gen games. Back to the point though: just because your 9800GT can't handle a couple of games doesn't mean the industry is wrong - you've had that GPU for almost the whole of the last console cycle! How can you expect not to be left behind? Yes, I completely agree that graphics aren't the priority in most games, but it's inevitable for graphics to 'improve', and we should want them to keep improving or we'd still be stuck with 8-bit games. Think how people who 'upgraded' their PCs from the 2D to the 3D era felt - they actually had to spend money as well.

    I think I lost my point: Witcher 3 is more demanding than Crysis because it is HUGE! I love Crysis (I'm one of the few who actually enjoyed its 'FPS' gameplay) - the physics were amazing, and the simple act of jumping on a hut and breaking it down was great - but at the end of the day it was fairly limited in scope, so 'optimizing' it was relatively simple, and even then we had to wait a few years for it to be easily playable on max. It's a shame no game has really surpassed Crysis in graphics, but I hope someday one does. Witcher 3, on PC at least, was never going to be that well optimised as it's a port - a solid port in my opinion - and I get 45-60fps on my 7950, which I'm happy with (most things on ultra bar grass, HairWorks and a couple of others). The scale is insane: there are many more AI characters moving around, and probably at least 20x as many buildings as in Crysis!

    I wouldn't call it sloppy programming - that's fairly harsh. I think it's more that these newer games require developers to relearn how they code; the change to DX12 will force that, so give it a couple of years for DX12 to be the norm and I'm sure we'll be in a much better place for 'optimized' games. The situation isn't helped by Nvidia swooping in with their big bucks and offering devs GameWorks plus developer support. Nvidia is using their market dominance to strong-arm next-gen games into focusing development on Nvidia platforms: GameWorks is a 'simple' way for developers to hit the ground running on PC (and since PC generally makes the least money, it gets the smallest slice of the team budget), which is enticing. But it's easy to be completely short-sighted and say 'great, our game will be ready in 6 months instead of 9' - and then the industry as a whole only works well on Nvidia cards, cutting out a significant part of the PC community.

    Nvidia is slowly but surely moving towards securing a monopoly, and in my opinion it's Intel vs AMD all over again, when Intel was paying customers not to buy AMD products. It's pretty much the same here: Nvidia is paying to get games to work best on their platform, not directly in cash but through the developer tools they've spent money building and the engineers they put on site - I don't believe anyone can say devs aren't being paid to use Nvidia GameWorks, because frankly they are. I'm not opposed to getting the most out of your own platform, but in Nvidia's case it goes too far. AMD made Mantle and opened it up to other vendors, and it's now the foundation of Vulkan from the Khronos Group (a competitor to DX12), which will benefit everyone in the long term. Then there's FreeSync, which is free and open for anyone to implement; it's now part of the VESA standard, so Nvidia could add support right now if they wanted to and everyone could start benefiting from this excellent piece of tech, but they'd rather stick to the strategy of locking customers in via G-Sync and proprietary tech.

    AMD cannot optimize for GameWorks because they can't see the code, and it's hard to call it 'fair' competition when one side is blindfolded. Another case is HairWorks: it runs at 64x tessellation for no real reason other than to artificially hamper performance on other GPUs (including Nvidia's own last-gen cards - this is crazy!). I liked the look of HairWorks, but TressFX really looks like the superior beast and can actually be optimized on both companies' cards, so it's a shame AMD failed to deliver it to CD Projekt in time, or that it was simply too much work to implement - GameWorks was already integrated, so it was a matter of flipping a switch that already existed versus building that switch from scratch (okay, very simplified).
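    As a rough illustration of why 64x looks so gratuitous, here's a back-of-the-envelope sketch. It assumes the generated hair geometry scales roughly linearly with the tessellation factor and uses a made-up strand count; it is not based on actual HairWorks internals.
    Code:
        # Rough sketch of how much extra geometry a higher hair tessellation
        # factor generates. Assumption (not a measured HairWorks detail): each
        # guide strand is tessellated into roughly `factor` segments, so vertex
        # count scales linearly with the factor. The strand count is made up.

        GUIDE_STRANDS = 20_000    # assumed number of guide hairs on screen
        VERTS_PER_SEGMENT = 2     # each line segment contributes two vertices

        def hair_vertices(tess_factor: int) -> int:
            """Approximate hair vertices emitted per frame under this model."""
            return GUIDE_STRANDS * tess_factor * VERTS_PER_SEGMENT

        baseline = hair_vertices(16)
        for factor in (8, 16, 32, 64):
            v = hair_vertices(factor)
            print(f"x{factor:>2}: ~{v / 1e6:.1f}M hair vertices "
                  f"({v / baseline:.1f}x the 16x workload)")

    Under that assumption 64x pushes roughly four times the geometry of 16x, which lines up with the driver override helping so much while looking basically the same.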


    I did try the manual override for HairWorks and capped the tessellation at 16x; it cost me about 10 FPS but it did look better (not worth it for me), and I really couldn't tell the difference from the maxed-out version (maybe I'm blind, ha). As for the Witcher 3 launch, I think it was solid. I've seen two bugs: one attack rune displaying 0 dmg (a display issue apparently), and intermittent crashes in the inventory screen (anywhere from 10 minutes to 3 hours between crashes, which got frustrating), but the crashing was fixed in the latest patch on Wednesday, at least in my case. So I have zero complaints about the Witcher 3 PC version - I didn't expect my 7950 to keep up this well, which makes waiting until next year for a new GPU that much harder. The side quests in this game are AMAZING, so much more than the usual 'kill x monsters' filler, which is boring as hell; these ones require investigation and are almost as in-depth as the main story quests. I'm a happy chappy, as I really enjoyed the first two games and the books.

    Sorry GuidoLS, you ended up quoted but most of my post isn't actually aimed at you - only the part about the old card trying to run new games.
    Quote Originally Posted by snootyjim View Post
    Trust me, go into any local club and shout "I've got dual Nehalem Xeons" and all of the girls will practically collapse on the spot at the thought of your e-penis

  16. #80
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    TBH, Skyrim with mods would like to say hello to W3.
