Steve from Gamers Nexus turned up at Principled Technologies to ask some questions about their flawed testing.
Looks like Neil (from "The Young Ones")!
(Readers under 40 will have no idea what I'm talking about )
(\__/)
(='.'=)
(")_(")
Been helped or just 'Like' a post? Use the Thanks button!
My broadband speed - 750 Meganibbles/minute
I disagree with this. Games consoles which could even hope to rival a PC are a relatively new thing. I was using PCs long before consoles could even hope to do 3D graphics. The PC was, in days gone by, the driver of the technology. I agree that consoles are now the main market, but that was only made possible by sticking essentially modified PC hardware into a nicer box. An example of this is "After Burner", which was an arcade game (https://gamefabrique.com/games/afterburner/). At the time consoles had no chance of playing it, and it was ported to the PC (which is essentially what the arcade cabinet was). It came on two giant 5.25" floppies.
Consoles do not drive the hardware limits; quite obviously, the limits almost always end up being thermal at the end of the day. The development of cooler-running, faster PC components is what drives progress, and these are then modified for use in the consoles, where the TDP must be lower because you can't have the fan noise or complex liquid coolers. There are exceptions to this, like the Cell processor, which was totally weird for its time, but open up an Xbox and you have basically got a mid-spec PC.
For games, consoles have been the main drivers for the last decade, especially when you see the revenue stream breakdowns. The main PC revenue streams are MMOs and MOBAs, BTW.
Ever wondered why a game like Skyrim has no loading screens out in the open world, and only shows them when entering interior areas? It's because Bethesda implemented streaming tech to make up for the lack of RAM on that generation of console. The same goes for more and more games starting to use multiple cores effectively - also down to consoles. Most PC gamers use dual or quad core CPUs, so the Battlefield series scaling beyond that would make little sense if it was a PC-only title.
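For anyone curious what that streaming actually involves, here's a minimal sketch of grid-based cell streaming: only the cells around the player stay resident in RAM, with loading and unloading happening at the edges as they move. All names and numbers here are hypothetical stand-ins, not anything from Bethesda's actual code.

[code]
// Minimal sketch of grid-based world streaming. A real engine does the
// loads asynchronously from disk; this just prints what would happen.
#include <cstdio>
#include <cmath>
#include <set>
#include <utility>

constexpr float CELL_SIZE = 128.0f;  // world units per cell (illustrative)
constexpr int   RADIUS    = 2;       // keep a (2*RADIUS+1)^2 block resident

using CellId = std::pair<int, int>;

struct Streamer {
    std::set<CellId> resident;  // cells currently loaded in RAM

    void update(float px, float py) {
        int cx = (int)std::floor(px / CELL_SIZE);
        int cy = (int)std::floor(py / CELL_SIZE);

        std::set<CellId> wanted;
        for (int y = cy - RADIUS; y <= cy + RADIUS; ++y)
            for (int x = cx - RADIUS; x <= cx + RADIUS; ++x)
                wanted.insert({x, y});

        // Load cells that just came into range.
        for (const CellId& c : wanted)
            if (!resident.count(c))
                std::printf("load   cell (%d,%d)\n", c.first, c.second);

        // Unload cells that dropped out of range, freeing RAM.
        for (const CellId& c : resident)
            if (!wanted.count(c))
                std::printf("unload cell (%d,%d)\n", c.first, c.second);

        resident = std::move(wanted);
    }
};

int main() {
    Streamer s;
    s.update(0.0f, 0.0f);    // initial block around the player
    s.update(300.0f, 0.0f);  // player moved east: edge cells swap in/out
}
[/code]

The point is that RAM usage stays constant no matter how big the world is, which is exactly why a console with a few GB of memory can host an open world without visible loading screens.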
Then for years you had texture quality plateau in games... also down to consoles.
Even things like destructible environments are rare now - again because consoles have weak CPUs.
The problem with PCs now is wasted potential, and the inefficiencies are starting to show. This has happened because AMD, Intel and Nvidia have been in a piddling contest for the last decade, trying to screw each other over. So all we have is overuse of effects just to show one company is better, whereas consoles tend to focus on using them efficiently overall.
An example is the use of tessellation as a weapon to sell high-end cards, when it was originally meant as a way to do things more efficiently.
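To illustrate the "efficient" use meant here: tessellation was designed to let geometric detail scale with distance, so nearby patches get heavy subdivision and distant ones almost none. Below is a rough sketch of that LOD calculation - the constants are illustrative and not from any particular engine, and in practice this maths lives in a hull/tessellation-control shader rather than C++.

[code]
// Sketch of distance-adaptive tessellation: subdivide heavily up close,
// barely at all in the distance. Pinning every patch at MAX_TESS (the
// "weapon" usage) burns triangles nobody can see.
#include <algorithm>
#include <cstdio>

float tessFactor(float distance) {
    constexpr float NEAR_D = 10.0f, FAR_D = 500.0f;  // LOD band, world units
    constexpr float MAX_TESS = 64.0f, MIN_TESS = 1.0f;
    // Fade subdivision out linearly across the band, clamped at both ends.
    float t = std::clamp((distance - NEAR_D) / (FAR_D - NEAR_D), 0.0f, 1.0f);
    return MAX_TESS + t * (MIN_TESS - MAX_TESS);
}

int main() {
    for (float d : {5.0f, 50.0f, 250.0f, 1000.0f})
        std::printf("patch at %6.1f units -> tess factor %5.1f\n",
                    d, tessFactor(d));
}
[/code]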
Then you have APIs being held back, since progress is not in the interests of certain hardware companies, and so on. So plenty of games, especially on PC, still hammer one core.
Now you have cons like Early Access, where games are sold with hardly any optimisation - they don't even implement techniques which are well known in games development for limiting draw calls! This is why so many Early Access games run like crap, and more PC devs are pushing them over normal releases.
This is why you get some great-looking games on relatively potato-level hardware in consoles.
Last edited by CAT-THE-FIFTH; 10-10-2018 at 09:32 AM.
I should have been more specific - my disagreement was with the "and always have been" whereas for me it's relatively new and the PC was the driver for the majority of my gaming "career", consoles being not even in the same league.
One of the things that REALLY riles me about dual development is that they've got rid of the good old-fashioned proper save game. It's all checkpoints, or some weirdness where it'll take you back to a safehouse or something. It really strangles creativity and enjoyment when you set up for a mission, get yourself into position all creepy creepy, and then you want to save it so you can try it different ways and so all your work up to that point isn't wasted. This also affects management of bugs: if you encounter one, you can't go back two saves to beforehand, you can only go back to the last checkpoint or redo the entire level, which is INFURIATING.
This is making me grumpy. Grump.
Yeah things are very well optimised now - I've been playing Forza Horizon 4 and it's nailed to 60fps at 1080p at max settings on a mainstream GPU - while looking better than almost any other racing game out there. You could argue it's wasted potential, or you could rejoice that gaming is remaining affordable by not needing hardware upgrades.
I would say PCs were the main focus until the era of Crysis, and when Crytek failed, that was when companies started to realise they needed to change tack.
It's a multiplatform title. It's down to consoles, and what you don't seem to realise is that PC is full of wasted potential now. Look at what consoles use - CPUs which are Atom class and GPUs which are for the most part midrange or lower midrange. Look at the visuals and even the FPS. FH4 runs at 60FPS on a 2GHz Atom-class CPU and it has really good core scaling - it will use 16 threads easily but also has very solid core load balancing. All that is down to consoles.
If anything, I am happy consoles are becoming the main focus of games development, since it forces the PC to do things better, but there is a lot of improvement needed.
Look at something like Horizon Zero Dawn or Death Stranding. They look amazing in scale, art design and graphics, and they run on Atom-class CPUs.
Things like Skyrim's lack of loading screens and more games using more cores are down to consoles, not PC.
Now, let's look at PC-only titles, like MMOs and many "early access" titles - they don't even do simple things like proper object culling or proper draw call limiting mechanisms.
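For reference, here's what those two basics look like in miniature: skip objects the camera can't see, then bucket the survivors by mesh so each mesh becomes one instanced draw call instead of one call per object. The structures are hypothetical, and the crude distance check stands in for a real bounding-volume-vs-frustum test.

[code]
// Minimal sketch of visibility culling plus draw-call batching.
#include <cstdio>
#include <map>
#include <vector>

struct Object { int meshId; float x, y, z; };

// Stand-in visibility test: a real engine tests bounding volumes against
// the camera frustum; here we just cull by distance along +z.
bool visible(const Object& o) { return o.z >= 0.0f && o.z < 100.0f; }

int main() {
    std::vector<Object> scene = {
        {1, 0, 0, 10}, {1, 5, 0, 20}, {2, 0, 0, 500},  // last one out of range
        {1, 9, 0, 30}, {2, 1, 0, 40},
    };

    // Cull, then bucket survivors by mesh so each mesh becomes a single
    // instanced draw call rather than one call per object.
    std::map<int, std::vector<const Object*>> batches;
    for (const Object& o : scene)
        if (visible(o))
            batches[o.meshId].push_back(&o);

    for (const auto& [meshId, objs] : batches)
        std::printf("drawInstanced(mesh %d, instances=%zu)\n",
                    meshId, objs.size());
    std::printf("naive approach: %zu separate draw calls\n", scene.size());
}
[/code]

Neither step is exotic; both have been standard practice for decades, which is the point being made about engines that skip them.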
Massively popular games like ARK, Conan and PUBG are very poorly coded, especially earlier in their lifespans. Kingdom Come: Deliverance is another one, according to a mate of mine who is doing a game design degree.
I have had people ask me and my mates to spec hardware for such games, and I honestly told them it's like throwing money away.
They look meh for the level of hardware they require.
Look at The Witcher 3, for example - what did Nvidia do? Add a ton of poorly optimised tessellation-based effects which cratered performance, and so on. The console version did some of those effects, but more efficiently, via different methods.
Then there was the waste of space which was PhysX, which displaced far more efficient ways of doing similar things.
So in the end it's better to switch it all off, and then the game only looks marginally better than on a console, whilst you still have to throw so much hardware at it.
How about using that faster hardware on PC for more useful things, like a more destructible and player-interactable world? More intelligent NPCs?
Any increased potential of PCs is wasted on fad effects, and it's getting worse and worse. The lack of movement on APIs, for example - again, since it's not in the vested interest of a number of hardware companies, and so on.
Last edited by CAT-THE-FIFTH; 10-10-2018 at 10:25 AM.
Oh, and last night I decided to have a giggle and went on that CPU comparison site to compare my existing 4690K to newer processors, to see what I'd have to pay to see any real improvement in games performance or indeed general use. A Ryzen 2700 is overall... 8% better in effective speed, and for gaming the rating goes from 81% to 87%.
For anything even remotely representing a decent improvement, I'm looking at an i9-7960X, which gives me a 49% effective speed increase, with gaming going from 81% to 120%.
Both of these require a new mobo and new RAM and likely a new sound card as old PCI slots aren't exactly easy to find.
From this, I can conclude that in 4 or 5 years, CPUs have not moved on in terms of what they can offer me for my use, but the ceiling has been raised. So, there are much more powerful processors available, but what you get for the same money is the same as all those years ago.
I tried different configurations with the RAM and so on, and the impact was negligible. The investment required for any kind of meaningful upgrade means I'll be sticking with my existing setup for as long as physically possible.
For general usage, my Ryzen 5 2600 is double the speed of my Xeon E3 1230 V2, and that was in things like video transcoding and RAW development. For gaming, as long as PC games push ancient and inefficient APIs, you will be massively limited by a single thread. Any game which won't run well on an overclocked Core i5 4690K will be either:
1.) Another badly coded game which relies on a massive single-threaded bottleneck
2.) A better-coded game, but one implemented for consoles
The second will cause fewer issues than the first, IMHO.
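That single-thread ceiling is easy to put numbers on with Amdahl's law: if a fixed fraction of each frame has to run on one thread (say, draw submission through an old API), adding cores barely moves the needle. A quick back-of-envelope sketch, with made-up serial fractions:

[code]
// Amdahl's law for frame times: speedup = 1 / (s + (1 - s) / cores),
// where s is the fraction of the frame stuck on a single thread.
// The fractions below are illustrative, not measurements of any game.
#include <cstdio>

double speedup(double serialFraction, int cores) {
    return 1.0 / (serialFraction + (1.0 - serialFraction) / cores);
}

int main() {
    for (double s : {0.8, 0.5, 0.2})
        std::printf("serial %.0f%%: 4 cores -> %.2fx, 16 cores -> %.2fx\n",
                    s * 100, speedup(s, 4), speedup(s, 16));
}
[/code]

With 80% of the frame serial, even sixteen cores only get you about a 1.23x speedup, which is why a well-threaded console port can outrun a badly threaded PC exclusive on the same class of hardware.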
Most PC games are still poorly coded, especially with the scourge known as "early access", which more and more PC devs seem to be pushing and which I avoid entirely now. Now wait until hybrid RT becomes a thing.
The next pissing contest on PC will be who can push "realistic lighting and shadows" to the nth degree and ignore every other part of the game's graphics. Also ignore all the other, more important parts of the game itself.
It's bloody sad when games like Red Faction, Crysis and some of the earlier Far Cry games had destructible environments, proper weather systems, or even things like fires responding to the wind.
Now you have environments made of tungsten which can withstand a bloody H-bomb - the most unimmersive worlds ever.
How many times do you walk through an environment in an open-world game and it's mostly empty space, or boarded-up skyscrapers, etc? Yeah, well, why bother spending time on that when you can use GPU and CPU time to push a certain effect more?
Plus a game like Crysis still looks good today, and that is again because the last decade has seen the increased use of fad effects which concentrate on one aspect to the detriment of everything else. Plus with Crysis it was also the use of the environment which I found really cool at the time - throwing a chicken at a soldier was just priceless, or dropping a tree on top of a building and destroying it... LOL.
I want more interactive worlds, FFS, and that is for me part of the immersion. There's no point just making it look good when it's a sterile, non-interactable environment.
Look at Horizon Zero Dawn:
https://www.eurogamer.net/articles/d...-tech-showcase
The attention to detail is awesome. Yay, PC can get better shadows and lighting. Meh.
Last edited by CAT-THE-FIFTH; 10-10-2018 at 10:26 AM.
Horizon 4 does have a more interactive world - even more so than 3. I don't think it's wasted potential if you can make something look and play amazing on the majority of systems out there - whether that's PC or console. Yes, PCs are more powerful, and correspondingly you do get a better experience. But you don't need to buy a take-the-piss CPU and GPU to have that experience.
It ends up sounding elitist if you say that developers should take features away from the majority (by diverting dev time) in order to appease a wealthy minority. To then complain about gaming increasing in cost or hardware being more expensive in order to access that 'extra potential' would be almost hypocritical.
Who said wealthy minority? Even a cheap CPU has more power than a console one. Even a cheap PC has more RAM. Even a mainstream graphics card has as much power as, or even more power than, a console GPU.
Even my old Xeon E3 sat half the time in most games with most of its cores doing eff all. It's a joke.
Now look at the PC-only games - many are early access, using crap engines which are literally running on one core, meaning people end up throwing hardware at them to make them run well. These are not hardware enthusiasts, BTW; these are just gamers wanting their game to run well.
I have had to spec systems (as have mates) to try and run games like Ark, Conan, or even PUBG (especially when it first came out), and it was utterly shocking what they needed for what they looked like and what they did.
Plus again you keep proving my point - you mentioned FH4 and HZD, games which look amazing and do great stuff, but running on a 2GHz Atom-class CPU and with the equivalent of an HD7790-to-GTX1060 level of card.
Look at PC open-world games - how about using an old Core 2 Quad and an HD7790 and seeing how many of the newer ones will run? Yeah, like crap, and they will look crap too.
Now look at so many PC games, especially the exclusives, and see how piss-poor they run, or even the level of hardware they need. PC gamers keep defending the poor state of PC games now.
All we have is overused effects which consoles don't use. So overdone tessellation-based effects, an obsession with stupid ways of doing AA, and now the next fad: overdone shadows and lighting.
Edit!!
Plus HZD is a console exclusive.
Death Stranding is console exclusive.
The Last of Us is another one.
All single player RPG games.
PCMR defenders are hypocritical.
They are so defensive of PCMR that they don't want any criticism, and they act very elitist when the flaws of PC games are shown.
On forums they then act elitist on purpose, pushing the idea that hardware should cost more and that companies are more important than consumers - since they are elitists, they want PC gaming to be more and more out of the reach of normal people, to make themselves "more special".
The result is that I have seen more people move to consoles when their PCs become too old or parts start failing.
Then the excuse-makers say games should cost more, since they have spending e-peen.
They are destroying PC gaming.
Last edited by CAT-THE-FIFTH; 10-10-2018 at 10:46 AM.
Destroying a building with a chicken is far more fun. I want to be able to ram TNT up the rear end of the chicken and then propel it into a person who then flies backwards into a building before detonating.
Being serious, I totally agree on the part about pushing stupid lighting effects way too far. It reminds me of the days of motherboards shoving a few million (okay, so not quite millions) phases onto their CPU power management just because that was the number where the current fad had landed.
I think the lighting effects from ray tracing when I see them side by side look impressive sometimes but mostly just different when I consider how much attention I'm going to pay to them in the heat of the gaming moment. Knowing someone could have spent the time ensuring that the wooden cart I'm currently using as hard cover against grenades and .50 BMG rounds behaved more realistically and in a way that actually affected gameplay is kind of annoying.
There's a level in COD WW2 with tanks, and there are some buildings which are destructible; your rounds will completely deface a building with smoke, rubble, etc. Then you're up against an enemy tank in a drawn-out game of cat and mouse, and you find that a small pile of 3 bricks totally stops your AP tank round and doesn't even move. Not only that, but you just need two shots into the "weakest" point of the tank (which was actually the strongest, but hey, who cares) to destroy it. Don't think about immobilising it by hitting the tracks, or bringing a building down on it and slowing it down so you can get the hits in whilst it's blind. Probably the most infuriating level I've ever played in a game.
Which mainstream graphics cards have more power than a GTX 1060?
I guess we'll just have to agree to disagree about PC games being in a poor state now - despite the overheads of a PC environment compared to console, I think modern PC games look great and run great, even on old hardware. I have more games than I know what to do with, and not a single one of them makes me want for better hardware. The Witcher 3 looks and plays great. So does Divinity: Original Sin 2. So does Metro. So do any number of driving games.