Re: AMD Radeon RX Vega 56 gaming benchmarks published
Quote:
Originally Posted by
philehidiot
Are these peak, average or (unlikely) minimum FPS? Often it's the minimum FPS that's most important, as that slowdown usually hits the most frantic and involving parts of gameplay. Also, the 1070 has been out for quite some time, so this card SHOULD outpace it; if it didn't, AMD would be wasting their time. I expect Nvidia will have something decent out soon, or will be able to cut prices, given they'll have made savings in manufacturing, improved yields, and paid off a lot of R&D costs by now.
I hope that I'll soon be in a position to invest in a new card, as my current one (a GTX 780) is getting on a bit, but when you look at comparisons, the AMD RX480 runs at 125% of the performance of my current card, which is still chooching along nicely in most games. I think consoles basically becoming PCs has slowed the progress of graphics development: devs look to get more out of existing hardware (as games are cross-platform) rather than taking advantage of the increasing horsepower available from newer cards. The result is that newer GPUs are really only aiding higher-resolution gameplay, not using the extra power for better textures etc. at the rate they were before the modern console era.
This is an opinion based on observation, so if anyone wants to disagree I'd be interested to hear it.
Far from disagreeing, I always wonder why a Blu-ray looks incredible but a game at 1080p is only OK, given that GPUs have the horsepower to make games look much better. I get that it's on the game devs, but surely someone must have tried, even as an experiment, to make a photorealistic 1080p game or demo. Given that 4K can be done at "OK", why not 1080p at "amazing"?
Re: AMD Radeon RX Vega 56 gaming benchmarks published
These claims from your 'trusted source' are outlandish and misleading. TweakTown themselves, in their review of the 1070 in Battlefield 1, show a minimum of 85fps and an average of 105fps at 1440p/Ultra. Guru3D shows an average of 90fps, and TechSpot mirrors this. I place no confidence in a 'benchmark' result that reports lower than the averages of most other benchmarkers.
Re: AMD Radeon RX Vega 56 gaming benchmarks published
I agree with the last post, it's just plain crazy, unless they used an old battered 1070 on its last legs.
Vega will be what it will be; let's just wait for the real results.
I may buy one, but they're still going to have to knock at least £50 off the price.
We know Nvidia will soon do the same to the 1080 and 1070.
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Quote:
Originally Posted by
sykobee
How? They aren't vouchers you can use in the future. You use the monitor discount and Ryzen discount at the point of purchase, or not at all.
Maybe the games can be redeemed separately still.
Like I said, IDK; I was just parroting what I read about the bundles, so I haven't put much thought into it. However, as Biscuit and others have said, they probably won't work, but at least AMD are trying to address the problem, and for that they deserve some brownie points IMO.
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Quote:
Originally Posted by
sykobee
Quote:
Originally Posted by
Biscuit
Quote:
Originally Posted by
Corky34
Isn't that what their bundles are meant to combat? Although IDK how, as I always find bundles confusing. :)
Supposed to... won't.
Miners will just buy the bundles and sell the extras separately.
How? They aren't vouchers you can use in the future. You use the monitor discount and Ryzen discount at the point of purchase, or not at all.
Maybe the games can be redeemed separately still.
Where does it say the monitor/Ryzen discount is only usable at the point of purchase?
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Quote:
Originally Posted by
EN1R0PY
Quote:
Originally Posted by
philehidiot
Are these peak, average or (unlikely) minimum FPS? Often it's the minimum FPS that's most important, as that slowdown usually hits the most frantic and involving parts of gameplay. Also, the 1070 has been out for quite some time, so this card SHOULD outpace it; if it didn't, AMD would be wasting their time. I expect Nvidia will have something decent out soon, or will be able to cut prices, given they'll have made savings in manufacturing, improved yields, and paid off a lot of R&D costs by now.
I hope that I'll soon be in a position to invest in a new card, as my current one (a GTX 780) is getting on a bit, but when you look at comparisons, the AMD RX480 runs at 125% of the performance of my current card, which is still chooching along nicely in most games. I think consoles basically becoming PCs has slowed the progress of graphics development: devs look to get more out of existing hardware (as games are cross-platform) rather than taking advantage of the increasing horsepower available from newer cards. The result is that newer GPUs are really only aiding higher-resolution gameplay, not using the extra power for better textures etc. at the rate they were before the modern console era.
This is an opinion based on observation, so if anyone wants to disagree I'd be interested to hear it.
Far from disagreeing, I always wonder why a Blu-ray looks incredible but a game at 1080p is only OK, given that GPUs have the horsepower to make games look much better. I get that it's on the game devs, but surely someone must have tried, even as an experiment, to make a photorealistic 1080p game or demo. Given that 4K can be done at "OK", why not 1080p at "amazing"?
XD, OK, seriously? You really believe your computer can dish out photorealistic frames fast enough to play a game, even at 1080p? Do you know how much GPU power goes into rendering movies? Not just that, but it takes days to render some scenes. Let me put it into perspective: it took Pixar 29 hours to render one frame of Monsters University, and that was with a huge server setup costing millions. So yeah, we are WAY off photorealism.
Also, I'm pretty sure consoles slowed the development of PC games far more before this last generation of consoles joined the x86 arena; the ports from the PS3/Xbox 360 days were way worse than what we get now. At the end of the day there is not enough money in PC games: the money required to create a game that would push PC hardware to the edge would be far more than you would get back in sales. The last people to try it were Crytek, with the original Crysis.
Edit: If you read the launch material you'll see that you do have to redeem your vouchers at the point of purchase, so you'd have to cough up something like £1,400 in one go to redeem it all if you go for the top model, which is nuts.
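To put the 29-hours-per-frame figure above in perspective, here is a rough back-of-envelope comparison against a 60 fps real-time budget (a Python sketch; the 29-hour figure is taken from the post, not independently verified):

```python
# How far is a 29-hour-per-frame offline render from a real-time
# 60 fps budget? Treat the numbers as illustrative, not exact.

offline_seconds_per_frame = 29 * 3600    # 29 hours in seconds
realtime_seconds_per_frame = 1 / 60      # 60 fps budget: ~16.7 ms

gap = offline_seconds_per_frame / realtime_seconds_per_frame
print(f"Offline render is ~{gap:,.0f}x slower than a 60 fps budget")
# -> Offline render is ~6,264,000x slower than a 60 fps budget
```

So even allowing for offline renderers doing vastly more work per pixel, the per-frame time gap is on the order of millions, which supports the "we are WAY off" conclusion.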
Re: AMD Radeon RX Vega 56 gaming benchmarks published
This looks interesting, might get one in the future.
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Rendering a single photorealistic frame at 1920x1080 in 3ds Max with V-Ray takes 60 minutes or more on the latest affordable hardware; pushing the V-Ray settings to max can take a day.
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Photorealistic was perhaps not the right phrase to use; my point was more that the ubiquitous 1080p resolution is not holding back visual quality the way a CRT TV would. I remember a quote about 40 teraflops being needed to render photorealistic scenes in games; this card can do 10.5, so at 1080p it's capable of rendering far better scenes than it will likely be used for, because of the push for higher resolutions rather than actual image improvements at a resolution a lot of people still use.
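For scale, the two figures in the post above imply roughly a 4x shortfall (a quick Python check, treating both the quoted 40 TFLOPS estimate and the 10.5 TFLOPS spec as given, not verified):

```python
# Compare the quoted "40 TFLOPS for photorealism" estimate against
# the card's claimed ~10.5 TFLOPS. Both numbers come from the post.
claimed_photoreal_tflops = 40.0
vega_tflops = 10.5

fraction = vega_tflops / claimed_photoreal_tflops
print(f"Vega delivers {fraction:.0%} of the claimed requirement")
# -> Vega delivers 26% of the claimed requirement (about a 4x shortfall)
```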
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Quote:
Originally Posted by
EN1R0PY
Photorealistic was perhaps not the right phrase to use; my point was more that the ubiquitous 1080p resolution is not holding back visual quality the way a CRT TV would. I remember a quote about 40 teraflops being needed to render photorealistic scenes in games; this card can do 10.5, so at 1080p it's capable of rendering far better scenes than it will likely be used for, because of the push for higher resolutions rather than actual image improvements at a resolution a lot of people still use.
The 40 TFLOPS figure was just a theory; the developer who said so never gave full technical details showing that 40 TFLOPS was required. If rendering one frame of a popular movie such as Monsters University takes 29 hours on a server farm, I think 40 TFLOPS is well below what true photorealistic rendering needs. In my experience with V-Ray it takes a massive amount of tweaking to render just one scene, so I wonder how many rendering settings the game logic would need to process for ten seconds of gameplay with tons of variables, unless we come up with a revolutionary new game engine that can mimic V-Ray settings every frame. It's hard to explain, but fire up V-Ray in 3ds Max and look at the tons of settings you have to adjust just to make one scene look perfect...
The 2,000 computers have more than 24,000 cores. The data center is like the beating heart behind the movie’s technology.
Even with all of that computing might, it still takes 29 hours to render a single frame of Monsters University, according to supervising technical director Sanjay Bakshi.
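Taking the quoted farm figures at face value, an upper-bound sketch of the compute per frame (assuming, hypothetically, that the whole farm worked on a single frame at once):

```python
# Upper-bound sketch using the figures quoted above: ~24,000 cores
# and 29 hours per frame. A real farm renders frames in parallel,
# so this is a ceiling, not a measurement.
cores = 24_000
hours_per_frame = 29

core_hours = cores * hours_per_frame
print(f"{core_hours:,} core-hours per frame at most")
# -> 696,000 core-hours per frame at most
```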
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Was watching Terminator 3 the other day. Skynet was reported as running at 60 TFLOPS, which seems quite amusing these days :)
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Quote:
Originally Posted by
DanceswithUnix
Was watching Terminator 3 the other day. Skynet was reported as running at 60 TFLOPS, which seems quite amusing these days :)
I am not sure about this TFLOPS theory. GPU makers claim their GPUs are way faster than supercomputers of the late 90s, but the TFLOPS of those supercomputers came from thousands of real Xeon/Opteron/PowerPC cores, each of which can process an MS Excel sheet; can a Vega GPU process my Excel documents? Also, those TFLOPS were linked to terabytes of RAM, while Vega and the P100 only have 16-24 GB.
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Quote:
Originally Posted by
lumireleon
Also, those TFLOPS were linked to terabytes of RAM, while Vega and the P100 only have 16-24 GB.
https://hothardware.com/ContentImage...maxheight=1170
Well, Vega at least isn't theoretically bound by that... you're probably right about the P100, though.
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Quote:
Originally Posted by
lumireleon
I am not sure about this TFLOPS theory. GPU makers claim their GPUs are way faster than supercomputers of the late 90s, but the TFLOPS of those supercomputers came from thousands of real Xeon/Opteron/PowerPC cores, each of which can process an MS Excel sheet; can a Vega GPU process my Excel documents? Also, those TFLOPS were linked to terabytes of RAM, while Vega and the P100 only have 16-24 GB.
Depends on what problem the computer is trying to solve. Modern graphics cards are capable of running general-purpose code, just very slowly per thread. Think of it like multithreaded CPU support gone mad: one individual thread runs rather slowly, but you can have 10,000 of them.
For some tasks, such as mesh analysis or neural-net simulation, a GPU can work very well.
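The "many slow threads" point above can be sketched with a toy throughput model; all the numbers here are made up for illustration and don't correspond to any real hardware:

```python
# Toy model: a few fast threads vs many slow threads, compared on
# total throughput per "tick". Figures are invented for illustration.
cpu_threads, cpu_ops_per_thread = 8, 1_000      # few fast threads
gpu_threads, gpu_ops_per_thread = 10_000, 10    # many slow threads

cpu_throughput = cpu_threads * cpu_ops_per_thread    # 8,000 ops/tick
gpu_throughput = gpu_threads * gpu_ops_per_thread    # 100,000 ops/tick

print(f"Throughput ratio: {gpu_throughput / cpu_throughput}x")
# -> Throughput ratio: 12.5x in favour of the many-slow-threads model
```

This is why embarrassingly parallel work (mesh analysis, neural nets) suits GPUs, while a single long dependent chain of operations does not.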
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Well, it beats the 1070; it's a good card, and it will possibly even get better from here. I have to say I'm going to be more interested in AMD now and in the future, and I'm not a fanboy of either brand; after all, they all just want my money. Being a fan of a brand is weak as it is; I'd rather be a fan of whatever performance I can get, whether from Intel, Nvidia or AMD.
Re: AMD Radeon RX Vega 56 gaming benchmarks published
Well, if the Vega 56 can match a GTX 1070 and doesn't drink too much power, it looks a good bet, as we all know it will probably gain extra performance over time - you only need to look at the Hexus RX570 4GB review, where it now seems to trade blows with the GTX 1060 6GB.