Okay, the texture resolution of the image made in The Witcher isn't 2 gigapixel. The grass you see, for instance, was made from textures imported into the game engine, and those textures weren't especially high-res, because at the end of the day many users can't afford a 4K monitor or even own a Titan X.
2 gigapixel. That's not very high...
Thought it was supposed to be 8 gigapixel capable. (Edit: just checked, Nvidia say 8 gigapixel or 32x game res.)
And I wonder if someone will make a patch to run that tech on a certain older game; they may have a bit of a crysis trying to get it to work, though.
So I did a bit of research and realised it was a printed canvas, not real video: a still captured using Ansel, though the game was probably running at 1 FPH (frames per hour!).
To anyone who has had stuff go wrong during a presentation, quite believable.
If you would prefer a conspiracy theory, then it sounds like the frame rate dipped to 100fps at 1080p, so with four times the pixels that might have dropped to 25fps, which would give a noticeable stutter, so 1080p would have been a safer bet.
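Just to sketch that back-of-the-envelope maths (my own rough numbers, not anything Nvidia has said):

    # 4K has four times the pixels of 1080p, so at a fixed fill rate the
    # frame rate roughly divides by four. Purely illustrative figures.
    pixels_1080p = 1920 * 1080
    pixels_4k = 3840 * 2160

    fps_1080p = 100  # the assumed dip mentioned above
    fps_4k_estimate = fps_1080p * pixels_1080p / pixels_4k

    print(pixels_4k / pixels_1080p)  # 4.0
    print(fps_4k_estimate)           # 25.0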
However, you should never put down to malice something that is more probably explained by a simple mistake, and live demonstrations can go wrong on the day in so many ways...
Now, the line about billions spent designing the card, that I have a hard time swallowing!
You buy a 1080 to play games... doesn't matter if it's 1080p or higher... you get it for fast processing... besides, 4K is generally around 60FPS as it is... if you can get 100FPS or better on your monitor, that's what you go for... actual performance.
100FPS or better is a gamer's goal as it is.
It's been a very long time since I religiously read graphics card benchmark reviews (nowadays I read the first page, the page on heat and noise, and the summary), but has graphics card performance improved to the point that people expect the latest flagship single-GPU card to play a yet-to-be-released game at the highest available resolution with all the eye candy at over 60FPS? I always thought that's what SLI/Crossfire is there for, and even then it's not guaranteed.
Rewind twelve years to the last Doom game (wow, it's been that long already eh) http://www.xbitlabs.com/articles/gra...2_3.html#sect0
(And 4K is also a much, much bigger jump from 1080p than 1600x1200 was from 1280x1024.)
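Rough numbers, in case anyone wants to check that claim (assuming 4K means 3840x2160):

    # Pixel-count jumps: the old upgrade vs the modern one.
    def megapixels(w, h):
        return w * h / 1e6

    jump_old = megapixels(1600, 1200) / megapixels(1280, 1024)  # ~1.46x
    jump_new = megapixels(3840, 2160) / megapixels(1920, 1080)  # 4.0x

    print(f"1280x1024 -> 1600x1200: {jump_old:.2f}x the pixels")
    print(f"1920x1080 -> 3840x2160: {jump_new:.2f}x the pixels")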
Ah yes, but in those days you knew that every 6 months a new and significantly faster card would come out so a doubling in performance didn't take long.
These days, you know that every year you are lucky to get a small clock speed bump, a small tweak like an improved tessellator, and some abuse of the word "innovation".
I'm not one for conspiracy theories, and maybe nVidia's marketing team is incompetent. I'm more cynical, though, and I'm inclined to think they purposely chose 1080p to ensure the card was shown in the best possible light. It wasn't 1440p as that might not be totally smooth, and it certainly wasn't shown at 4K as the new cards still aren't capable of 4K at a decent framerate.
I think they mean the screenshot can be up to either 8GP or 32x your set resolution, whichever is lower, i.e. if you're gaming at 4K you'd be capped at upscaling to around 265MP and, however unlikely, you could take screenshots at 24K and upscale them to 8GP, but if you were playing at 32K you would still 'only' be able to take 8GP screenshots, not 16GP.
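A quick sketch of that reading (my interpretation of the marketing numbers, not a documented spec):

    # My reading: Ansel captures are capped at whichever is smaller,
    # 32x the game resolution or 8 gigapixels total. Interpretation only.
    MAX_UPSCALE = 32    # 32x the game resolution (claimed)
    MAX_PIXELS = 8e9    # 8 gigapixel ceiling (claimed)

    def ansel_cap(width, height):
        """Maximum screenshot size in pixels for a given game resolution."""
        return min(width * height * MAX_UPSCALE, MAX_PIXELS)

    print(ansel_cap(3840, 2160) / 1e6)    # ~265 MP at 4K
    print(ansel_cap(30720, 17280) / 1e9)  # 8.0 GP at '32K', not ~17 GP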
For me personally a 1070 will be replacing my aging 680, but I will be waiting for some 3rd-party cards rather than paying the Nvidia tax for early adoption. Assuming all the benchmarks hold up, that seems like a good investment for a bit.
I kind of like that you can make an investment in these kinds of things now; it was great when performance doubled every so often. Now it's more affordable in terms of the life cycle of a card. Hell, I remember running SLI Voodoo 2s...
Given normal rendering also involves some measure of breaking down a scene and rendering it in several passes, it's not so dissimilar. But yes, it's Ansel, hence the frame rate depends on how long you stare at it. But my point was that it was most certainly The Witcher 3, and it was also most certainly more than 4K.
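For what it's worth, here's roughly how I picture a tiled capture working, which is why the "frame time" is really just tile count times normal frame time (my own mental model, not anything Nvidia has documented):

    # Rough model: render the scene as a grid of screen-sized tiles and
    # stitch them together. Figures are illustrative assumptions only.
    import math

    def capture_estimate(screen_w, screen_h, target_pixels, frame_time_s):
        scale = math.sqrt(target_pixels / (screen_w * screen_h))
        tiles = math.ceil(scale) ** 2        # tiles needed to cover the image
        return tiles, tiles * frame_time_s   # tile count, total capture time

    tiles, seconds = capture_estimate(1920, 1080, 2e9, 1 / 30)  # 2 GP at 30fps
    print(tiles, seconds)  # 1024 tiles, ~34 seconds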
An investment would rely on the value of a card increasing in the time you own it... that's not going to happen any time soon! I'm not sure it's more affordable over the life cycle of a card at all - especially in nVidia's case, where previous chips become obsolete very quickly. Look at the cost of a 780 Ti new and then realise it was matched or even beaten by the mainstream 960 in not a great period of time. AMD do a lot better in this regard - the original GCN 79** and 78** cards still perform great in today's games.
Dear nVidia,
I didn't ask for Vulkan, so please don't install it for me without asking. Or, obviously, at least include it in the list so I can remove the tick.
Thanks,
A User.