Originally posted by Korky
I feel sorry for these poor souls who always say Nvidia will reign again
LMAO
(\__/)
(='.'=)
(")_(")
Nvidia will get it right with NV40. They've made a lot of mess-ups, and I'm guessing they won't want to make another.
As for the driver improvement, I can see a 50-60% increase, though not 100%; it'll all depend on the drivers for the pixel-shading engines.
Acer Travelmate 8104WMLi
P-M 2.0 Ghz
2Gb DDR533 Corsair RAM
100Gb 7200rpm Seagate HD
128Mb ATi x700 Mobility
erm..........isn't NV40 ati's next offering?
Hmm, too much flannel and guff I think.
Remember that manufacturers like Nvidia need higher stock prices at different times, so why not deflate your stock with what seems like an underperforming product whilst holding back fully working drivers, then release them nearer the game's launch so that sales of the Nvidia cards and sales of the game both peak around crimbo. ...
Just a thought....
Perry.
Originally posted by Knoxville
erm..........isn't NV40 ati's next offering?
LOL, erm nope, NV = NVidia dev name, RXXX = ATI dev name
Unless nVidia have been the subject of a shock takeover in the last few hours
(\__/)
(='.'=)
(")_(")
Originally posted by perryh73
Hmm, too much flannel and guff I think.
Remember that manufacturers like Nvidia need higher stock prices at different times, so why not deflate your stock with what seems like an underperforming product whilst holding back fully working drivers, then release them nearer the game's launch so that sales of the Nvidia cards and sales of the game both peak around crimbo. ...
Just a thought....
Perry.
Because it destroys trust that the company keeps to its word of producing graphics cards that live up to their marketing?
The thing about Doom 3 being better on Nvidia hardware sounds a bit odd too. If the Dawn demo was supposedly "impossible" to run on anything other than an FX5900/Ultra, yet it turns out that anything which runs on an FX5900 will run easily on an ATI card with a few tweaks..
(\__/)
(='.'=)
(")_(")
Only if ATI rename their cards after what they will beat (or pwn); AMD's PR marketing rox.
Has Nvidia made any comments yet? Should be funny. These are the possible ones I can think of so far:
"our cards suck... so we're giving them away half price"
OR
"with new drivers we pwn ati"
OR
"erm.. shh so we can still trick everyone else into buying a crap card"
or words to that effect..
Also, Doom 3 on an FX5900U: are there any benches from the leaked version? My 9700np does well at 50 fps average, PC not set up too well though, and that was before the overclock.
Also, as long as you can do 45 fps or better ALL THE TIME and it never drops lower, then tbh it doesn't matter because you won't visually notice it, and I'd play with high eye candy at 20 fps over low detail at 100 fps.
Isn't the problem with the FX series the current incarnation of the Cg programming doofer? (Not up on the Nvidia stuff at the moment.) Assuming that it is, then one assumes it's like any other form of programming and can be updated fairly easily? Maybe a BIOS update for the cards, or even something as simple as a driver update.
If it's the core running out of steam, though, that would be somewhat more detrimental..
(\__/)
(='.'=)
(")_(")
David posted this on the front page..
Hmm, the whole thing smells like PR hype to me:
Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATI's Shader Day.
During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.
We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.
Regarding the Half Life2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and other new games are included in our Rel.50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.
Pending detailed information from Valve, we are unaware of any issues with Rel. 50 and the drop of Half Life 2 that we have. The drop of Half Life 2 that we currently have is more than 2 weeks old. It is not a cheat or an over optimization. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.
The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.
In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.
We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.
(\__/)
(='.'=)
(")_(")
Sounds like the 50s are Nvidia's last strike.
tbh, I hope they help them out.
"Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit."
Is this saying that they are going with Valve's idea of using DX8 with their cards? Also, are they trying to fool us into thinking there is no point to DX9...?
Might just be me and the fact it's late though.
Also smells of poo, or is that the rubbish in my shoe? This rhymes too.
Seriously, I think it is stupid for them to pretend they still have the better card, and it seems they have opted for number three:
"erm.. shh so we can still trick everyone else into buying a crap card"
I know ATI has had very bad drivers before and has been criticised very heavily for it, even though the current ones are, imo, perfect. But it seems no one cares when it's the other way round: "it will be fixed months after the card's release" - oh, that's fine. I can't believe how blatant the PR marketing crap is here either; they're trying to cover up the fact that the card is crap, and there are too many promises. Nvidia would make good politicians, unfortunately. When we see them own ATI again then they will deserve it, but not before.
Oh, and I think them trying to blame Valve is typical, and very, very bad. I hope Valve drop support for the 5900U; it took them 5x longer to code for than the proper DX9 cards (WHICH IS MAKING US WAIT LONGER, HURRY THE F*K UP), and then they get blamed because Nvidia made a few crap cards (5200-5900U). It isn't Valve's fault that Nvidia's cards run too slowly to DX9 spec; they shouldn't need to make the cards drop to 16-bit FP - ATI can manage it, so why shouldn't Nvidia have to?! Their cards should be stripped of all DX9 logos on the boxes, because it isn't going to be playable in ANYTHING.
Also, the fact that Nvidia mentioned the need to change floating-point precision shows that the problem lies in the core: it can't crunch as many numbers as it needs to, and no matter how many driver updates they get, nothing is going to change except image quality (32-bit to 16-bit FP does count! Although it might not be noticeable, I class it as cheating).
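To put a rough number on that 32-bit vs 16-bit point, here's a quick sketch. It only simulates the half-float mantissa (about 11 significant bits) using frexp/round/ldexp; it's not a full IEEE-754 half conversion and ignores the smaller exponent range, so treat it as an illustration rather than anything the drivers actually do.
[code]
#include <cmath>
#include <cstdio>

// Round a 32-bit float to roughly half-float precision (~11 significant bits
// of mantissa).  Range limits and denormals are ignored; this only illustrates
// the precision gap, it is not a real fp16 conversion.
static float to_half_precision(float x) {
    if (x == 0.0f) return 0.0f;
    int exp;
    float mant = std::frexp(x, &exp);             // x = mant * 2^exp, |mant| in [0.5, 1)
    mant = std::round(mant * 2048.0f) / 2048.0f;  // keep ~11 significant bits
    return std::ldexp(mant, exp);
}

int main() {
    const float values[] = {
        0.7294118f,   // a colour channel in [0, 1]
        1234.5678f    // a larger intermediate value
    };
    for (float v : values) {
        float h = to_half_precision(v);
        std::printf("fp32 %.7f -> ~fp16 %.7f (relative error %.1e)\n",
                    v, h, std::fabs(v - h) / v);
    }
    return 0;
}
[/code]
The relative error comes out around 10^-4, which is below what an 8-bit-per-channel display can show in a final colour, but it can add up once values go through several dependent shader passes - which is roughly the argument on both sides of the fp16 question.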
Yes, I'm pissed off with Nvidia, for these reasons:
I don't own an Nvidia card, but many do, and I feel they were tricked into buying an inferior product.
They are trying to blame Valve for what is essentially a bad product.
This long after release, and having worked with the game coders, they are still very far from perfecting anything (if it even can be), and their promises are 99% BS.
Blatant cheating, again. (Note that ATI have never cheated - not from the info I've found about the incident anyway; someone find a link to something about that.)
And the fact they even bothered to make a PR announcement that takes ages of professional decoding to understand what their PR rubbish means, just like every other f-ing company. PR announcements are like the French language: they are 3x longer than they need to be to say what they would in English. For EXAMPLE!
"Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers."
It's the best? Prove it. Optimisations so far = cheats. Highly programmable? Wow, just like every other DX9 card, I'm amazed. This bit is the best: "feature and performance benefits for over 100 million NVIDIA GPU customers" - because my R9700np still performs better even though it cost less, and their performance cheating can't count as a feature. Lastly, they mention the number of customers they won't have for much longer.
Sorry for the long post; it would be nice if the box to type in was a bit bigger (no, I cba to click New Reply first!).
Last edited by SilentDeath; 12-09-2003 at 02:03 AM.
Originally posted by acidrainy
Sounds like the 50s are Nvidia's last strike.
tbh, I hope they help them out.
Is this saying that they are going with Valve's idea of using DX8 with their cards? Also, are they trying to fool us into thinking there is no point to DX9...?
Might just be me and the fact it's late though.
Not sure; I think it's more to do with on-the-fly optimisation, especially the automatic shader optimiser bit.
If there is a point in the game where PS 2.0 gives no image-quality benefit but takes a performance hit, then it looks like it will automatically switch to PS 1.4 until PS 2.0 is needed..
Not sure how well that works out though; I thought you had to initialise a game for DX8 or DX9 before it starts, unless they've managed some nifty backwards-compatibility thingy in the PS 2.0 code..
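For what it's worth, the DX8/DX9 split doesn't have to be decided once for the whole game; a DX9 title can query the pixel-shader caps and pick a shader version per effect. Here's a minimal sketch of that idea - the caps query is the real Direct3D 9 API, but the effect names and the "needs PS 2.0" table are made up for illustration, not anything from Valve.
[code]
#include <d3d9.h>   // link with d3d9.lib
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    const bool hasPS20 = caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
    const bool hasPS14 = caps.PixelShaderVersion >= D3DPS_VERSION(1, 4);

    // Hypothetical per-effect table: use PS 2.0 only where it visibly matters,
    // and drop to PS 1.4 where the output would look the same anyway.
    struct Effect { const char* name; bool needsPS20; };
    const Effect effects[] = {
        { "water_refraction", true  },
        { "basic_lightmap",   false },
    };

    for (const Effect& e : effects) {
        const char* path = (e.needsPS20 && hasPS20) ? "ps_2_0"
                         : hasPS14                  ? "ps_1_4"
                                                    : "fixed function";
        std::printf("%-18s -> %s\n", e.name, path);
    }

    d3d->Release();
    return 0;
}
[/code]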
But yeah, getting late, brain getting sluggish..
(\__/)
(='.'=)
(")_(")
It will all be in DX9, with the switching done by the drivers. It works in this order:
code > drivers > card
Code = the graphics instructions; the drivers can alter them and change any settings they wish, whenever, and the card will just do what it's told (or overheat and die, in the case of 3D screensavers on the 5800).
When you initialise a rendering context, you use the API to do it; this goes to Windows, then to the card's drivers. Any changes - for example alt-tabbing during a game (on older cards/drivers) - will often corrupt the screen, because it's handled only by Windows and the driver step is missed out. Recent drivers will usually save all the needed info, minimise, and restore it on the way back in. So clever drivers are perfectly able to optimise/cheat on the fly.
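Here's a toy illustration of that code > drivers > card chain. None of this is real driver code and all the names are invented; it just shows how a driver sitting in the middle can quietly hand the hardware a different shader from the one the game asked for.
[code]
#include <cstdio>
#include <string>
#include <unordered_map>

// "Card": in this toy model it just reports what it was told to run.
static void card_execute(const std::string& shader) {
    std::printf("GPU runs: %s\n", shader.c_str());
}

// "Driver": sits between the game and the card, and may swap a recognised
// shader for its own hand-tuned (or lower-precision) replacement.
static void driver_create_shader(const std::string& appShader) {
    static const std::unordered_map<std::string, std::string> replacements = {
        { "water_ps20_fp32", "water_ps20_fp16_handtuned" },  // silent substitution
    };
    const auto it = replacements.find(appShader);
    card_execute(it != replacements.end() ? it->second : appShader);
}

int main() {
    // "Code": the game asks for two shaders; only the recognised one gets swapped.
    driver_create_shader("water_ps20_fp32");
    driver_create_shader("skybox_ps14");
    return 0;
}
[/code]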
Good grief, do we really need fan-boy rantings in here?
Can't we discuss things sensibly like rational adults for once?
Originally posted by SilentDeath
And the fact they even bothered to make a PR announcement that takes ages of professional decoding to understand what their PR rubbish means, just like every other f-ing company. PR announcements are like the French language: they are 3x longer than they need to be to say what they would in English. For EXAMPLE!
Okay, do we really need the country bashing too?
Since when has software optimisation ever been a crime?
It's just making the most out of the resources, and is an entirely sensible programming technique.
Why use 32 bits of data when 16 can do the same (or near enough to be unnoticed) job? Do you not think that ATI optimises its code also? (The 3.6 Cat drivers had major issues in IL-2 FB with certain settings, making it chug horribly at times, but that was then patched in 3.7 and everything is fine.)
As is mentioned in the press release, the 50.x dets will be out before the game is released, so why not wait until both the drivers and the game have been released before posting such a garbled rant?
(\__/)
(='.'=)
(")_(")