Hmmm, rumour has it AMD will be releasing the Linux drivers as open source too - just how much remains to be seen, but seeing as the previous attempt put me off buying an ATI card, fingers crossed, eh?
Nox
It's not 1.3, it's only 1.2.
David
The bang-for-buck figure is now wrong - HEXUS estimated them at £250 but they're £290-ish, which I don't think is worth it for the performance when the 8800 is only £60 more.
Bang for your buck at that price is 0.694, which makes the 8800 worth getting.
Last edited by optimadam; 14-05-2007 at 03:08 PM.
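As a rough illustration of the bang-for-buck sums being argued about above, here's a small sketch of the arithmetic. It's my own, the framerates and prices are placeholders rather than benchmark results, and the only point is how a frames-per-pound figure and the 0.x ratio quoted above would be worked out:

```cpp
// Hypothetical bang-for-buck arithmetic; all numbers are placeholders.
#include <cstdio>

int main()
{
    // Assumed average framerate and street price for each card (not measured).
    const double fps_2900xt = 60.0, price_2900xt = 290.0; // GBP
    const double fps_8800   = 58.0, price_8800   = 250.0; // GBP

    // Frames per second per pound: higher means better value.
    const double value_2900xt = fps_2900xt / price_2900xt;
    const double value_8800   = fps_8800 / price_8800;

    printf("HD 2900 XT: %.3f fps/GBP\n", value_2900xt);
    printf("8800      : %.3f fps/GBP\n", value_8800);

    // The ratio of the two is the kind of 0.x figure quoted in the post above.
    printf("ratio     : %.3f\n", value_2900xt / value_8800);
    return 0;
}
```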
They aren't £290: the RRP is £249 and Komplett has them for that price. OcUK are pumping prices up, as they're well known for doing, by getting all of the early allocation stock. Scan have them listed for £258, and other places will no doubt be selling them for £250 as well in the next few weeks, possibly even less.
The price in one store, especially one known to be terrible with prices on stuff like this, doesn't mean anything.
The only real issue is how it performs. There are glaringly obvious bugs in the drivers, which tend to show up in various reviews as a 4xAA issue in certain games. I would guess this has something to do with games not knowing which AA method to employ; Nvidia until recently had issues with Vista plus the Source engine, where 4xAA in game would actually choose, I think, 16xAA, which obviously killed performance.
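To make that concrete, here's a minimal sketch - mine, not anything taken from the drivers or the review - of how a Direct3D 9 game can ask the adapter which multisample levels it actually supports and request one explicitly, rather than trusting an in-game "4xAA" toggle that a buggy driver profile might silently remap to a higher mode:

```cpp
// Minimal MSAA capability check under Direct3D 9 (link against d3d9.lib).
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Probe each discrete MSAA level against the back-buffer format we intend
    // to use; only levels that pass here should be offered to the user, and
    // the chosen one should be requested explicitly at device creation.
    for (int samples = 2; samples <= 16; ++samples)
    {
        DWORD quality = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
            FALSE /* full-screen */,
            static_cast<D3DMULTISAMPLE_TYPE>(samples), &quality);
        if (SUCCEEDED(hr))
            printf("%dx MSAA supported (%lu quality levels)\n", samples, quality);
    }

    d3d->Release();
    return 0;
}
```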
Once the bugs are ironed out (monthly driver releases at minimum - someone could learn from that), performance should improve. It's also been shown that complex vertex shader power on this card is almost twice that of an 8800 GTX, while pixel shader power is much lower. The question is how that will affect performance in future games, and whether DX10 is geared towards using one or the other better. I don't know at all.
Drivers, and differences between driver sets, seem to be half at fault for the massively varying reviews around today. There have been some pretty huge jumps forwards (and a couple of bugs pushing things back) between driver sets, so it would certainly appear ATi still have some performance to come.
The BIG question is whether the current drivers, even if slow, are stable on Windows Vista, because 8800s still have major problems in quite a lot of new games on Vista. Performance is great, but useless if it's just being used to give a higher framerate on a bluescreen. No matter how much I try, I can't tell the difference between 3fps and 300fps on the BIOS screen either, but that's just me.
SiM,
It's definitely v1.2. I have a moan about it stating that AMD's not adhering to the latest spec.
drunkenmaster,
It all sounds good on paper, but we note that AMD is only adhering to the v1.2 spec, so it doesn't support later features such as TrueHD and DTS-HD lossless codecs, as well as audio syncing.
We've looked at the HIS bundle and noted that it missed out on connectors. The packaging will probably undergo a revision to ensure that 2 x 6-pin will be included.
The architecture is generally sound, if a little lacking on the texturing front. It will be good to revisit it in a couple of months and see where performance stands.
optimadam,
The RRP is $399 and more than one AIB has told us that it will have a UK RRP of £250. HIS cards are already listed at that price. You'll find the average e-tail price falls to that within a week. Speculators will always try to charge more.
Last edited by Tarinder; 14-05-2007 at 03:50 PM.
Are there going to be any better-performing cards? I heard some rumour about an XTX, or am I mistaken? I wonder what effect this will have on 8800 card prices.
There are no plans to bring a Radeon HD 2900 XTX to market anytime soon, according to the PR folks at AMD.
The reality is that we'll probably see one in the next 3 months, launched as a very limited-run product and designed to draw performance parity with the faster GeForce 8800-series cards.
Tarinder, are you still playing around with the cards? Almost all the reviews seem to have issues with the 4xAA setting. A couple of sites have said that they tried no AA and then most of the steps up of AA, and they all looked extremely similar, and that they got the same performance with 4xAA as they did with 8xAA with wide tent.
Nvidia certainly had this issue with CS:S, at least under Vista (not sure about XP), where setting 4xAA in game was really choosing a far higher level, and that was the reason lots of people were getting 20-30% lower performance in CS:S under Vista. It may only be some games that have the issue, but there are too many reviews showing a significant drop at just 4xAA, then tiny or even no drops at higher AA settings.
In which case most reviews have at least some games not comparing like for like, which is obviously going to hurt the 2900 XT's scores.
It's obviously very difficult to know exactly which games have the issue, if indeed there is an issue at all. But for, say, Far Cry, could you compare the GTS/2900 XT with no AA/AF, 2xAA/16xAF, and 8xAA (wide tent)/16xAF?
The one other question: I guess most reviews of the 8800s when they launched had much closer results, and IQ was discussed at length. It would be good to see the difference between, say, 8xAA normal, 8x wide tent/narrow and so on against a few Nvidia methods. It might be that 8xAA normal gives 95% of the quality of 8xAA wide tent, but at so much higher performance that the extra detail isn't worth it.
One last thing: I've read that you can still overclock, and as far as can be seen to the same levels, with third-party tools with only 2 x 6-pin PCI-E cables connected. Is it only the Overdrive overclocking tool that detects the 8-pin and disables overclocking without it?
EDIT: I've also noticed a slight trend: quite often it can't match the 8800 series at maximum framerates, due to the lower ROP count I would assume, but a lot of reviews have shown some games to have a much higher minimum framerate. So the ROPs are limiting maximum framerates, but the hardware and 320 stream processors are helping keep the minimum framerate up in the incredibly intensive parts of games. If this is an actual trend in many/most games, then in future, very intensive games, while another card that can hit 80fps at points might be unplayable because it frequently dips into the 10-20 range (in maybe an FPS game), the 2900 XT might be able to keep a steady 30fps. Which might be the way to go? Minimum framerates are normally far more important to the playable experience than maximums.
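To put a number on that argument, here's a quick sketch - my own, using made-up frame times rather than anything measured - of why the minimum matters: the slowest frame in a run sets the minimum FPS, and that's the figure that decides whether a game feels playable:

```cpp
// Convert per-frame render times (ms) into average and minimum FPS.
#include <algorithm>
#include <cstdio>
#include <vector>

int main()
{
    // Hypothetical frame times captured during a demo run, in milliseconds.
    const std::vector<double> frame_ms = { 12.5, 13.1, 14.0, 55.0, 60.2, 13.4, 12.8 };

    double total_ms = 0.0, worst_ms = 0.0;
    for (double ms : frame_ms)
    {
        total_ms += ms;
        worst_ms = std::max(worst_ms, ms);
    }

    const double avg_fps = 1000.0 * frame_ms.size() / total_ms;
    const double min_fps = 1000.0 / worst_ms; // slowest frame sets the minimum

    // A card averaging high FPS but dipping hard feels worse than one that
    // holds a steady, lower framerate, which is the point being made above.
    printf("average: %.1f fps, minimum: %.1f fps\n", avg_fps, min_fps);
    return 0;
}
```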
I think there might be a heck of a lot of hidden potential in this card if and when the bugs are ironed out.
D'oh, edit again: could you check, or do you happen to know, the size inside the shim on the card? Or how big the difference is between the height of the shim and the core? Yup, already thinking bigger cooler or water. Can you please tell one of these companies to make a decent cooler for graphics cards? Thermalright finally has the right idea: almost no one has PCI cards or SLI/CrossFire fitted, which leaves masses of space for huge, silent cooling underneath the card that no one is taking advantage of. Thermalright's main issue, though, is their habit of bringing out compatible cooling for a new product mere weeks before the next big thing is out.
Blowers = tiny airflow, an awful airflow-to-noise ratio, and they force that tiny airflow through a bottlenecked PCI slot while covering half the slot anyway. This is why a puny and honestly quite crap old-style Zalman cooler often improved X1950 XTX cooling despite being tiny compared to the stock sink. Not sure where they got the idea that blowers and tiny air pathways lead to good cooling; they don't.
Last edited by drunkenmaster; 14-05-2007 at 10:50 PM.
If that were true, I would switch my allegiance in a heartbeat.
I have been avoiding ATI for years because of the sucky quality of their graphics drivers (and stories of even worse chipset drivers), but to be honest nVidia haven't been that great either; they are just a lot easier to install. If either of the major graphics card companies released true open source drivers that could be properly integrated into Linux distros and the kernel (instead of the unstable hacks currently used), they would win a great number of fans, and a near-fanatical following from that segment of the community.
Hmm, checking out the Guru3D review, it has two different views of the ATI card. For the first half of the review they keep using 4xAA, even after the very first benchmark shows that, I believe, 4xAA is defaulting to the highest quality mode, since whatever 4xAA setting they use gives the exact same performance number. They then go on to imply the card sucks in other games but still use the 4xAA setting, which again is most likely really running 8xAA, so it isn't a like-for-like comparison.
The thing that gets me is that they specifically point out, in their very first benchmark, that 4xAA has an issue, and then go on to ignore it. If it's only in some games - in most of which the card performs fairly badly, close to the X1950 XTX - then performance might actually be fairly good in those games if the 4xAA mode is fixed, or if they benchmarked the other cards in an equal 8xAA mode.
And if the 4xAA mode is broken in all games, then in the results where the 2900 XT matches or beats the GTX in their review, maybe it's really spanking the GTX in those games. To be fair, I think that's highly unlikely, but it seems fairly obvious which games could be flawed, and considering they've highlighted it right at the beginning, why not run the benchmarks without AA and see how the 8800/2900 XT look, or try the 8xAA setting and see how they stack up?
Drunkenmaster,
I think the most telling thing is AMD's own internal benchmarks, obviously showing off the HD 2900 XT in the best possible light, beating out a GeForce 8800 GTS 640 by around 10-20 per cent. However, and this is the important aspect to remember, AMD doesn't compare its performance against either GTX or Ultra.
Now, if AMD could 'engineer' the benchmarks in its favour and knew the 2900 XT could give GTX a real run, we would have seen those graphs in an internal document.
We overclocked using a secondary 8-pin connector on an Enermax PSU. Our numbers tend to be in line with AMD's own.
It's a good architecture, as I point out, and it will take a month or two for AMD to really get up to driver speed. It's worth revisiting then, under Vista and with DX10 gaming.
I'll find out the size of the shim later on today.
Last edited by Tarinder; 15-05-2007 at 09:10 AM.
what happened to the GDDR4 version?
YorkieBen,
GDDR4 was slated to be introduced with the XTX.
Our best guess is that AMD will handpick GPUs in a couple of months, run them at, say, 900MHz core, add in some 2GHz+ GDDR4 memory and release the cards as the XTX.
You may need to get the April DX update for Vista to get Lost Planet running. Found that one out this morning!