Originally Posted by |SilentDeath|
http://download.microsoft.com/downlo...iquid_1080.exe
there you go!
http://www.microsoft.com/windows/win...tShowcase.aspx
It's balls!!
Nothing is working, and I ain't happy. I can't even watch DVDs without strange lines coming across the screen!! I am not happy!
It ruins the whole thing; most obviously, the picture is ALWAYS blurred around the edges!
And using Nvidia's PureVideo program, the video plays fine, but it just won't play the sound coherently!
Mac fancier > white macbook base spec .................. CS: muddyfirebang
Step Into Liquid in particular seems to drop frames, but 1280x720 movies all play without a hitch at 60-70% CPU, though the lazy 6800 still does nothing for me on DivX HD.
Is it possible to just disable the built-in decoder and let the CPU do it? Isn't that the way all the other graphics cards work?
Surely this would give better performance.
How do you do that?
I'll compare the performance!
Mac fancier > white macbook base spec .................. CS: muddyfirebang
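(For anyone who actually wants to run that comparison, here's a rough way to log CPU load during playback. This is a minimal sketch assuming Python with the psutil package installed; the player command and clip path are placeholders, not anything from this thread.)
[code]
# Rough CPU-load sampler for comparing playback runs.
# Assumes Python with the psutil package; the player command and
# clip path below are placeholders, not anything from this thread.
import subprocess
import psutil

def average_cpu_during(command, samples=30):
    """Launch a player and record whole-system CPU % once per second."""
    player = subprocess.Popen(command)
    readings = []
    for _ in range(samples):
        # cpu_percent(interval=1) blocks for one second per reading
        readings.append(psutil.cpu_percent(interval=1))
    player.terminate()
    return sum(readings) / len(readings)

# Run once with hardware acceleration enabled in the player and once
# with it disabled, then compare the two averages.
avg = average_cpu_during(["wmplayer.exe", r"C:\clips\Step_into_liquid_1080.wmv"])
print(f"average CPU during playback: {avg:.1f}%")
[/code]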
That's exactly what happens now: the CPU takes over the decoding when the GPU can't cope. But the GPU can't cope any of the time (because it doesn't work properly), and that's why people are getting high CPU utilisation.
Originally Posted by Timmy!!!
I don't mean to sound cold, or cruel, or vicious, but I am so that's the way it comes out.
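(A toy sketch of that take-over behaviour, for illustration only. The two decode functions are made-up stand-ins, not Nvidia's or Microsoft's actual code.)
[code]
# Toy illustration of the accelerate-or-fall-back pattern described
# above. NOT real decoder code; both paths are hypothetical stand-ins.
class HardwareDecodeError(Exception):
    pass

def decode_on_gpu(frame):
    # Stand-in for the DXVA/GPU path; on these cards it never copes.
    raise HardwareDecodeError("GPU decoder not functional")

def decode_on_cpu(frame):
    # Stand-in for the software path; correct, but burns CPU.
    return b"decoded:" + frame

def decode_frame(frame):
    try:
        return decode_on_gpu(frame)
    except HardwareDecodeError:
        # When the GPU path is broken, this branch runs for every single
        # frame, which is exactly why CPU utilisation stays so high.
        return decode_on_cpu(frame)

print(decode_frame(b"frame-0"))
[/code]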
Originally Posted by mnet674
Well, I'm the lucky one... there are no dropped frames on my rig!
Also, Maximum PC magazine has officially questioned Nvidia, and they claim that while the GPU will never properly handle DivX and WMV9, it does handle the other types with a future driver release...
Oh, OK, sorry.
Originally Posted by Mblaster
So the problem is that the built-in video decoder is not doing its job. Then why can't Nvidia just make the GPU decode the video instead? This is how my ATI graphics card works, is it not? (I think the GPU and the video decoder are two separate things, if I am not mistaken.)
DivX and WMV9 aren't necessarily the problem, because current DivX and WMV9 decoding doesn't generate enough of a performance overhead in most PCs to cause a problem in performance terms. It's the HD stuff you need to be concerned about, because high-definition video is on its way. Nvidia have shown, time and time again, that they're prepared to make statements about their products which simply aren't matched by reality. I'd hazard a guess that you will never see fully functional HD decoding until the next generation of Nvidia cards is released, which will coincide with the wider adoption of HD video. Nvidia have shown that they will do anything to move product; properly enabling video decoding on the current generation of cards does nothing but have an adverse impact on future product sales. Nvidia's comments (if you've reported them correctly) about 'other types of video' are vague to the point of being meaningless.
Originally Posted by myth
Just for anyone new to this issue: there is absolutely no video acceleration of any type of HD video, either with or without Nvidia's DVD player, under any driver release (up to 71.24) on the 6800/GT/Ultra cards.
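(A rough back-of-envelope on why HD is the worry. The frame sizes and rates below are typical values, not figures from this thread.)
[code]
# Pixels decoded per second, typical SD vs HD material
sd      = 720 * 480 * 30    # SD at 30 fps:    ~10.4 million px/s
hd_720  = 1280 * 720 * 24   # 720p at 24 fps:  ~22.1 million px/s
hd_1080 = 1920 * 1080 * 24  # 1080p at 24 fps: ~49.8 million px/s
print(hd_1080 / sd)         # roughly 4.8x the decode work of SD
[/code]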
Last edited by davidstone28; 06-01-2005 at 03:02 AM.
Has anyone tried playing the videos on a 4MB PCI card without any hardware GPU-based decoder? Although it's probably not possible, given the WMV9 decoder's dependence on hardware DirectX, it would indicate the performance required to decode the video on a CPU. Even at high resolutions, modern CPUs should be able to handle MPEG-4 decoding, unless the bit rate has been squeezed so small that the decoder has to do a lot of reconstruction.
GK
Keeper of the Gates of Hell
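(One way to get that number without digging out old hardware: time a software-only decode to nowhere. A sketch assuming ffmpeg is installed, on PATH, and able to decode the clip; the path is a placeholder.)
[code]
# Time a pure-CPU decode by decoding to a null sink.
# Assumes ffmpeg is on PATH and can handle the clip; path is a placeholder.
import subprocess
import time

clip = r"C:\clips\Step_into_liquid_1080.wmv"

start = time.monotonic()
subprocess.run(["ffmpeg", "-i", clip, "-f", "null", "-"], check=True)
elapsed = time.monotonic() - start

# If elapsed is well under the clip's running time, the CPU alone keeps
# up; if it is longer, software decode cannot sustain real time.
print(f"decoded in {elapsed:.1f}s")
[/code]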
That Step Into Liquid video is jerky as hell on my machine.
HEXUS|iMc