I see 3D TV taking off a long time before 3D games. Having used the primitive iZ3D anaglyph system myself in games, I can say that, the colour issues inherent to anaglyph and the performance drop of its software rendering aside, it makes games look bad. The way 3D emphasises certain textures makes their quality stand out far more than usual, and it's hideous to look at. This was in Left 4 Dead, too, a game widely praised for having decent-quality textures. Should graphics developers do what I've been wanting for years, stop messing around with lighting techniques and start actually making better textures, 3D games might make far more sense. For now, though, between the doubled performance requirement of interlacing (assuming this does become the standard method), the general nastiness of how current games look in 3D, the need for per-game developer support and the rest, 3D games are a long, long way off.
3D TV makes far more sense: no scabby textures to highlight, just pre-rendered beauty and so on. Couple this with the fact that TVs can already run at the required refresh rates and beyond, while most PC monitors cannot.
3D as a whole, however, needs standardisation. The grand irony is that while Nvidia blab on about it being one of their centrepieces, their push is doomed to failure until it stops being an Nvidia centrepiece and everybody is selling it. The same goes for PhysX and all the rest.
Eyefinity, on the other hand, is the opposite scenario: ATI were wise to get the leg-up here, as it doesn't require specific game support and is useful even on the desktop, let alone in games. I'm still an Eyefinity sceptic for gaming, though, as it shares 3D's problem of highlighting bad textures, this time because they're stretched vastly beyond their normal size, and there's the issue of bezels on top. For desktop programs and strategy games, it's amazing. For racers with three screens, it should work well. For FPS games? No thanks.