Some people can tell the difference. Doesn't change the fact that cinemas use 144Hz refresh, which is why they're more popular.
Just fine.
People with one eye will still see the same photo and the same details and the same depth cues... because it is 2D. The content doesn't change in any way.
The only time having one eye makes a difference is stereoscopic acuity, which needs two eyes; for everything else, someone with one eye relies on exactly the same monocular depth cues they'd use with a flat photo.
What they are filmed at and what they are shown at are two different things. The standard rate is 24fps, which is then (at least) doubled to 48fps for 3D. Most TVs can manage 60Hz and old CRTs were around 72Hz, IIRC. 24 (per eye) is all you need for persistence of vision, though, and shooting higher can rob you of that much-needed motion blur, in effect simulating too high a resolution, thus flooding you with too much detail and 'over-immersing' you. This, combined with the usual focal mis-adaptation and framing errors, will more than likely lead to headaches.
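If you want to sanity-check those numbers, here's a rough Python sketch. The 24fps base rate is from the post above; the triple-flash projection scheme (each frame shown three times per eye) is my assumption about a typical digital 3D setup, so treat the 144Hz result as illustrative rather than gospel:

    # Rough arithmetic: how a 24fps 3D film turns into the rates quoted here.
    BASE_FPS = 24            # standard capture rate per eye (from the post above)
    EYES = 2
    FLASHES_PER_FRAME = 3    # 'triple flash' - an assumption about typical digital 3D projectors

    encoded_fps = BASE_FPS * EYES                   # 48 distinct frames per second
    projector_hz = encoded_fps * FLASHES_PER_FRAME  # 144 flashes per second at the projector

    print(f"Encoded rate: {encoded_fps} fps")       # -> 48
    print(f"Projector refresh: {projector_hz} Hz")  # -> 144

Run it and you get 48fps encoded and 144Hz at the projector, which is presumably where the 144Hz cinema figure mentioned earlier comes from.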
Ah, so giving us 4K will give us much more information and make the image better at our 24fps, yes?
Nope - 4K is around 8.8 megapixels (depending on the standard). NASA pegs the human eye at 576 megapixels, but then dials it back down to just 7 useful ones, with average foveal vision covering only about 2° from centre. We are getting 8K soon and 16K is already on the cards. That's 132 megapixels, of which only about 7 do you any good.
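Quick back-of-the-envelope check on those megapixel figures (the resolutions below are the usual UHD/DCI pixel grids; the 7-megapixel foveal number is just the NASA figure quoted above, not something this sketch derives):

    # Pixel counts for the resolutions being argued about.
    resolutions = {
        "4K UHD": (3840, 2160),
        "4K DCI": (4096, 2160),
        "8K UHD": (7680, 4320),
        "16K":    (15360, 8640),
    }
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h / 1e6:.1f} megapixels")
    # -> 8.3, 8.8, 33.2 and 132.7 megapixels respectively,
    #    of which only ~7 land in the ~2 degree foveal region (per the figure above).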
But it makes no difference how high a resolution you put on the screen. Put up two million megapixels if you like - framerates and refreshes, along with various filming techniques like motion blurring, are what define how readily a sequence of still images is perceived as smooth motion by the average human eye, and your brain has had up to 6 decades of watching footage at about 25 or 30fps, interlaced, on 50Hz or 60Hz screens.
Resolution has nothing to do with it and even at 120fps, nay 300fps, there will be some people who can see 'through the frames'. Again, we're going back to filming and displaying correctly within an acceptable range for what is considered the average viewer.
If it's THAT bad that 72fps per eye is unwatchable for you, then I assume old CRTs were so flickery that you lived with a permanent migraine, or that you're just exceptionally perceptive. Fighter pilots have identified single-frame images flashed at the equivalent of over 200fps - are you a fighter pilot?
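For a sense of scale, here's how long a single frame actually sits on screen at the rates being thrown around, next to the roughly 1/220th-of-a-second flash usually cited for that fighter-pilot test (the 1/220s figure is my recollection of the commonly quoted version of the study, so take it as an assumption):

    # Frame duration at various rates vs. the ~1/220s flash from the oft-cited pilot study.
    for fps in (18, 24, 48, 72, 120, 300):
        print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
    print(f"1/220 s flash -> {1000 / 220:.1f} ms")
    # 18fps ~55.6ms, 24fps ~41.7ms ... 300fps ~3.3ms; the 1/220s flash is ~4.5ms,
    # so 'identifying a single frame at over 200fps' means registering an image
    # shown for under 5 milliseconds.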
Sounds more like another combination of factors, both medical and filmographical.
The average human can perceive smooth motion at just 18fps, if it's filmed properly.
Apparently Blu-ray 3D is encoded slightly higher than 48fps...