Most TFTs expect an input of around 60 Hz. Refresh rate is not an indication of how many FPS a screen can display, only of how many times a second the image is refreshed, including frames that haven't changed. Most TFTs ask for a 60 Hz signal purely because it's easier for them to manage. Refresh rate doesn't mean the same thing here as it does on a CRT; on a TFT, pixel transition time is the important factor, and it's what limits how many FPS you can 'see' (or, to be more accurate, how many you can perceive to be happening).
Disable V-sync (even on a TFT!) and you can get into the 100 fps range in whatever game you choose, provided your system can manage it.
You seem to be confusing how many FPS a game can process with how many a screen can display; they are not the same thing (and I never claimed they were).
A game will feel smoother at 100 fps than at 30 fps, even on a TFT, because the game is calculating scene changes more often. This means that when the monitor next refreshes the displayed image, it shows a more up-to-date picture of what is actually happening.
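To put rough numbers on that, here's a minimal sketch (my own illustration, not anything from the post above): if the game and the panel aren't phase-locked, the frame a 60 Hz panel grabs at refresh time is, on average, about half a game-frame old, so rendering faster than the refresh rate still makes the displayed scene less out of date.

```python
# Idealised back-of-envelope: average "age" of the newest completed game
# frame at the moment the panel refreshes, assuming the game's frame times
# and the panel's refresh ticks are not synchronised (uniform phase).
def average_frame_age_ms(game_fps):
    frame_interval_ms = 1000.0 / game_fps
    return frame_interval_ms / 2.0  # on average, half a frame old

for fps in (30, 60, 100):
    print(f"{fps:>3} fps -> displayed scene is ~{average_frame_age_ms(fps):.1f} ms behind on average")
# 30 fps -> ~16.7 ms behind, 60 fps -> ~8.3 ms, 100 fps -> ~5.0 ms
```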
Edit - If you look at the pixel transition time of any TFT, you can work out the theoretical maximum rate at which it can show a full scene change, which is the nearest thing you can get to "how many FPS you can see on it" (although I can't stress enough how inaccurate that actually is). Of course, the transition blur between two scenes also helps to show any scene change, regardless of the FPS it's being calculated at, which all helps the illusion of animation/video.
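For what it's worth, that "theoretical maximum" is just 1000 divided by the quoted transition time in milliseconds. The figures below are example numbers I've picked for illustration, not specs of any particular panel, and real panels vary a lot depending on which transition is being measured, so treat it as a ballpark only.

```python
# Rough arithmetic sketch: a panel quoting a T ms pixel transition can
# complete at most about 1000 / T full scene changes per second.
for transition_ms in (25, 16, 8, 4):  # example grey-to-grey figures
    max_full_changes = 1000.0 / transition_ms
    print(f"{transition_ms:>2} ms transition -> ~{max_full_changes:.0f} complete scene changes per second")
```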