The myth here is that if your CPU is running at 100% load in a game and you're only getting, for example, 40 FPS, you can simply upgrade your video card to get 60 FPS and then your CPU will run at less than full load. I found out through a $250 upgrade that this isn't how it works. If you're getting a low framerate and your CPU is already at full load, it's the CPU you should look into upgrading. I should elaborate more on that in my post.
An FX-4100 is plenty good for gaming right now. This is how I see it:
If a game can run at 100 FPS on minimum settings, your CPU is good enough. If you then max the game out and get 60 FPS, your CPU load will drop to whatever it takes to feed the game at 60 FPS, while your graphics card strains to hold that framerate. This is, of course, just a theory, and not backed up by anything.
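To put some made-up numbers on that theory (just a toy sketch in Python, not measured from any real game): a frame has to wait for both the CPU and the GPU, so frame time is roughly whichever of the two takes longer.

# Toy model: frame time is limited by the slower of CPU and GPU work per frame.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=25, gpu_ms=10))    # CPU-bound: ~40 FPS
print(fps(cpu_ms=25, gpu_ms=5))     # faster GPU, still ~40 FPS
print(fps(cpu_ms=10, gpu_ms=16.7))  # GPU-bound: ~60 FPS

In the last case the CPU only needs 10 ms of each ~16.7 ms frame, which is why its load drops below 100% once the graphics card becomes the limit, and in the first two a better GPU buys you nothing.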
All of this only applies if you are getting unplayable framerates at low settings. While running a game, use Windows Task Manager in graph mode to see what your CPU is running at.
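If you'd rather log numbers than eyeball the Task Manager graphs, here's a rough Python sketch using the psutil package (my own example, not something from the game or Windows itself); per-core load matters because a game can peg one core while the overall figure still looks low.

import psutil  # third-party: pip install psutil

# Log per-core CPU usage once a second for ~30 seconds while the game runs.
for _ in range(30):
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    print(" ".join(f"{p:5.1f}%" for p in per_core))
# If one or more cores sit near 100% the whole time, the game is
# likely CPU-bound on this machine.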
<EDIT> Also, LOL @ the name ch1ck3nch4s3r. I get the reference