There's a huge elephant in the room explaining why even gamers aren't upgrading their PCs regularly.
Along with Windows demanding ever more from a PC, games used to demand ever more performance too - whether that was screen resolution, physics or a plethora of other details.
Then, 5 years ago, the latest incarnation of the consoles came to town. At the time they were at least on a par with high-end PCs; that hasn't been true for several years now. Unfortunately, they remain a huge cash generator for games makers.
Add to that the fact that monitors have stalled at 1080p for roughly the same 5 years, and you get to the bottom of why nothing has moved on.
Games are designed and built for 5-year-old hardware, to be displayed at 1080p.
Nobody is pushing for a performance increase - even the CPUs Intel is producing aren't pushing the envelope. 5 years ago the Q6600 was released: a quad core @ 2.4GHz. Today's equivalent, the 3570K, is a quad core running @ 3.4GHz. I know there are other differences, but the big one is that clock speeds haven't advanced far in 5 years, and the raw computing power on offer today isn't a large enough leap to warrant the expense - so why bother?
Add into that mix that a lot of gamers have looked over the fence and thought "hang on, why am I paying out £x when I can just get a console for £x/2?" - and a lot have jumped ship. Look at the games: 5 years ago titles were released supporting 64 players online (Battlefield 2 unofficially supported 128, with some, at the time, ridiculous hardware requirements); today we're lucky if they support 32 (there are exceptions).
If the big drivers of demand for faster processors (e.g. Windows, games) have disappeared, then why is anyone surprised that there's no market for new, faster processors?