It's hard to explain, but first you have to understand more about the vast differences in software and hardware architectures. The way consoles are designed differs a lot from the way a PC is designed.
For instance, a PC has an entire layer of software just for drivers, because the operating system has to support hardware from many different vendors. Cutting out that driver layer removes a lot of overhead. A console does not have modular, interchangeable hardware, so the data going to and from the graphics card does not have to pass through a generic software layer such as a driver stack.
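To make that concrete, here is a minimal, purely hypothetical sketch in C of the two paths. None of the functions or the DRAW opcode are real APIs; they just illustrate a draw call crossing an API/driver boundary on a PC versus a game writing commands straight into a GPU-visible command buffer on a console.

```c
#include <stdint.h>
#include <stdio.h>

/* --- PC-style path: each call crosses an API/driver boundary (hypothetical) --- */
static void api_validate(uint32_t mesh_id)    { printf("API: validate mesh %u\n", mesh_id); }
static void driver_translate(uint32_t mesh_id){ printf("Driver: translate mesh %u\n", mesh_id); }
static void driver_submit(void)               { printf("Driver: submit to GPU\n"); }

static void pc_draw(uint32_t mesh_id)
{
    api_validate(mesh_id);      /* user-space graphics API          */
    driver_translate(mesh_id);  /* vendor driver, often a kernel hop */
    driver_submit();            /* finally reaches the GPU          */
}

/* --- Console-style path: write directly into a command buffer (hypothetical) --- */
typedef struct { uint32_t opcode, mesh_id; } gpu_cmd;

static void console_draw(gpu_cmd *buf, uint32_t *cursor, uint32_t mesh_id)
{
    buf[*cursor].opcode  = 0x1;      /* made-up DRAW opcode          */
    buf[*cursor].mesh_id = mesh_id;  /* no driver layer in between   */
    (*cursor)++;
}

int main(void)
{
    gpu_cmd buffer[16];
    uint32_t cursor = 0;

    pc_draw(42);                        /* several layers per call   */
    console_draw(buffer, &cursor, 42);  /* one direct write          */
    printf("Console path queued %u command(s)\n", cursor);
    return 0;
}
```

The point isn't the specific calls, which are invented; it's that every extra layer in the PC path costs validation, translation, and transitions that the console path simply doesn't have.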
It's the way the software and hardware come together as a whole. Individually, consoles and PCs have a lot of similarities, but as complete systems the two are miles apart. A good example is the kind of graphics performance that can be achieved on an Xbox 360. Put an equivalent graphics card in a PC and you will not get comparable results.
As for whether PCs could also get better through software alone: they most certainly can. The problem is they rarely do. Because hardware advances so fast in the PC world, developers don't really need to optimize their code to stretch the capabilities of older systems. They can simply assume that if you want better performance, you are going to upgrade to faster hardware that can handle their code.
On the other hand, console developers know the hardware will never change. The only way a console's performance will improve is through software optimization. A PC developer can do the same, but they simply aren't focused on the same areas as a console developer.
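One small, hedged illustration of what "the hardware will never change" buys you: tuning constants like cache size can be hard-coded and optimized once, instead of being queried or guessed at on every user's PC. The numbers below are made up, not real console specs.

```c
#include <stddef.h>

#define L2_CACHE_BYTES (512 * 1024)                        /* assumed fixed on this hardware */
#define BLOCK_FLOATS   (L2_CACHE_BYTES / sizeof(float) / 2)

/* Normalize each cache-sized block by its own maximum. The second pass over
 * a block hits data that is still resident in cache, because the block size
 * was sized to the known, never-changing hardware. */
void normalize_blocks(float *data, size_t n)
{
    for (size_t start = 0; start < n; start += BLOCK_FLOATS) {
        size_t end = (start + BLOCK_FLOATS < n) ? start + BLOCK_FLOATS : n;

        float max = 0.0f;
        for (size_t i = start; i < end; ++i)   /* pass 1: find the block maximum */
            if (data[i] > max) max = data[i];

        if (max == 0.0f) continue;
        for (size_t i = start; i < end; ++i)   /* pass 2: reuse the still-hot data */
            data[i] /= max;
    }
}
```

A PC developer can't rely on a single cache size, core count, or GPU, so this kind of hand-tuning is either skipped or done generically, which is exactly the gap the console developer exploits.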
It's not really easy to grasp all of this without a working knowledge of operating system kernels and of how computing systems work down at the hardware level. I tried my best to explain it in simpler terms.