A (very) brief history of computer advancement
Much of the history of personal computers has been one of radical, advanced technologies appearing (Apple I / II, Macintosh, Amiga, NeXT, BeOS), then being copied by competitors, especially the now ubiquitous Wintel PC. The pace of development has slowed in the last decade; we still see constant advancement and innovation of sorts, but it is the incremental drip-feed of evolution. Who in this day and age is building computers as radical, as advanced, as those platforms?
Just as cars have engines, so does the technology industry. Its engine is not software developers or hardware designers; it is the semiconductor process engineers who have driven this industry for the past four decades. Gordon Moore noticed their progress back in the 1960s in what became known as "Moore's law", and they have been going solidly ever since. Perhaps I should call them "magicians": how many other fields of human endeavour have seen such advancement in so short a time? They get little credit for it, but without them none of the other advances would have taken place; none of the above platforms or their technologies would even exist.
If it wasn't for them we would not have the ever faster computers we are used to, nor the continuing advances in technology. Most of the performance gain in CPUs is not the result of advanced architectures; it is the result of ever finer process geometries: transistors get smaller, their threshold voltages are lowered, and as a result their switching speed goes up. Without this continuing advancement we would still be using CPUs whose clocks were measured in kHz, not MHz or GHz.
The same progress has led to ever higher capacity and ever lower cost memories. I have an Amiga A1000 on this desk; it came as standard with 256 kilobytes of RAM. Beside it is a digital camera which can take a memory card with a capacity 32,000 times higher.
Because of ever more powerful semiconductor technologies we can have ever more advanced CPUs and ever higher memory capacities, and because of these, ever more sophisticated software can be written: faster, friendlier and more capable with every generation. Unfortunately, however, there is a problem: one day, sooner or later, the engine is going to stop.
The Engine Stops
One day you are going to walk into a computer shop two years running to buy the fastest CPU, and both times you'll get the same CPU: Intel, AMD and IBM will not have produced a faster one. That may be inconceivable today, but it will happen. It may be 20 years away, maybe even longer, but it will happen.
It's been predicted for many years, but time and again the process engineers have found ways around the road blocks and kept going. Eventually even they are going to hit immutable physical limits. Eventually we will all find out that Moore's law is not a law after all.
Of course they will keep trying. Perhaps we will see the number of layers increased so chips can be built upwards; this is already done to a degree today, but could you double the number of transistors by building upwards? I bet they'll try.
They'll try different chemicals and techniques as well, no doubt. Even after Moore's law stops, chips will keep advancing. The Cray 3 used gallium arsenide instead of silicon for its transistors; perhaps they'll try that and we'll see power budgets going completely bananas: the Cray 3 used 90,000 watts!
On the other hand, other technologies sitting on the sidelines today could come to the fore. It's astonishing what's out there even now: using superconductor technology, an 8-bit CPU operating at 15.2 GHz has recently been developed. What's more, despite its stratospheric clock speed it uses just 0.0016 watts: 18,000 times less power than Intel's low-power Pentium M, or a colossal 56,000,000 times less power than the Cray 3.
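Those ratios are easy to sanity-check from the quoted figures alone (only the 0.0016 W and 90,000 W numbers come from the text; the Pentium M wattage shown is simply what the 18,000x ratio implies, not a measured specification):

```python
# Sanity check of the power ratios quoted above.
superconductor_cpu_w = 0.0016   # 8-bit superconductor CPU at 15.2 GHz
cray3_w = 90_000                # quoted power draw of the Cray 3

# Ratio of the Cray 3's power to the superconductor CPU's power:
cray3_ratio = cray3_w / superconductor_cpu_w
print(f"Cray 3 draws {cray3_ratio:,.0f}x more power")       # ~56,250,000x

# The claimed 18,000x ratio implies a Pentium M power draw of:
implied_pentium_m_w = superconductor_cpu_w * 18_000
print(f"Implied Pentium M draw: {implied_pentium_m_w:.1f} W")  # 28.8 W
```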
One thing that could make a very big difference would be a way to cut the price of manufacturing chips. An unfortunate side effect of Moore's law is the ever increasing cost of manufacturing plants (known as fabs), and these costs may kill off progress long before the physical limits are ever hit.
Progress could come to a sudden halt, brought on by simple physical impossibility, or, more likely, the engine will stutter out. Either way the engine will stop, and from that day on computing will be in a different world.