Is this the end, or just the beginning?
Ironically, the end of Moore's law could actually be a good thing. All this advancement has allowed software developers to get lazy; the art of efficient programming has all but died, and abstractions and easy programming techniques rule these days. Did you know it was possible to have a GUI on an Apple II? That it is possible to do real-time 3D on a Commodore 64? When programmers are restricted, they can solve seemingly impossible problems. When Moore's law runs out, they are going to hit exactly these sorts of problems.
Modern desktop computers are extraordinarily inefficient. We could be in for a programming renaissance as developers find ways to fight the bloat and bring the computing power already present in modern systems to the surface.
But this is evolution, not revolution. A revolution will come, and it will come when a number of powerful technologies combine to create something new but strangely familiar, something that today we wouldn't recognise as a computer.
Software becomes hardware
There are other ways of improving performance than better programming; moving software functions into hardware is one of them. A future CPU could contain a set of general-purpose cores, special-purpose cores and a big FPGA.
FPGAs are probably the future of hardware computing: general-purpose hardware is too slow, while special-purpose hardware is inflexible and takes up room. FPGAs cut between these lines, allowing very high-speed computation while keeping the advantage of being reprogrammable.
We don't know when FPGA hardware will replace high-speed CPUs (it's been predicted for years), but when the engine stops we will find the advances it has provided us with are quite enough to be getting on with. The rapid progress made in the last 40 years will allow the next stage of the computer revolution to begin.
An FPGA-based computer will be nothing like anything on your desktop today. There are no registers or pipelines as in a normal CPU; in fact there's not much more than a big bunch of gates, and you have to figure out how to get them to do your work. Sure, there are tools available, but these are really for electronic engineers, not programmers; if you like a challenge...
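To see what "a big bunch of gates you configure" means, here is a toy software model of a single FPGA lookup table (LUT). This is only a sketch for illustration: a real FPGA wires thousands of such cells together through a programmable routing fabric, and the types and functions here are invented, not any vendor's API.

```c
#include <stdint.h>

/* Toy model of one 4-input FPGA lookup table (LUT): it is nothing
   more than a 16-entry truth table, and "programming" it just means
   filling in that table. */
typedef struct {
    uint16_t truth;  /* one output bit per input combination */
} lut4;

/* Configure the cell with a truth table. */
static void lut4_program(lut4 *l, uint16_t truth) { l->truth = truth; }

/* Evaluate: the 4 input bits form an index into the truth table. */
static int lut4_eval(const lut4 *l, unsigned inputs /* 0..15 */) {
    return (l->truth >> (inputs & 0xFu)) & 1;
}
```

Programmed with the truth table `1u << 15` the cell behaves as a 4-input AND gate; reprogram the very same cell with `0xFFFE` and it becomes a 4-input OR. That reprogrammability is the whole point.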
FPGAs make our computers more powerful than ever and a lot more difficult to program. Just how exactly does a (insert language of choice) programmer write software for an FPGA? Luckily there are more familiar languages that can be used, such as Stream-C, so you don't have to go learning hardware description languages such as Verilog or VHDL.
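To give a flavour of what such a tool works on, here is the kind of inner loop a C-to-hardware compiler can map onto a pipeline: fixed bounds, no pointer chasing, one result per input sample. This is ordinary C written as an illustration; Stream-C's actual syntax for declaring streams and processes differs.

```c
#include <stdint.h>

#define TAPS 4

/* 4-tap moving-average FIR filter over a sample stream. A hardware
   compiler could unroll the inner loop into parallel adders and
   pipeline the outer loop, producing one output per clock. */
static void fir_stream(const int16_t *in, int16_t *out, int n) {
    for (int i = 0; i < n; i++) {
        int32_t acc = 0;
        for (int t = 0; t < TAPS; t++)  /* unrolls into parallel adders */
            acc += (i - t >= 0) ? in[i - t] : 0;
        out[i] = (int16_t)(acc / TAPS);
    }
}
```

On a CPU this runs one multiply-free accumulation at a time; in hardware the same loop body becomes a fixed circuit fed by the stream.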
The technology for programming FPGAs will move into the mainstream. Rather than having software libraries to do everything, you will have hardware libraries: when a specific problem is encountered, the CPU will program the relevant library into hardware and use it, boosting performance into the stratosphere.
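A "hardware library" of this kind might look something like the sketch below: each entry names a function and carries a bitstream the CPU could load into the FPGA, plus a software fallback for machines without one. Every name and type here is invented for illustration; no real FPGA toolchain API is being used.

```c
#include <stddef.h>
#include <string.h>

typedef int (*soft_fn)(int, int);

/* One entry in a hypothetical hardware library. */
typedef struct {
    const char *name;
    const unsigned char *bitstream;  /* would be loaded into the FPGA */
    soft_fn fallback;                /* used when no FPGA is present  */
} hw_lib_entry;

static int add_soft(int a, int b) { return a + b; }

static const hw_lib_entry hw_lib[] = {
    { "add", NULL /* bitstream elided */, add_soft },
};

/* Look the function up; with no FPGA attached, run the software
   fallback. A real system would load the bitstream and dispatch to
   the configured hardware instead. */
static int hw_call(const char *name, int a, int b) {
    for (size_t i = 0; i < sizeof hw_lib / sizeof hw_lib[0]; i++)
        if (strcmp(hw_lib[i].name, name) == 0)
            return hw_lib[i].fallback(a, b);
    return 0;  /* unknown function */
}
```

The calling code never needs to know whether `hw_call("add", ...)` ran in silicon or in software, which is exactly what would let such libraries slide into the mainstream unnoticed.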
Computers will become adaptive, optimising themselves for the task you are doing. Want to encode a movie to MPEG? Your computer will transform itself into a giant MPEG encoder, then tear through the entire movie as fast as the storage system can deliver it. When FPGA-based computing becomes widespread, who will need fast general-purpose processors any more?
However, learning to program FPGAs isn't something we are going to be worrying about, because we are not going to be doing the programming. Someone, or rather something, else is.
Artificial Intelligence is a long-held goal of scientists and science fiction authors alike; it's got its ticket and has boarded the bus. The road may be long, but like everything else I've written about, much of the technology is here today. However, unlike anything else I've covered, AI will have a much greater impact on computing and wider society. AI is going to change everything.
There is a revolution coming. I didn't say it's going to be nice.
Alan Turing, the father of computing.
The Python programming language.
A website just for Operating Systems ;-)
8 Gigabyte CompactFlash card.
15GHz! I did say all the action was at 8 bits...
Microprocessor Watch Vol 116
Program hardware with a variant of C.
Copyright (c) Nicholas Blachford February 2004
Disclaimer: This series is about the future and as such is nothing more than informed speculation on my part. I suggest future possibilities and actions which companies may take but this does not mean that they will take them or are even considering them.