The Future of Computing Part 5: Evolution and the Bump

Today’s computers are the result of many decades of evolution of techniques and technologies. From this process new techniques and technologies have sprung forth, and some of these are only just starting their evolutionary journey. In time they will change the face of computing, but there’s a road bump to get over first.

The Evolution of Software

Computing started as theory [1], then became reality with hardware in the first half of the 20th century. “Software” development started with physically changing valves in the early 1940s; then came machine code, then assembly, and then we moved on to higher-level languages, making software development easier. BASIC was specifically designed to be easy and became very popular as a result, being supplied with most early personal computers. Visual Basic did the same trick, and I think we will see the same happen again, even for large applications.

Another part of this trend has been the addition of ever more powerful libraries. You don’t need to worry about the specifics of data structures any more; you can just call up a library which manages it all for you. We now have software components which go a level higher still, giving you specific functionality for all manner of different operations. Add to this the proliferation of virtual machines, application servers and middleware, and you see the programmer is doing less and less of the programming and more and more of joining up the dots.
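As a small illustration (a Python sketch, since Python comes up later in this article): a priority queue used without ever writing the heap underneath it. The standard library manages the data structure; the programmer just joins up the dots.

```python
import heapq

# The library maintains the array-based binary heap for us;
# we never touch the data structure's internals.
tasks = []
heapq.heappush(tasks, (2, "compile"))
heapq.heappush(tasks, (1, "fetch sources"))
heapq.heappush(tasks, (3, "run tests"))

# Pop items in priority order (smallest priority number first).
order = [heapq.heappop(tasks)[1] for _ in range(len(tasks))]
# order == ["fetch sources", "compile", "run tests"]
```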

The next stage is to drop the complexities of Java and C# and do it all in scripting languages like Python [2], which allow easier, faster and cheaper software development. Microsoft are already doing this with the .NET version of Visual Basic, and it is on its way for Java with projects like Jython [3].
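The brevity argument is easy to see in miniature: a word-frequency count, the sort of task that ran to dozens of lines in the Java of the day, is a couple of lines of Python using nothing beyond the standard library.

```python
from collections import Counter

# Count word frequencies in a piece of text in two lines.
text = "the quick brown fox jumps over the lazy dog the end"
counts = Counter(text.split())
# counts["the"] == 3
```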

As application servers and middleware become more powerful, how long will it be before the vendors start shipping actual applications? All you will do then is customise the application to your own needs; you’ll still need to understand logic, but there will probably be tools to help you with that too. Programming could become very different in the future and open to a lot more people; the specialist skills that development requires will be needed less and less, at least for business development.

I think the long-term trend for the software development industry is not looking good, but the trend for the home developer, the open sorcerer, is very different, quite the opposite in fact. I can see business development becoming so incredibly easy, and thus so incredibly boring, that many developers will take to open source development simply for the challenge, so they can tackle complex problems in the language of their choice.

All software will be free (as in do what you want)

Patents do not last forever; everything patented today will be freely available in twenty years’ time. As software advances, all the techniques being invented will eventually become free and open for use by everyone. The difference then between open and closed source will be one of functionality rather than one of technique.

As open source software advances it will keep catching up with the proprietary vendors; there will come a time when you’ll not be able to tell the difference, at least in terms of functionality. The differences of integration and consistency will remain, but as more companies become involved in open source development, the needs of users will be fed into the development process, and open source products will become as integrated and consistent as closed source products are today.

Linux will continue to grow, but as it becomes more business-like I can see the potential for the more adventurous developers moving on to other platforms simply for the challenge. Arguably this is already happening, and you don’t need to look far [4] to see that these days there is a proliferation of different operating system projects.

All hardware will be free

The same applies to hardware patents; these too will become free for everyone to use.
I don’t see everyone making their own multi-million-transistor CPUs in their bedrooms any time soon, but with the increasing availability of open source tools and larger, faster FPGAs*, creating a CPU in your bedroom will become easier.

*An FPGA (Field Programmable Gate Array) is a “blank” chip you can wire up in software to do pretty much anything.
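To make the footnote concrete, here is a toy Python model of what “wiring up in software” means: each logic cell in an FPGA is essentially a small lookup table, and the configuration bitstream is just its truth-table bits. This is a deliberately simplified sketch, not how real FPGA toolchains work.

```python
# A toy model of one FPGA logic cell: a 2-input lookup table (LUT).
# "Configuring" the chip means choosing the truth-table bits.

def make_lut(truth_table):
    """truth_table[i] is the output for inputs (a, b), where i = a*2 + b."""
    def lut(a, b):
        return truth_table[a * 2 + b]
    return lut

xor_gate = make_lut([0, 1, 1, 0])   # configure the cell as XOR
and_gate = make_lut([0, 0, 0, 1])   # same kind of cell, reconfigured as AND

# A half adder built from two configured cells.
def half_adder(a, b):
    return xor_gate(a, b), and_gate(a, b)   # (sum, carry)
```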

One day you may even be able to write a new CPU specification and software will automatically create the CPU design for you, then program it onto an FPGA for you to use. Of course, it will also create a compiler so you can program and test your design. As with many future developments, this is already in development (or at least being considered).
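A hypothetical taste of that idea in Python: describe an instruction set as plain data, and a simulator falls out automatically. Every name here is invented for illustration; real tools would generate hardware, not an interpreter.

```python
# A toy CPU "specification": each mnemonic maps to its behaviour.
SPEC = {
    "LOAD": lambda regs, r, v: regs.__setitem__(r, v),
    "ADD":  lambda regs, r, s: regs.__setitem__(r, regs[r] + regs[s]),
    "HALT": None,
}

def run(program):
    """A simulator generated directly from the specification table."""
    regs = {}
    for instr in program:
        op, *args = instr
        if SPEC[op] is None:        # HALT
            break
        SPEC[op](regs, *args)
    return regs

result = run([("LOAD", "r0", 2),
              ("LOAD", "r1", 3),
              ("ADD", "r0", "r1"),
              ("HALT",)])
# result["r0"] == 5
```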

A (very) brief history of computer advancement

Much of the history of personal computers has been radical, advanced new technologies appearing (Apple I / II, Macintosh, Amiga, NeXT, BeOS) and then being copied by the competitors, especially the now ubiquitous Wintel PC. The pace of development has slowed in the last decade; we see constant advancement and innovation of sorts, but this is the incremental drip-feed of evolution. Who in this day and age is making computers as radical, as advanced, as those platforms?

The Engine

Just as cars have engines, so does the technology industry, but it is not the software developers or hardware designers; it is the semiconductor process engineers who have driven this industry for the past four decades. Gordon Moore noticed their progress back in the 1960s in what became known as “Moore’s law”, and they’ve been going solidly ever since. Perhaps I should call them “magicians” – how many other fields of human endeavour have seen such advancement in such a short time? They get little credit for it, but without them none of the other advances would have taken place; none of the above platforms or their technologies would even exist.

If it wasn’t for them we would not have the ever faster computers we are used to; we would not have the continuing advances in technology. Most of the performance of modern CPUs is not the result of advanced architectures; it is the result of ever finer process geometries: transistors get smaller, their threshold voltages are lowered, and as a result their switching speed goes up. If it wasn’t for this continuing advancement we’d still be using CPUs whose clocks were measured in kHz, not MHz or GHz.
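The shrink-to-speed relationship can be sketched with the classic scaling rules (often called Dennard scaling): shrink every dimension to about 70% per process generation and gate delay shrinks in proportion, so the achievable clock frequency climbs. The 0.7 factor below is the traditional rule of thumb, and in reality deeper pipelines contributed to clock speeds as well; this is a back-of-envelope sketch, not a precise history.

```python
# Classic scaling rule of thumb: each process generation shrinks
# feature sizes to ~70%, gate delay shrinks in proportion, and so
# the achievable clock frequency rises by 1/0.7 per generation.
def frequency_after_generations(f0_hz, k=0.7, generations=1):
    return f0_hz / (k ** generations)

# Twenty generations of 0.7x shrinks turn a 1 MHz clock
# into roughly a 1.25 GHz one.
f = frequency_after_generations(1e6, generations=20)
```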

The same progress has led to ever higher capacity and ever lower cost memories. I have an Amiga A1000 on this desk; it came as standard with 256 Kilobytes of RAM. Beside it is a digital camera which can take a memory card with a capacity 32,000 times higher [5].
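The “32,000 times” figure checks out; the card in question [5] is 8 Gigabytes:

```python
a1000_ram_bytes = 256 * 1024      # Amiga A1000: 256 KB as standard
cf_card_bytes = 8 * 1024 ** 3     # 8 GB CompactFlash card
ratio = cf_card_bytes // a1000_ram_bytes
# ratio is 32768, i.e. the "32,000 times" quoted above
```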

Because of the ever more powerful semiconductor technologies we can have ever more advanced CPUs and ever higher memory capacities. Because of these, ever more sophisticated software can be written: faster, friendlier and more capable with every generation. Unfortunately, however, there is a problem: one day, sooner or later, the engine is going to stop.

The Engine Stops

One day you are going to be able to walk into a computer shop two years running to buy the fastest CPU, and both times you’ll get the same CPU: Intel, AMD and IBM will not have produced a faster one. That may be inconceivable today, but it will happen. It may be 20 years yet, maybe even longer, but it will happen.

It’s been predicted for many years, but time and again the process engineers have found ways around the road blocks and kept going. Eventually even they are going to hit immutable physical limits. Eventually we will all find out that Moore’s law is not a law after all.

Of course they will keep trying. Perhaps we will see the number of layers increased so chips can be built upwards. This is already done to a degree today, but could you double the number of transistors by building upwards? I bet they’ll try.

They’ll try different chemicals and techniques as well, no doubt. Even after Moore’s law stops, chips will keep advancing. The Cray 3 [6] used gallium arsenide instead of silicon for its transistors; perhaps they’ll try that and we’ll see power budgets going completely bananas – the Cray 3 used 90,000 Watts!

On the other hand, other technologies sitting on the sidelines today could come to the fore. It’s astonishing what’s out there even now: using superconductor technology, an 8-bit CPU operating at 15.2GHz has recently been developed [7]. What’s more, despite its stratospheric clock speed it uses just 0.0016 Watts: 18,000 times less power than Intel’s low-power Pentium M, or a colossal 56,000,000 times less power than the Cray 3.
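A quick sanity check shows the quoted figures are consistent with one another:

```python
superconductor_cpu_watts = 0.0016                           # 8-bit, 15.2 GHz part [7]
pentium_m_watts = superconductor_cpu_watts * 18_000         # ~28.8 W
cray3_watts = superconductor_cpu_watts * 56_000_000         # ~89,600 W
# cray3_watts lands on roughly the 90,000 Watts quoted above for the Cray 3
```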

One thing that could make a very big difference is finding a way to cut the price of manufacturing chips. An unfortunate side effect of Moore’s law is the ever increasing cost of manufacturing plants (known as fabs); these costs may kill off progress long before the physical limits are ever hit.

The end could come as a sudden halt brought on by simple physical impossibility or, more likely, the engine will stutter out. Either way the engine will stop, and from that day on computing will be in a different world.

Is this the end, or just the beginning?

Ironically, the end of Moore’s law could actually be a good thing: all this advancement has allowed software developers to get lazy. The art of efficient programming has all but died; abstractions and easy programming techniques rule these days. Did you know it was possible to have a GUI on an Apple II? Did you know it is possible to do real-time 3D on a Commodore 64? When programmers are restricted they can solve seemingly impossible problems, and when Moore’s law runs out they are going to hit exactly these sorts of problems.

Modern desktop computers are extraordinarily inefficient. We could be in for a programming renaissance as developers find ways to fight the bloat and bring the computing power already present in modern systems to the surface.

But this is evolution, not revolution. A revolution will come, and it will come when a number of powerful technologies are combined to create something that today we don’t know as a computer: something new but strangely familiar.

Software becomes hardware

There are other ways of improving performance than better programming; moving software functions into hardware is one of them. A future CPU could contain a set of general purpose cores, special purpose cores and a big FPGA.

FPGAs are probably the future of hardware computing: general purpose hardware is too slow, while special purpose hardware is inflexible and takes up room. FPGAs cut between these lines; they allow very high speed computation but have the advantage of being reprogrammable.

We don’t know when FPGA hardware will replace high speed CPUs (it’s been predicted for years), but when the engine stops we will find the advances it has provided us with are quite enough to be getting on with. The rapid progress made in the last 40 years will allow the next stage of the computer revolution to begin.

An FPGA-based computer will be nothing like anything on your desktop today. There are no registers or pipelines as in normal CPUs; in fact, there’s not much more than a big bunch of gates, and you have to figure out how to get them to do your work. Sure, there are tools available, but these are really for electronic engineers, not programmers. If you like a challenge…

FPGAs make our computers more powerful than ever, and a lot more difficult to program. Just how exactly does a (insert language of choice) programmer write software for an FPGA? Luckily there are more familiar languages that can be used, such as Stream-C [8], so you don’t have to go learning hardware description languages such as Verilog or VHDL.

The technology for programming FPGAs will move into the mainstream. Rather than having software libraries to do everything, you will have hardware libraries: when a specific problem is encountered, the CPU will program the relevant library into hardware and use it. This will boost performance into the stratosphere.
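What might a hardware library look like to the software above it? A hypothetical Python sketch, with the FPGA bitstreams replaced by stand-in functions and every name invented for illustration:

```python
# Hypothetical "hardware library": the runtime keeps a table of
# precompiled configurations and loads one into the FPGA region
# when a matching workload appears. The entries here are stand-in
# Python functions rather than real bitstreams.
HARDWARE_LIBRARY = {
    "scale":    lambda data: [x * 2 for x in data],
    "checksum": lambda data: sum(data) % 256,
}

class AdaptiveCPU:
    def __init__(self):
        self.loaded = None          # which configuration is in the FPGA

    def accelerate(self, task, data):
        if self.loaded != task:     # reconfigure only when the task changes
            self.loaded = task
        return HARDWARE_LIBRARY[task](data)

cpu = AdaptiveCPU()
out = cpu.accelerate("checksum", [10, 20, 30])
# out == 60
```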

Computers will become adaptive, optimising themselves for the task you are doing. Want to encode a movie to MPEG? The computer will transform itself into a giant MPEG encoder, then tear through the entire movie as fast as the storage system can deliver it. When FPGA-based computing becomes widespread, who will need fast general purpose processors any more?

However, learning to program FPGAs isn’t something we are going to need to worry about. We are not going to be doing the programming; someone, or rather something, else is.

Artificial Intelligence is a long-held goal of scientists and science fiction authors alike; it’s got its ticket and has boarded the bus. The road may be long, but like everything else I’ve written about, much of the technology is here today. However, unlike anything else I’ve covered, AI will have a much greater impact on computing and wider society. AI is going to change everything.

There is a revolution coming. I didn’t say it’s going to be nice.



[1] Alan Turing, The father of computing.

[2] The Python programming language.

[3] Write Java in Python.

Beginnings of a project to do the same with Perl.


[4] A website just for Operating Systems 😉

[5] 8 Gigabyte CompactFlash card.

[6] Cray 3

[7] 15GHz! I did say all the action was at 8 bits…

Microprocessor Watch Vol 116

[8] Program hardware with a variant of C.

Copyright (c) Nicholas Blachford February 2004

Disclaimer: This series is about the future and as such is nothing more than informed speculation on my part. I suggest future possibilities and actions which companies may take but this does not mean that they will take them or are even considering them.

