Linked by Nicholas Blachford on Thu 26th Feb 2004 19:31 UTC
Editorial: Today's computers are the result of many decades of evolution of techniques and technologies. From this process new techniques and technologies have sprung forth, and some of these are only just starting their evolutionary journey. In time they will change the face of computing, but there's a road bump to get over first.
Thanks!
by bbrv on Thu 26th Feb 2004 19:43 UTC

Hi Nicholas, we enjoyed reading your articles. You write well and have a solid grasp of many of the moving parts, which makes your suggestions and thoughts worth considering.

Thanks!

R&B

terrible
by Omega on Thu 26th Feb 2004 19:48 UTC

Everything will be free, and you will have a machine as powerful as a Cray in every fridge. Dream on.

That series of articles is really terrible and disregards so many fundamentals (technical, business, law, etc.) that it's just laughable.

this series is just a huge uneducated rant
by blahblah on Thu 26th Feb 2004 19:59 UTC

>> Computers will become adaptive, optimising themselves for the task you are doing. Want to encode a movie to MPEG? It'll transform itself into a giant MPEG encoder then tear through the entire movie as fast as the storage system can deliver it.

general-purpose CPUs will be ultra powerful (thus negating most ASICs)

ASICs will evolve from general-purpose CPUs given programmable logic

so what is it? the only conclusion I can draw from this series is that in the future there will be stuff

convergence!
by kenny on Thu 26th Feb 2004 20:03 UTC
almost, but not quite
by PainKilleR on Thu 26th Feb 2004 20:13 UTC

Basic was specifically designed to be easy and became very popular as a result of being supplied with most early personal computers. Visual Basic did the same trick, and I think we will see the same happen again, even for large applications.

Visual Basic took what was already there, then added drag & drop/point & click GUI development and simple component development. That point would have fit much better with your eventual point in this section, but you seem to have missed it. Something else VB eventually proved to most developers is that complex applications need a language better suited to handling the complexity than VB. You can write large, complex applications in VB, but they quickly become a pain to maintain.

Another part of this trend has been the addition of ever more powerful libraries. You don't need to worry about the specifics of data structures anymore; you can just call up a library which manages it all for you. We now have software components which go to a higher level, giving you specific functionality for all manner of different operations. Add to this the proliferation of virtual machines, application servers and middleware, and you see the programmer doing less and less of the programming and more and more of joining up the dots.

Except, of course, that the programming to "join up the dots" is sometimes more complex than the programming required for the individual dots. Beyond that, someone's still going to be programming better dots. And if you don't know anything about data structures, how do you know which one you need for the job at hand?
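
To make that concrete, here's a rough Python sketch (purely illustrative, nothing from the article): the library hands you both containers for free, but knowing which one fits the job is still on you.

    # Membership tests: a list scans every element (O(n)),
    # a set hashes straight to the answer (O(1) on average).
    import timeit

    items_list = list(range(100000))
    items_set = set(items_list)

    # Same one-line "just call the library" operation, very different behaviour.
    list_time = timeit.timeit(lambda: 99999 in items_list, number=1000)
    set_time = timeit.timeit(lambda: 99999 in items_set, number=1000)

    print("list lookup: %.4fs  set lookup: %.4fs" % (list_time, set_time))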

The next stage is to drop the complexities of Java and C# and do it all in scripting languages like Python [2], which allow easier, faster and cheaper software development. Microsoft are already doing this with the .NET version of Visual Basic, and it is on its way for Java with projects like Jython [3].

How can VB.NET be an advancement beyond C# when the two operate in the same realm, with the same libraries and most of the same language features? The biggest difference between the two is the syntax, which makes C# much easier to use than VB for those familiar with C or C++, and consequently also makes C# lend itself somewhat better to more complex programming than VB. While I sometimes enjoy interactive and scripting languages like Python, I have to say that Python itself is probably a poor example of a language advancing easier/faster/cheaper programming: many developers have never dealt with enforced use of spaces and indentation, and these have actual meaning in Python.
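
For anyone who hasn't run into it, here's a minimal sketch of what that means, in plain standard Python: the indentation _is_ the block structure, so whitespace changes the meaning of the program rather than just its looks.

    def classify(n):
        # The body of the 'if' is delimited purely by indentation;
        # there are no braces to fall back on.
        if n % 2 == 0:
            label = "even"
        else:
            label = "odd"
        return label  # dedent one level: back at function scope

    print(classify(4))  # -> even
    print(classify(7))  # -> odd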

As application servers and middleware become more powerful, how long will it be before the vendors start shipping actual applications? All you will do then is customise the application to your own needs; you'll still need to understand logic, but there will probably be tools that help you do even that. Programming could become very different in the future and open to a lot more people, and the specialist skills that development requires will be needed less and less, at least for business development.

Some vendors will do this, and will probably help drive the industry to your envisioned goal, but overall most of them see more money in the services industry. People make very good money selling hard-to-understand services that have to be integrated into an environment, and then selling the integration services to go with it.

I think the long term trend for the software development industry is not looking good, but the trend for the home developer, the open sorcerer, is very different, quite the opposite in fact. I can see business development becoming so incredibly easy, and thus incredibly boring, that many developers will take to open source development simply for the challenge, so they can tackle the complex problems in the language of their choice.

I can only wish that business development would become incredibly easy. I could then spend far less time actually working and far more time coming up with new ideas, making myself more useful to the company. As it stands, if Microsoft's drive is even moderately successful, the trend is towards being able to use the language of your choice with any given application, server, or service. You don't need to learn VBA any more to talk to Office; you can do it in VB, C#, C++, or any number of other languages (even Perl). Most programmers spend more time learning libraries and interfaces than languages when they're working. The two weeks I spent learning VB when I started working for this company (and the subsequent years spent learning it) have been left in the dust with the release of C#, which lets me leverage the years I spent with C and C++, while still dropping into VB when a particular language element is giving me a hard time and can be done more easily in that language. Even better, I can fire up C++ and explicitly drop in and out of managed code to isolate the less portable sections, rather than dealing with MFC and other MS-specific libraries, especially now that VC++ is so much more compatible with ISO C++ than it was before.

Learning a language becomes an easy task after you have a solid understanding of the features of that language (from some previous language, perhaps). It's learning the new interfaces, the new libraries that come along with each new piece of hardware or software brought into your environment that takes up most of the time of many business programmers. Making software more flexible and more adaptable to changes in the environment is what developers have to spend their time on today, and as time goes by it will quickly become the difference between a working developer and one trying to find a new job.

just for the record
by frank on Thu 26th Feb 2004 20:32 UTC

"Software" development started with physically changing valves in the early 1940s

just for the record:

http://irb.cs.tu-berlin.de/~zuse/Konrad_Zuse/en/Rechner_Z1.html

Software development with higher-level languages started amazingly early in the history of computing:
http://irb.cs.tu-berlin.de/~zuse/Konrad_Zuse/plank.html

More at the Unesco Memory of the World Register http://www.unesco.org/webworld/mdm/2001/nominations_2001/germany/zu...

But the article series was interesting, sometimes astounding.

cheers,
frank

re: this series is just a huge uneducated rant
by PainKilleR

>> Computers will become adaptive, optimising themselves for the task you are doing. Want to encode a movie to MPEG? It'll transform itself into a giant MPEG encoder then tear through the entire movie as fast as the storage system can deliver it.

general-purpose CPUs will be ultra powerful (thus negating most ASICs)

ASICs will evolve from general-purpose CPUs given programmable logic

so what is it? the only conclusion I can draw from this series is that in the future there will be stuff


Yes, in the future, there will be stuff. The great thing is how easily this particular point is disproven: MPEG-2 decoders (and encoders) are already common on video cards as it is. Computers are evolving towards multiple specialized processors rather than general-purpose processors that are reprogrammable for particular tasks, not to mention that things like parallel processing and (obviously) multitasking are becoming more prevalent. Why would I want my CPU to become a huge MPEG encoder when I can put a TV decoder card in my computer with an MPEG encoder built in that can already handle the quality of current programming? MPEG isn't going to be something you do as quickly as your storage system can handle it; instead it will be raw video and audio, because storage is becoming cheaper every day, and anyone who wants the best possible quality will move to lossless compression first, then to uncompressed encoding (if lossless compression proves to deliver too little benefit for the processor cycles it takes up, or turns out to actually have some loss in it).

I'd much rather watch a movie in real time while it encodes it and sends it to my drive in the background (much like I do with music now) than have my entire computer reconfigure itself to the one task of encoding the movie. I think most people would agree with me on this.

But
by Buck on Thu 26th Feb 2004 21:06 UTC

The trend will be that fewer people care about technology and more about 'coolness', including UI coolness. Oh well, perhaps in the future computers won't have to do anything but look cool and do all the same stuff - text notes and web browsing.

great
by daniel c on Thu 26th Feb 2004 21:11 UTC

I think that this series is excellent. The author did a great job and he's very clever. All the ideas are clearly presented and the content is simply awesome.

well done, keep going!


daniel from buenos aires

arguments
by ozn on Thu 26th Feb 2004 21:18 UTC


PainKilleR has solid arguments, while the author does not.
People who don't know what they're talking about should shut up. My 2 cents ;P

re: re: this series is just a huge uneducated rant
by hmmm on Thu 26th Feb 2004 23:47 UTC

I'd much rather watch a movie in real time while it encodes it and sends it to my drive in the background (much like I do with music now) than have my entire computer reconfigure itself to the one task of encoding the movie. I think most people would agree with me on this.

I watch my other movies, streaming them over the net while my box encodes away. But I put my encoding in the background so it only uses the extra CPU cycles I'm not using.

It might be nice to watch a movie while it is encoding, but I'd rather set up a batch process to encode all my DVDs as soon as I rip them and use my cluster of systems to handle the CPU requirements.

Linux and DVD::Rip do this nicely.
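
For the curious, the batch part only needs a few lines. This is just a rough Python sketch; the encoder command, its -o flag and the directory paths are placeholders of mine, not DVD::Rip's actual interface:

    # Encode every ripped title in a queue directory, at the lowest
    # priority so it only soaks up CPU cycles nothing else wants.
    import subprocess
    from pathlib import Path

    RIP_DIR = Path("/data/rips")        # placeholder locations
    OUT_DIR = Path("/data/encoded")

    for rip in sorted(RIP_DIR.glob("*.vob")):
        out = OUT_DIR / (rip.stem + ".avi")
        if out.exists():
            continue                    # already encoded in an earlier run
        # "my-encoder" stands in for whatever encoder you actually use.
        cmd = ["nice", "-n", "19", "my-encoder", str(rip), "-o", str(out)]
        print("encoding", rip.name)
        subprocess.run(cmd, check=True)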

give me an F give me a P give me a ...
by aliensoldier on Fri 27th Feb 2004 00:07 UTC

FPGAs - I still don't understand why these haven't caught on faster. Xilinx should build a consumer PC based around one.

I always thought the only way to have the Amiga back for real would be to make it 100% FPGA-based. Imagine an Amiga without the limitations of custom chips but with all their advantages.

If the PCI bus were a lot faster, a PCI FPGA card would also make for a nice BeBox with a hardware media kit and translators.

Where's part 4? 3? 2?
by woo on Fri 27th Feb 2004 00:18 UTC

Hey, Nicholas, the article is terrific indeed, but it's "Part 5" according to the title. I guess I've missed the other 4 parts ;) Where can I grab them?
I'm pretty eager to point the users of my community to this article, but I guess it wouldn't be very wise to start from part 5.

RE: Where's part 4? 3? 2?
by Eugenia on Fri 27th Feb 2004 00:21 UTC

The series of articles is marked as "editorials". Simply navigate to our "editorials" section from our menu and grab the other 4 articles.

FPGA? I don't think so.
by Steve on Fri 27th Feb 2004 00:24 UTC

More likely we will go to quantum-based computers. Though they are still in their infancy, I do think that they will eclipse regular or 'classical' computers in time.

Yes, there has yet to be a functional general-purpose CPU built using quantum technology, but the possibility of being able to double the processing power of a computer by adding only a single atom to the processor is too much to shove aside.
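
(The reason a single extra atom can, in principle, double it: an n-qubit register spans a state space of 2^n dimensions, so adding one more qubit gives 2^(n+1) = 2 x 2^n, twice the room to compute in. Whether that raw doubling translates into usable processing power is, of course, another question.)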

Not that I expect them to come out anytime soon.

Re: various
by Nicholas Blachford on Fri 27th Feb 2004 00:34 UTC

bbrv wrote:
Thanks

You're very welcome :-)


Everything will be free, and you will have a machine as powerful as a Cray in every fridge.

The Cray 1 did 66 MegaFlops; it's difficult to find something that _slow_ these days. Hint: Cray's advantage was not just high clock speed.

so what is it? the only conclusion I can draw from this series is that in the future there will be stuff

A mixture: some tasks are better done on a general-purpose CPU, some on an FPGA, and some on an ASIC (ASICs are lower power).

PainKilleR
Good points, thank you.
but...

Why would I want my CPU to become a huge MPEG encoder when I can put a TV decoder card in my computer with an MPEG encoder built
The point about the entire computer turning into an encoder is just an example; it could be only partly an encoder.

What if you now decide TV and MPEG suck and switch to HDTV and H.264? Your card gets replaced, while my FPGA just gets reconfigured...

Oh well, perhaps in the future computer won't have to do anything but just look cool and do all the same stuff - text notes and web browsing.

Many industries have been getting away with this for years!

because only in my wildest dreams i can imagine all that.
If those are your wildest dreams I'd hate to see your boring ones...
Pretty much everything I described there exists TODAY. It's just not on your desktop yet.

Where's part 4? 3? 2?
...or search on Blachford

More likely we will go to quantum-based computers
I'm sceptical of these: if you could double your processing power with an atom, how do you know it's given you the correct result?

Quantum-effect transistors, on the other hand, are a different matter, and I can see them appearing.

Others:
Thanks

RE: Where's part 4? 3? 2?
by Eugenia on Fri 27th Feb 2004 00:36 UTC

>>Where's part 4? 3? 2?
>...or search on Blachford

Or, alternatively:
http://www.osnews.com/topic.php?icon=5
or even more alternatively:
http://www.osnews.com/search.php?search=blachford
:)

RE: RE: Where's part 4? 3? 2?
by woo on Fri 27th Feb 2004 01:07 UTC

Thanx Eugenia & Nicholas Blachford. That was quite helpful for me. Wonderful articles, I should note again!

quantum
by Steve on Fri 27th Feb 2004 01:07 UTC

I'm sceptical of these: if you could double your processing power with an atom, how do you know it's given you the correct result?
I'm sure that transistors that use quantum effects will happen, because they are already at that scale. ;)

But creating a processor that works only by quantum effects may be the only way to go forward eventually. Physics limits the ability of 'normal' transistors because of the rules that they work by. Current processes work by ignoring or working against quantum forces, which are undeniable at the feature sizes processors are at today.

It's hard to compare, but it's like going from classical physics to quantum physics. Nothing makes sense until you relearn everything you thought you knew.

PC Magazine
by Chris on Fri 27th Feb 2004 01:07 UTC

Yes, I feel like I am reading PC Magazine. This was terrible. All software is not free 20 years after its creation. Companies aren't all going to be open; they're moving the other way, towards continuing license fees!
An 8-bit processor is by no means a supercomputer. Do you know how large a number 8 bits can represent?
This is just mind-boggling. Aaaaah, it hurts to read it.

Oh, and programming goes back at least to textile machines in the 19th century.

Some points
by Treza on Fri 27th Feb 2004 12:33 UTC


- As an FPGA developer (in VHDL & Verilog), I can say that FPGAs will never be as fast as custom chips for a given application. For example, on equivalent process technologies, a general-purpose CPU built in an FPGA will always be slower than a special-purpose chip, as the programmable interconnections take up space on the chip and add time delays.
- To date, FPGAs are preferred for low quantities, as the initial cost of an ASIC is very high. The high volume of FPGA chip production makes advanced processes more affordable than an equivalent-performance ASIC made on a less advanced process (for example, comparing a 90nm FPGA with a 150nm ASIC).

- Most FPGAs are reconfigurable on the fly. This can be used for advanced signal processing, for example (like a programmable modem or video codecs), but I don't think it will really be usable for general-purpose processing, as the reconfiguration of a processor is an awfully complex task. What could be done is a configurable hardware "emulator": with the same hardware, switching between an Amiga, an Atari, an Apple ][, a Commodore 64, ... the goal would not be absolute maximum number-crunching performance.

- The trend among FPGA manufacturers today for high-performance CPUs is to integrate a general-purpose CPU into the FPGA fabric (PowerPC for Xilinx, ARM for Altera) rather than trying to build a full-fledged CPU directly in programmable logic. It can nevertheless be done for legacy hardware (say an 8051 or a 6502 in the chip), for tasks that aren't very horsepower-intensive, or for very specific processing (you could make a processor with 13-bit words counting in Gray code, with compressed data transmission and base-7 floating point ...).
- For a given FPGA, a CPU made in the programmable fabric will take more area and will be slower (so more expensive) than a fixed, hard-wired CPU.

- The evolution of semiconductor technologies will slow down significantly when the skyrocketing cost of fabs, the unmanageable rise in power dissipation and the shrinking initial quantities of ASIC designs intersect.


[ Anyone for correcting my English language mistakes ? ]

re: re: various
by PainKilleR on Fri 27th Feb 2004 14:17 UTC

* Why would I want my CPU to become a huge MPEG encoder
* when I can put a TV decoder card in my computer with an
* MPEG encoder built

The point about the entire computer turning into an encoder is just an example; it could be only partly an encoder.

What if you now decide TV and MPEG suck and switch to HDTV and H.264? Your card gets replaced, while my FPGA just gets reconfigured...


Fortunately for me, current TV decoder cards can handle HDTV resolutions and MPEG-4 (with H.264). Even with a reprogrammable FPGA (and I have to say referring to reprogrammable chips as FPGAs can be confusing, since many of the Intel Pentium line of CPUs come in PGA sockets), you either have to be well versed in a programming language that can be used with the chip, or someone else has to be, and has to release the code for what you want to do.

As time has gone by, we've moved further away from generic chipsets, though complex specialized chips have become more programmable for different tasks. The problem there, though, is that the range of the programmability of specialized chips has so far been limited to particular areas, such as programmable shaders on graphics chips and programmable logic and I/O chips that handle low-level, high-throughput applications with very specific routines.

Perhaps in this case, though, I'm speaking more specifically in my own areas of knowledge and have missed something blatantly obvious. For instance, I'm only really aware of what current decoder cards can handle because I've been looking at them for the purpose of building a computer for my TV (rather than buying something like a TiVo that will be too specialized to really fulfill my needs), and most of my run-ins with programmable chips have come in my line of work, which relies heavily on replacing older specialized I/O solutions with PC-based solutions (though we still usually have to rely on PCI (and up until very recently ISA) I/O boards simply because of the number of I/O lines being handled and the general lack of other PC-based solutions to the problem).

@ Nicholas Blachford
by dpi on Sun 29th Feb 2004 00:18 UTC

Personally, I have respect for people like you who try to speculate on the future of computing. It is _extremely_ difficult, especially - and more and more so - the further out you look. Having read all of your previous articles, I'd like to urge you to re-evaluate your analysis before you post it - for example, by letting someone else read the article before you post it (my own favourite technique when I post an article, partly because of possible grammar mistakes as well).

When I read your article, I stumbled on the following proposition:

"All software will be free (as in do what you want)"

If I understand this correctly, you claim both free as in speech and free as in beer. How exactly do you think this will be realized? Under which economic model, if any at all? Who will pay the programmers? How will the programmers be "employed"? For example, do you think programmers will employ themselves in a sort of grassroots company system, or do you think this will become something like a gift economy, or [...]? What would be the trend, and why?

"Patents do not last forever, Everything patented today will be freely available in twenty years time."

Ok, but how can you be so sure about this? Copyright terms have been extended on the run, right before they were due to expire; the very same _could_ become true for patents. So I'm wondering how you can be so sure about this assertion. Without any further analysis of whether, for example, my assertion regarding non-expiration will turn out true or false, in my opinion you are not in a position to state what I quoted above.

"As software advances all the techniques being invented will eventually become free and open for use by everyone."

What do you define as "technique"? Source code, because it explains _exactly_ how a program works, falls under the definition of "technique" from my point of view (imo it's an art first of all); i don't see how all source will become open. If you agree on the assertion that source code is a technique, i'm looking forward to your analysis on why all source code will be "open" or "available" for everybody between now and "eventually".

"The difference then between open and closed source will be one of functionality rather than one of technique."

Right. Here I can conclude that, because you say the difference between open source and closed source won't be one of technique, you don't see "source code" as part of the definition of "technique". We disagree on that, then. Perhaps we can discuss why you don't see source code as a technique while I do. More interesting, however, is this: when all techniques are open, isn't the functionality of open and closed source about the same too?

You see? I think this is a chicken-and-egg situation.

Finally, what I find extremely unfortunate is that you do not mention Trusted Computing in this regard. You do mention patents but do not state how to solve that problem, and you ignore the Trusted Computing threat. Both of these are seen as dangers to the FLOSS world by FLOSS advocates, civil rights groups, et al., and you _ignore_ these points. Lately I read a few articles and saw a video regarding Eben Moglen, and he addresses both and points out how important they are. A third threat I see is the development of quantum computing versus cryptography. I fear that at some point people will think they're safe with crypto and rely on it, while secret services and governments know how to easily "crack" it using quantum computing which has by then been developed in secret, while the general public doesn't yet know it has been developed at all.

That said, and excuse my wording if you feel offended, IMO your vision is (regarding _this_ very point, which I find intriguing) more a dream than a realistic analysis of the future, though I mostly welcome any further analysis on this subject.

Re: dpi
by Nicholas Blachford on Sun 29th Feb 2004 01:31 UTC

Personally, I have respect for people like you who try to speculate on the future of computing. It is _extremely_ difficult, especially - and more and more so - the further out you look.

Thank you, just wait for part 6 :-D

For example, by letting someone else read the article before you post it
Not a bad idea, but I don't really have anyone to send them to. I am very careful about checking them, though. I do sometimes get complaints about my grammar, but that's a British vs American thing.

"All software will be free (as in do what you want)"

If I understand this correctly, you claim both free as in speech and free as in beer.


That's in reference to the patents; I don't mean all companies will suddenly open their code, but all the techniques that were patented previously will be free to use.

By "technique" I mean algorithm, not source code. It is possible to implement an algorithm in different languages in which case the source is completely different.

Who will pay the programmers?
I'm not saying it's good or bad, just that it will happen.
Open Source gives programmers "freedom" while at the same time removing their bread and butter.

I for one disagree that all source should be free; the idea that you can make money from services is nonsense, as it only works for big, complex programs. Individual programmers will find this very difficult, especially if they don't like doing support. You raise a very good point, but at the moment there is no answer to it.

"Patents do not last forever, Everything patented today will be freely available in twenty years time."

Ok, but how can you be so sure about this?


I can't, but current law means patents expire after 20 years. I think that is a ludicrously long time for a rapidly evolving industry. It could be extended but I think that's a seriously bad idea.

Finally, what I find extremely unfortunate is that you do not mention Trusted Computing in this regard.

Could you explain your point further? I'm not sure what you mean.

For Government breaking codes, sure...
I'm not convinced about quantum computing, but the government have big systems, that's for sure. Did you know the NSA have their own fab producing Alphas? (Well, that's the rumour.)

The British Government sold many countries German code machines after the Second World War; they didn't, of course, tell these countries that they'd long since figured out how to crack them...

P.S. Comments will be closed here at some point; feel free to send me an e-mail if you wish to continue.

What's the bump already? It's unclear. Now, if you wanted to say that it will be nice to personally track CMM and value at home for the languages and capabilities around, that might be one thing. If you think hardware innovation has to hit a wall because gate count can't double easily, that's unfounded. Ditto with pricing; just because India likes it doesn't mean the fun's over! (Well, until we find 20M cranky Indians to agree.)

If free firmware proliferates and machines fill each other up with junk, I can see a clear need to buy a new one from that extension of software. If the one unit caching all the RC5 and RSA3 keys for all the others fails, you go out and get the Agent Smith chip.

Once it is all working again, you buy solar cells that also cool the chips to solve the serious problems with overheating in 3D chips (water in the vias or nay) and quite probably treat human waste (another problem set to roost at the same time.)