Linked by Thom Holwerda on Tue 9th May 2006 21:25 UTC, submitted by luzr
Torvalds has indeed chimed in on the micro vs. monolithic kernel debate. Going all 1992, he says: "The whole 'microkernels are simpler' argument is just bull, and it is clearly shown to be bull by the fact that whenever you compare the speed of development of a microkernel and a traditional kernel, the traditional kernel wins. The whole argument that microkernels are somehow 'more secure' or 'more stable' is also total crap. The fact that each individual piece is simple and secure does not make the aggregate either simple or secure. And the argument that you can 'just reload' a failed service and not take the whole system down is equally flawed." My take: while I am not qualified to reply to Linus, there is one thing I want to say: just because something is difficult to program does not make it the worse design.
Thread beginning with comment 122970
axilmar
Member since:
2006-03-20

You may mock signal-based programming, but why don't you try writing a few examples yourself in an imaginary language and see the benefits? Application writing becomes much easier with signal-based programming.

I admit the guys at rebelscience.org are a little over the top with the rest of their stuff, but I cannot say that they haven't hit the nail on the head with the signal-based programming model.
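To make the claim concrete, here is a minimal sketch of what a signal-based component might look like. Everything here (the `Cell` class, its methods) is invented for illustration; it is not COSA or any real system, just the idea of components that sit idle until a signal arrives and then emit signals to whoever is connected downstream.

```python
# Hypothetical sketch of the signal-based model: a component does
# nothing until a signal arrives, then fires its result to listeners.
class Cell:
    def __init__(self, fn):
        self.fn = fn            # reaction to an incoming signal
        self.listeners = []     # downstream cells

    def connect(self, other):
        self.listeners.append(other)

    def fire(self, value):
        result = self.fn(value)
        for target in self.listeners:
            target.fire(result)

log = []
double = Cell(lambda v: v * 2)
report = Cell(lambda v: log.append(v) or v)
double.connect(report)
double.fire(21)   # signal propagates: 21 is doubled, then logged as 42
```

The point of the style is that control flow is implicit in the wiring: nothing "calls" `report`; it simply reacts when a signal reaches it.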

Reply Parent Score: 1

nii_ Member since:
2005-07-11

Where can I find more on this technique?

Why an imaginary language? Is it natural and easy to do in standard languages like C, or even object-oriented languages like C++, Python, or Ruby, without having to follow difficult rules?

Is there a language out there that does this naturally if it is so greatly advantageous to program that way?

Reply Parent Score: 1

Mapou Member since:
2006-05-09

Where can I find more on this technique?

There are several existing synchronous reactive programming languages out there, such as Esterel and Signal. However, they do not go far enough; that is, they do not go down to the individual instruction level as the COSA model does. COSA also introduces several innovations not found elsewhere, such as the automatic resolution of data and event dependencies.
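The distinguishing feature of the synchronous reactive languages mentioned here is lockstep semantics: every cell computes its next value from the *current* values of its inputs, and all cells commit at once. The sketch below is an illustration of that idea only, with invented names; it is not how Esterel, Signal, or COSA actually work internally.

```python
# Toy lockstep ("synchronous") update: compute every cell's next value
# from the old state, then swap the whole state atomically per tick.
def tick(state, rules):
    return {name: rule(state) for name, rule in rules.items()}

rules = {
    "a": lambda s: s["b"] + 1,   # reads the *old* value of b
    "b": lambda s: s["a"] + 1,   # reads the *old* value of a
}
state = {"a": 0, "b": 0}
for _ in range(3):
    state = tick(state, rules)
# both counters advance together, one step per tick
```

Because each tick reads only the previous state, the mutual dependency between `a` and `b` is unproblematic; in a sequential program the same two updates would have to be ordered by hand.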

The only problem is that current CPUs are optimized for the algorithmic model, and a true signal-based environment like COSA would suffer a performance hit. My hope is that this will change soon, because this is the future of computing. There is no escaping it. We've been doing it wrong from the beginning, but it's never too late to change.

Reply Parent Score: 2

Cloudy Member since:
2006-02-15

I admit the guys at rebelscience.org are a little over the top with the rest of their stuff, but I cannot say that they haven't hit the nail on the head with the signal-based programming model.

I can. They've made the same mistake Brad Cox made when he invented Objective-C. Software is not like hardware, for a lot of reasons.

But here's a typical comment from the rebelscience site:

Unfortunately for the world, it did not occur to early computer scientists that a program is, at its core, a tightly integrated collection of communicating entities interacting with each other and with their environment. As a result, the computer industry had no choice but to embrace a method of software construction that sees the computer simply as a tool for the execution of instruction sequences.

I suggest the author read Minsky's early work. "Signal-based" programming is the Nth in an infinite series of attempts to model the asynchronous processes that make up software as if they were synchronous. It is doomed to the same failure as all such attempts.

The closest anyone has gotten to a workable synchronous model was Tony Hoare, with Communicating Sequential Processes (CSP).
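For readers unfamiliar with CSP: its processes share nothing and interact only by communicating over channels. A rough flavour of that can be sketched with threads and a blocking queue standing in for a channel; note this is an analogy only, since CSP proper is built on synchronous rendezvous and choice, which a buffered queue does not capture.

```python
# CSP-flavoured sketch: two independent processes communicating only
# through a channel (here, a blocking queue), never via shared state.
import queue
import threading

def producer(ch):
    for i in range(3):
        ch.put(i)        # send on the channel
    ch.put(None)         # sentinel marking end of stream

def consumer(ch, out):
    while True:
        msg = ch.get()   # receive blocks until a message arrives
        if msg is None:
            break
        out.append(msg * 10)

ch = queue.Queue()
out = []
t1 = threading.Thread(target=producer, args=(ch,))
t2 = threading.Thread(target=consumer, args=(ch, out))
t1.start(); t2.start()
t1.join(); t2.join()
```

The discipline CSP imposes is the interesting part: because all interaction is at the channel, each process can be reasoned about sequentially in isolation.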

Reply Parent Score: 2

Mapou Member since:
2006-05-09

I suggest the author read Minsky's early work. "Signal-based" programming is the Nth in an infinite series of attempts to model the asynchronous processes that make up software as if they were synchronous. It is doomed to the same failure as all such attempts.

This makes no sense, IMO. Synchronous, signal-based applications abound: spiking (pulsed) neural networks, spreadsheet cells, and so on are everywhere, and they work perfectly.
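The spreadsheet example from the comment can be sketched minimally: a cell holds either a value or a formula over other cells, and reading a cell recomputes it from its current dependencies. All names below are invented; this pull-based toy omits the change-propagation and caching a real spreadsheet engine does.

```python
# Toy spreadsheet: each cell is a thunk; formulas read other cells,
# so a change to an input is visible the next time a dependent is read.
sheet = {}

def set_value(name, value):
    sheet[name] = lambda: value

def set_formula(name, fn):
    sheet[name] = fn

def get(name):
    return sheet[name]()

set_value("A1", 2)
set_value("A2", 3)
set_formula("A3", lambda: get("A1") + get("A2"))
first = get("A3")        # 2 + 3
set_value("A1", 10)      # changing an input affects the dependent cell
second = get("A3")       # 10 + 3
```

This is the dependency-driven behaviour the comment appeals to: the user declares relationships between cells and never writes the recomputation order by hand.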

Reply Parent Score: 1