Linked by Thom Holwerda on Fri 22nd Feb 2008 09:16 UTC, submitted by obsethryl
.NET (dotGNU too) "Previously, we have presented one of the two open-source licensed projects related to creating a C# kernel. Now it's time to complete the set by rightfully presenting SharpOS, an effort to build a GPL version 3 + runtime exception licensed system around a C# kernel of their own design. It is my pleasure and privilege to host a set of questions and answers from four active developers of SharpOS — William Lahti, Bruce Markham, Mircea-Cristian Racasan and Sander van Rossen — in order to get some insight into what they are doing with SharpOS, their goals, and their different design and inspiration."
Permalink for comment 302020
RE[3]: So what ?
by g2devi on Fri 22nd Feb 2008 19:51 UTC in reply to "RE[2]: So what ?"

Buffer overflows and memory leaks are only a small part of the issues operating systems face. More serious issues are: general *resource* leaks (which C# can do nothing about that couldn't also be done in C); excess memory usage, which anything but a reference-counted garbage collector is notorious for (memory is eventually recovered, but in the meantime you pay for the allocated memory); treating different things the same (i.e. if you try to treat register memory the same way you treat swap memory, you're going to have serious performance and behavioral issues); and ignoring the hardware to make developers' lives easier, like using the stack exclusively in preference to fast registers (if you ignore the hardware, the hardware will ignore you).
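To make the resource-leak point concrete, here's a small illustrative Java sketch (my own example, not from the comment; the same applies to C#'s IDisposable): the garbage collector will eventually reclaim the reader *object*, but the underlying OS file descriptor stays open until close() is actually called — a leak no GC can prevent for you.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ResourceLeak {
    // Leaky version: when this method returns, the reader becomes garbage,
    // but its file descriptor remains open until some future GC/finalization
    // pass -- exactly the kind of *resource* leak a GC cannot prevent.
    static String readLeaky(Path p) throws IOException {
        BufferedReader r = new BufferedReader(new FileReader(p.toFile()));
        return r.readLine(); // r is never closed
    }

    // Correct version: try-with-resources releases the descriptor
    // deterministically, much like RAII or a manual close() in C/C++.
    static String readSafe(Path p) throws IOException {
        try (BufferedReader r = Files.newBufferedReader(p)) {
            return r.readLine();
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".txt");
        Files.write(tmp, "hello".getBytes());
        System.out.println(readSafe(tmp)); // prints "hello"
        Files.delete(tmp);
    }
}
```

Either way the heap memory is reclaimed eventually; only the deterministic close in readSafe releases the descriptor promptly, which is why "GC solves leaks" only covers memory, not resources in general.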

IMO, writing an OS in Java or C# might be good for educational purposes, for OSes where latency makes raw speed a non-issue (e.g. a distributed OS over a slow network), or when you have tonnes of memory and CPU to waste. But I don't see it reaching the desktop any time soon. That *might* change with time. After all, anyone who's had a 1MHz Commodore 64 or an Amiga 1000 knows that those seriously underpowered machines could do wonders because of their tight coding (which included now-taboo performance techniques like self-modifying assembly code). By those standards, even Linux these days is a memory hog, for the sake of maintainability and functionality.

OTOH, given that the average user's applications keep finding ways to stress memory and CPU (Compiz, video processing, Human Genome projects, weather prediction, etc.) in ways that we didn't even think of 10 years ago, and the fact that Moore's law has a limit which we might reach within the next five years (i.e. the atomic level), I'm skeptical.

Reply Parent Score: 2