Linked by WJMoore on Tue 1st Nov 2016 00:01 UTC
OSNews, Generic OSes

Redox, a Unix-like operating system written in Rust, recently rewrote its kernel:

Since August 13, the kernel has been going through a complete rewrite, which makes the kernel space ultra-small (about the size of L4). Everything that can run outside the kernel in practice will do so.

It is almost complete and will likely be merged in the coming week.

The reasons cited for the rewrite include memory management, concurrent design, SMP support, and 64-bit by default.

64Bit by default?
by Spiron on Tue 1st Nov 2016 00:53 UTC
Spiron
Member since:
2011-03-08

One has to wonder why it wasn't 64-bit by default in the first place

Reply Score: 2

RE: 64Bit by default?
by adkilla on Tue 1st Nov 2016 02:41 UTC in reply to "64Bit by default?"
adkilla Member since:
2005-07-07

Because there are embedded systems out there that are 32-bit only?

Reply Score: 5

Not magic after all
by kwan_e on Tue 1st Nov 2016 04:46 UTC
kwan_e
Member since:
2007-02-18

The major reason for the rewrite was incorrect and inefficient memory management in the old kernel. This causes crashes in userspace where the kernel has not mapped pages correctly.


I thought Rust was supposed to be completely safe and basically magical in its ability to prevent bugs.

Reply Score: 1

RE: Not magic after all
by DrJohnnyFever on Tue 1st Nov 2016 04:58 UTC in reply to "Not magic after all"
DrJohnnyFever Member since:
2012-03-07

You can still write code that is incorrect or has a flawed design. Rust is designed to eliminate common errors in dealing with memory pointers, one subset of bugs. It doesn't make a flawed algorithm work, just makes it a bit harder to accidentally botch an implementation of good design.

Rust isn't really any more "magic" than, say, Python. But unlike Python it is usable for programming traditionally left to C, which is pretty unique.
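The point about flawed designs surviving the compiler can be sketched in a few lines of (hypothetical) safe Rust: the function below compiles cleanly and is completely memory-safe, yet still wrong.

```rust
// Intended to sum 1 through n inclusive -- but the exclusive range
// silently drops the last term. Memory-safe, borrow-checked, wrong.
fn sum_to(n: u32) -> u32 {
    (1..n).sum() // bug: should be (1..=n)
}

fn main() {
    // The caller expected 55; Rust happily delivers 45.
    assert_eq!(sum_to(10), 45);
}
```

Rust's guarantees cover how memory is used, not whether the algorithm matches the intent.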

Edited 2016-11-01 05:00 UTC

Reply Score: 4

RE[2]: Not magic after all
by Alfman on Tue 1st Nov 2016 15:52 UTC in reply to "RE: Not magic after all"
Alfman Member since:
2011-01-28

DrJohnnyFever,

Rust isn't really any more "magic" than, say, Python. But unlike Python it is usable for programming traditionally left to C, which is pretty unique.


While it's true Python shouldn't generate segfaults or experience memory corruption, which are some of the most dreaded kinds of faults to track down, I'd argue that Rust is safer than Python on account of verifying code at compile time. Python won't even tell you until run time that you are using an undeclared variable. It will detect the error and throw an exception, but in a system that's designed not to fail, it's not great that the earliest detection of a fault is at the time of execution.
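A hedged sketch of the "fail at compile time, not run time" point: in Rust, even a branch that never executes is fully checked. A Python function referencing a misspelled name only blows up when that branch runs; here, deleting the `Err` arm or misspelling `msg` would stop the build before the program ever exists. (The function and names below are made up for illustration.)

```rust
fn describe(result: Result<u32, String>) -> String {
    match result {
        Ok(n) => format!("got {}", n),
        // This arm is type-checked and exhaustiveness-checked at
        // compile time, even if no Err ever reaches it at run time.
        Err(msg) => format!("failed: {}", msg),
    }
}

fn main() {
    assert_eq!(describe(Ok(7)), "got 7");
}
```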

Reply Score: 2

RE: Not magic after all
by crystall on Tue 1st Nov 2016 13:35 UTC in reply to "Not magic after all"
crystall Member since:
2007-02-06

I thought Rust was supposed to be completely safe and basically magical in its ability to prevent bugs.


To deal with things like VM mappings you need to use unsafe code even in Rust. Once you do that, all of Rust's safety guarantees are essentially off (at least for that piece of code).
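As a rough illustration (the names and bit layout here are made up, not Redox's actual code), this is the shape of such an unsafe section: a raw-pointer write of the kind page-table code needs, aimed at an ordinary variable so the sketch actually runs.

```rust
// Hypothetical sketch: poking a bit pattern into a "page-table entry"
// through a raw pointer. A real kernel would write to a physical
// address handed out by its memory manager.
fn set_entry(entry: &mut u64, frame: u64) {
    let p = entry as *mut u64;
    unsafe {
        // The compiler trusts us here: nothing verifies that `p` is
        // valid, aligned, or unaliased. Get it wrong and you corrupt
        // memory -- exactly the class of bug safe Rust rules out.
        *p = frame | 0b11; // hypothetical present + writable bits
    }
}

fn main() {
    let mut fake_entry: u64 = 0;
    set_entry(&mut fake_entry, 0xdead_b000);
    assert_eq!(fake_entry & 0b11, 0b11);
}
```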

Reply Score: 5

RE[2]: Not magic after all
by dionicio on Tue 1st Nov 2016 16:19 UTC in reply to "RE: Not magic after all"
dionicio Member since:
2006-07-12

I don't believe Rust was designed with VM mapping in mind. Just don't do that [with it].

Reply Score: 2

RE: Not magic after all
by Alfman on Tue 1st Nov 2016 15:29 UTC in reply to "Not magic after all"
Alfman Member since:
2011-01-28

kwan_e,

I thought Rust was supposed to be completely safe and basically magical in its ability to prevent bugs.


I'm not familiar with this particular code, but it should be explained that Rust supports two modes of operation: safe and unsafe.

When in safe sections of code, which is the default, Rust enforces integrity by verifying the correctness of each state transition, which is what makes Rust robust. And the fact that this is done at compile time allows us to build safe zero cost abstractions. However, when you are developing new primitives, you may have to jump into unsafe sections to perform operations (like raw/untyped memory access) that haven't been encapsulated by a safe call or library.

Most programs wouldn't have to do this, but when you're implementing an OS or other system interface code, you may need some primitives that aren't implemented by the standard libraries, and Rust lets you do that by letting you explicitly disable code safety checks.

"Unsafe code" in Rust is essentially "normal code" in other static languages. One benefit of requiring unsafe code to be explicitly tagged as such is that in a large project with millions of lines of code, the scope of unsafe code is significantly reduced, making it much easier to find/audit/fix.

Edited 2016-11-01 15:33 UTC

Reply Score: 3

RE[2]: Not magic after all
by kwan_e on Wed 2nd Nov 2016 00:45 UTC in reply to "RE: Not magic after all"
kwan_e Member since:
2007-02-18

And the fact that this is done at compile time allows us to build safe zero cost abstractions. However, when you are developing new primitives, you may have to jump into unsafe sections to perform operations (like raw/untyped memory access) that haven't been encapsulated by a safe call or library.


Surely, "zero cost abstractions" shouldn't need a library to encapsulate it. Or, the first thing to do is to leverage these "zero cost abstractions" to make unsafe code safe.

Reply Score: 2

RE[3]: Not magic after all
by Alfman on Wed 2nd Nov 2016 10:18 UTC in reply to "RE[2]: Not magic after all"
Alfman Member since:
2011-01-28

kwan_e,

Surely, "zero cost abstractions" shouldn't need a library to encapsulate it. Or, the first thing to do is to leverage these "zero cost abstractions" to make unsafe code safe.


That's exactly how the unsafe code sections in rust are used.

Do you take issue with developers needing "unsafe sections" at all? The trouble with that position is that it assumes everything would be built into the language from the get-go. But satisfying everyone's niche requirements with a general-purpose framework would result in a bloated, overspecialized framework.

Granted, there's always going to be a debate as to where to draw the line: should a framework satisfy 90% of users? 95%? 99%? 100% is not realistic; there will always be someone who needs to go outside of existing work. It makes sense to me that operating system code is in this set, where developers may need to define new specialized primitives.

Also, keep in mind that for all the string and abstract data type primitives in the standard library, their safety against corruption stems from the efforts of the developers who implemented them. Rust, at least for today, doesn't have any kind of AI to validate unsafe code. It's sort of like a "bootstrapping" problem. Maybe the next generation of languages could try to tackle that, but in doing so they might have to contend with Gödel's incompleteness theorem.

https://en.wikipedia.org/wiki/G%C3%B6del%27s_incompleten...

Edited 2016-11-02 10:22 UTC

Reply Score: 2

RE[4]: Not magic after all
by Alfman on Wed 2nd Nov 2016 11:33 UTC in reply to "RE[3]: Not magic after all"
Alfman Member since:
2011-01-28

Also, it isn't obvious from what I've written, but only the original primitives at the bottom of the stack would require unsafe sections, not the entire framework. For example, if you build a framework using entirely pre-existing rust primitives, you wouldn't need to write any unsafe code yourself.

The most common use of "unsafe" code in rust is actually linking to existing C libraries simply because they are so pervasive.
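For example (a minimal sketch binding `abs` from the C standard library, which Rust programs already link against):

```rust
use std::os::raw::c_int;

// Declaring a foreign function is a promise the Rust compiler cannot
// check -- if the signature here were wrong, behavior would be
// undefined. That is why every call through it must be in `unsafe`.
extern "C" {
    fn abs(x: c_int) -> c_int;
}

fn main() {
    let v = unsafe { abs(-42) };
    assert_eq!(v, 42);
}
```

In practice most projects generate or hand-audit these declarations once, then expose safe Rust wrappers around them.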

Reply Score: 2

RE[4]: Not magic after all
by kwan_e on Wed 2nd Nov 2016 13:06 UTC in reply to "RE[3]: Not magic after all"
kwan_e Member since:
2007-02-18

Do you take issue that developers need "unsafe sections" at all?


I don't have an issue with it at all. I'm just having a laugh at people and languages that try to promise the impossible.

I love me some "zero cost abstractions", and even some "negative cost abstractions", to force compile time errors if I can help it.

Reply Score: 2

RE[5]: Not magic after all
by Alfman on Wed 2nd Nov 2016 13:58 UTC in reply to "RE[4]: Not magic after all"
Alfman Member since:
2011-01-28

kwan_e,

I don't have an issue with it at all. I'm just having a laugh at people and languages that try to promise the impossible.


I don't know who/what you are referring to, something in this article?

I love me some "zero cost abstractions", and even some "negative cost abstractions", to force compile time errors if I can help it.


Well, I'd like for it to become standard practice for software. For better or worse, many applications have transitioned to "managed" languages like java, .net, and a plethora of scripting languages, which protect from corruption at run time, but are not zero cost abstractions.
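For contrast, a small sketch of what "zero cost" means here (illustrative functions, not from any real codebase): the iterator pipeline below is a high-level abstraction, yet optimizing builds typically compile it to the same machine code as the hand-written loop.

```rust
// High-level pipeline: filter, map, sum. No allocation, no virtual
// dispatch -- the abstraction compiles away.
fn sum_of_even_squares(xs: &[i64]) -> i64 {
    xs.iter().filter(|&&x| x % 2 == 0).map(|&x| x * x).sum()
}

// The loop a C programmer would write by hand; the optimizer usually
// produces equivalent code for both.
fn manual(xs: &[i64]) -> i64 {
    let mut acc = 0;
    for &x in xs {
        if x % 2 == 0 {
            acc += x * x;
        }
    }
    acc
}

fn main() {
    let data = [1, 2, 3, 4];
    assert_eq!(sum_of_even_squares(&data), manual(&data)); // both 20
}
```

Managed runtimes give similar safety at the source level but pay for it with garbage collection or runtime checks; here the cost is paid once, at compile time.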

Edited 2016-11-02 14:04 UTC

Reply Score: 2

RE[6]: Not magic after all
by kwan_e on Wed 2nd Nov 2016 16:05 UTC in reply to "RE[5]: Not magic after all"
kwan_e Member since:
2007-02-18

I don't know who/what you are referring to, something in this article?


No one specifically. Just people in general who chase the new fad languages and make overpromises. (And some people here who obviously couldn't take a joke.) I recall there was another OS effort using Rust where the "docs" were nothing but boasting about how Rust could solve all problems and was practically impossible to get wrong.

Well, I'd like for it to become standard practice for software.


I would like that too, but some people are scared or allergic to things like template metaprogramming or SFINAE, even though they've been around and used in industry for decades now.

Actually, zero-cost abstractions started out with Ada, with Alex Stepanov designing the first compile-time generic containers, but it was too limited for his needs.

I like being able to write high level, compile-time checked code like this:

GPIO::OE_REG oe = 0x44E07000u;
oe.set<GPIO::OE::_12>(1);
oe.get<GPIO::OE::_12>();


And have it compile down to code like this:

12c: e3073134 movw r3, #28980 ; 0x7134
130: e34434e0 movt r3, #17632 ; 0x44e0
134: e5932000 ldr r2, [r3]
138: e3822a01 orr r2, r2, #4096 ; 0x1000
13c: e5832000 str r2, [r3]
140: e5933000 ldr r3, [r3]
144: e12fff1e bx lr


It's literally WYSIWYG assembly, even though it's using high level constructs like classes and methods and the compiler will still scream at you if you get anything wrong.

https://www.youtube.com/watch?v=zBkNBP00wJE

Reply Score: 2

RE[7]: Not magic after all
by Alfman on Wed 2nd Nov 2016 17:01 UTC in reply to "RE[6]: Not magic after all"
Alfman Member since:
2011-01-28

kwan_e,

I know Ada is derived from Pascal, but not much else. Does Ada have anything to protect developers from the heap management bugs that are possible in Pascal?

It's funny that each community has its own dominant computer language. Obviously C tops the charts for operating systems since its Unix debut. Java is big for enterprise apps. Ada is a standard for government contracts. FORTRAN is big in the scientific community. JavaScript is the leader for web browsers. PHP leads website programming.

These all have historical reasons for becoming dominant in their respective domains, but to someone being introduced to the software field for the first time, it must seem highly arbitrary, I imagine ;)

Reply Score: 2

RE[7]: Not magic after all
by moondevil on Wed 2nd Nov 2016 19:12 UTC in reply to "RE[6]: Not magic after all"
moondevil Member since:
2005-07-08

I would like that too, but some people are scared or allergic to things like template metaprogramming or SFINAE, even though they've been around and used in industry for decades now.


Outside of code written by big companies like Google, Facebook, Microsoft, and Apple, what I usually see at small to medium-sized companies is more C compiled with a C++ compiler than anything else.

Reply Score: 2

RE[7]: Not magic after all
by james_gnz on Thu 3rd Nov 2016 10:41 UTC in reply to "RE[6]: Not magic after all"
james_gnz Member since:
2006-02-16

I like being able to write high level, compile-time checked code like this:
...
And have it compile down to code like this:
...

The Rust compiler uses LLVM for its backend, so it gets LLVM's optimisations (although apparently not as many as Clang does, because LLVM better understands how Clang generates code). Of course, that also means compilation isn't fully self-hosted, although I think focussing on the frontend is a pragmatic use of effort.
Anyway, maybe it'll turn out to be a fad, but I wouldn't write it off just yet. It seems like Mozilla is pretty serious about getting it into Firefox.

Reply Score: 1

Graphics stack
by taschenorakel on Wed 2nd Nov 2016 08:15 UTC
taschenorakel
Member since:
2005-07-06

Are there plans to support OpenGL or Vulkan? Supporting any of these is mandatory for modern user interfaces.

Reply Score: 1