Linked by Thom Holwerda on Wed 5th Jan 2011 22:09 UTC
Windows And this is part two of the story: Microsoft has just confirmed that the next version of Windows NT (referring to it as NT for clarity's sake) will be available for ARM - or more specifically, for SoCs from NVIDIA, Qualcomm, and Texas Instruments. Also announced today at CES is Microsoft Office for ARM. Both Windows NT and Microsoft Office were shown running on ARM during a press conference at CES in Las Vegas.
Thread beginning with comment 456195
RE[5]: BC
by malxau on Thu 6th Jan 2011 15:20 UTC in reply to "RE[4]: BC"
malxau
Member since:
2005-12-04

True (provided one goes through the step of making the script executable after downloading it). This is an excellent reason to avoid the practice of simply downloading software from some random site, making it executable, and then running it. Fortunately, it is entirely possible to install and run a complete Linux desktop (open source) software ensemble without ever once having to do such a thing.


Really?

1. It is possible, but very, very difficult, to get a booting system without relying on binary code you didn't build yourself. Typically people use distributions as a starting point. But just like binary code on Windows, this relies on a chain of trust - that the binaries are not malware-infested. If I want to create my own distribution tomorrow, users can't know whether to trust me or not. In the end, users have to decide trust by word of mouth - what works, what doesn't - just like on Windows.

2. Even when compiling from source, it's common to blindly execute code. Consider how autoconf/configure scripts work. Do you really read configure scripts before running them? Source availability gives a means to ensure trustworthiness, but that is only as effective as users' habits. As the volume of source running on people's machines increases, and assuming a human's ability to read code does not increase, the practicality of reviewing code decreases over time. Again, this relies on others reviewing the code and building up communities around which code is trustworthy and which isn't - which isn't that different from the binary components above.

Reply Parent Score: 2

RE[6]: BC
by Nth_Man on Sun 9th Jan 2011 02:10 in reply to "RE[5]: BC"
Nth_Man Member since:
2010-05-16

With the source code, you can see what the program is going to do, study it, modify it, and so on. If you can't do that yourself right now, you can learn how to do it, or you can hire someone to do it for you.

Without the source code, you are not the one in control: you don't control the software or your own computing. As Stallman says, without these freedoms it is the software that controls the users.

Reply Parent Score: 1

RE[6]: BC
by lemur2 on Sun 9th Jan 2011 13:38 in reply to "RE[5]: BC"
lemur2 Member since:
2007-02-17

Typically people use distributions as a starting point. But just like binary code on Windows, this relies on a chain of trust - that the binaries are not malware-infested.


It is not like binary code on Windows, because people who did not write the code can nevertheless download the source code, compile it for themselves, and verify that it produces the binary as distributed.
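To make that verification step concrete, here is a minimal sketch of the idea in Python: rebuild the package from the published source, then compare a checksum of your rebuilt binary against the one shipped in the package. The file paths are hypothetical placeholders, and an exact match only happens when the build is bit-for-bit reproducible (same toolchain, flags, and no embedded timestamps or paths), which is not automatic.

import hashlib

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: a binary extracted from the distributed package,
# and the same program rebuilt locally from the published source.
distributed = sha256_of("/tmp/extracted-from-package/usr/bin/someprogram")
rebuilt = sha256_of("/tmp/local-build/usr/bin/someprogram")

if distributed == rebuilt:
    print("Local rebuild matches the distributed binary.")
else:
    print("Mismatch: either the package was not built from exactly this source,")
    print("or the build is not reproducible (timestamps, paths, compiler version).")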

It is not just one isolated instance of one person doing this that builds trust in the code. The trust comes from the fact that a program such as gcc, and repositories such as Debian's, have existed for well over a decade, through countless upgrades and versions of the code, downloaded by millions upon millions of users over that decade, with the source code in plain sight to millions of people the entire time - and not once has malware been found in the code.

Not once.

We can trust Debian repositories by now.

Edited 2011-01-09 13:39 UTC

Reply Parent Score: 2