Linked by David Adams on Tue 11th Oct 2011 20:08 UTC, submitted by lucas_maximus
Internet Explorer
Microsoft has unveiled a website aimed at raising awareness of browser security by comparing the ability of Internet Explorer, Mozilla Firefox, and Google Chrome to withstand attacks from malware, phishing, and other types of threats. Your Browser Matters gives the latest versions of Firefox and Chrome a paltry 2 and 2.5 points respectively out of a possible score of 4. Visit the site using IE 9, however, and the browser gets a perfect score. IE 7 gets only 1 point, and IE 6 receives no points at all. The site refused to rate Apple's Safari browser in tests run by The Register.
Thread beginning with comment 492664
RE[2]: Comment by Gone fishing
by Alfman on Wed 12th Oct 2011 06:30 UTC in reply to "RE: Comment by Gone fishing"
Alfman Member since:
2011-01-28

Neolander,

"Well, if the current Windows security model was not so badly broken, this wouldn't be needed..."

Unfortunately, it's not just Windows, but I agree in principle.

"Running an arbitrary unsigned executable file as root just to install random software, with no trustworthy information on what this executable is going to do exactly, is insane in our century."

My gripe is that too many security solutions focus on identity rather than guarding against harmful instructions. Digital signatures are rather pointless if I'm downloading arbitrary software/games where the author may as well be anonymous.

Now we could say "Well, if you don't know who id Software are, you have no business installing their software", but IMHO this mentality misses the point that security <> identity.

There are many legitimate authors out there who have something to offer even though they lack a widely recognizable identity. Another problem with identity-based security is that even correctly signed packages can do harm (deliberately or not).

The technical solution is obvious: sandbox all applications by default and require them to declare security profiles (like MIDlets), but I think we may have already talked about this. Companies have a disincentive to make secure app sandboxes - look at how Apple used security as an excuse to tether users to their store instead of actually making apps secure.
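A minimal sketch of the declared-profile idea, in the spirit of MIDlet permission descriptors. Everything here (the profile format, the permission names, `is_allowed`) is hypothetical, not any real OS API:

```python
import json

# Hypothetical security profile an application would declare at install
# time. All names are illustrative only.
PROFILE = json.loads("""
{
  "app": "example-game",
  "permissions": ["network.client", "storage.app-private"]
}
""")

def is_allowed(profile, requested):
    """Deny by default: only explicitly declared permissions pass."""
    return requested in profile["permissions"]

print(is_allowed(PROFILE, "network.client"))       # declared at install
print(is_allowed(PROFILE, "storage.home-folder"))  # never declared, so denied
```

The key property is deny-by-default: the OS never has to guess what an app might do, because anything undeclared is simply refused.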


"I hope this 'Windows Store' thing will fix that, but I fear that giving Microsoft a way to control the Windows software ecosystem will also bring its whole lot of problems... "

I don't care if they have a store, but I worry whether it will be 100% optional or if they'll try to reduce the compatibility/functionality of Windows artificially for users/devs who want to sideload apps.

Reply Parent Score: 2

Neolander Member since:
2010-03-08

I share your opinion that all future OSs should use sandboxes as their primary security system. It is much better than letting untrusted software do whatever it wants with user files, let alone letting it touch the system drive enough to install itself.

However, even within the realm of current software installation methods, I believe that the Windows way of setup.exe binary black boxes with root access is especially awful, and far behind what other desktop OSs have since adopted as their main installation method.

Reply Parent Score: 1

moondevil Member since:
2005-07-08

This is because there are many stupid developers out there.

Windows has supported installer packages (*.msi) since Windows 2000.

It is not Microsoft's fault that software houses still use Setup.exe installers.

Heck, I also know some software houses that provide similar installers for UNIX systems!

Reply Parent Score: 3

RE[3]: Comment by Gone fishing
by oinet on Wed 12th Oct 2011 12:08 in reply to "RE[2]: Comment by Gone fishing"
oinet Member since:
2010-03-23


"The technical solution is obvious: sandbox all applications by default and require them to declare security profiles (like MIDlets) but I think we may have already talked about this."

Tell me about it...

Reply Parent Score: 1

Neolander Member since:
2010-03-08

Well, imagine a desktop OS where software would be confined in a tiny part of the hard drive and couldn't touch any other file. It would have its binaries and data, system-wide and per-user config files, and that's it. It couldn't access anything else on the hard drive, including the user's home folder, without explicit user permission.

Said explicit permission could take the form of a command line parameter, double clicking a file with the proper association, a standard system "file open" GUI dialog... Or, for software which legitimately needs to access user files behind the user's back, like a backup service, an elevated privilege request for such access, displayed once, through a controlled system dialog, during installation.

We can imagine applying a similar philosophy to every other system service which has a "dangerous" side to it: real-time process priorities, altering network configuration, power management features which turn hardware on and off, and more generally direct hardware access...

Most of today's applications only have limited needs and could work very well with this much reduced level of security permission. Yet it would strongly reduce the amount of stuff which malware can do silently. Wiping your home folder? Not possible anymore. Sending your private data to a third party without you knowing? Not possible anymore. Putting a rootkit in your OS kernel during installation? Not possible anymore, as software does not require dedicated installers anymore. Making a hidden trojan binary run silently on each user login? Not possible anymore.

Sandboxing would not eliminate malware, but it would significantly raise the effort necessary to engineer it. Malware would now have to do its work in plain sight of the user. Privilege elevation dialogs would explain clearly what it is up to, so said malware would have to come up with a good justification for what it's doing, facing a cautious user who is not used to seeing meaningless "a program wants to make changes to your system" dialogs all the time.
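The gatekeeping described above can be sketched as a toy permission broker. This is purely illustrative of the model (the class, paths, and grant mechanism are all hypothetical), assuming POSIX-style paths: an app may always touch its own sandbox directory; anything else needs a grant recorded via an explicit user action such as a file-open dialog.

```python
import os

class Broker:
    """Toy broker deciding whether a sandboxed app may open a path."""

    def __init__(self, sandbox_root):
        self.sandbox_root = sandbox_root
        self.grants = set()            # paths the user explicitly opened

    def grant(self, path):
        # e.g. the user picked this file in a system file-open dialog
        self.grants.add(os.path.abspath(path))

    def may_open(self, path):
        path = os.path.abspath(path)
        inside = path.startswith(self.sandbox_root + os.sep)
        return inside or path in self.grants

b = Broker("/sandbox/example-game")
b.grant("/home/alice/save.dat")
print(b.may_open("/sandbox/example-game/config.ini"))  # own sandbox: allowed
print(b.may_open("/home/alice/save.dat"))              # user granted: allowed
print(b.may_open("/home/alice/.ssh/id_rsa"))           # neither: denied
```

Note how the default answer is "no": the home folder and everything else stay invisible unless a specific file was handed over.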

Edited 2011-10-12 18:59 UTC

Reply Parent Score: 1

RE[4]: Comment by Gone fishing
by Alfman on Thu 13th Oct 2011 08:23 in reply to "RE[3]: Comment by Gone fishing"
Alfman Member since:
2011-01-28

oinet,

(Re application sandboxing)

"Tell me about it.."

Well, most of what I propose has already been done; it just never made inroads in the market. If you are familiar with Java Web Start, then you should understand what I mean.

Most security we see in operating systems has gone towards protecting the OS files from malware (Win Vista makes this clear, Unix has always had this). However, very little security has gone towards protecting the user's own files/apps from malware, which could be even more devastating to end users.

Consider: an early browser (can't remember if it was IE/NS) used to have a trivial vulnerability whereby a webpage could cause the browser to open up arbitrary user files (say in a frame), then read the contents dynamically using JavaScript and communicate them back to the server. Now clearly this kind of vulnerability needs to be fixed; however, the point is that a browser shouldn't have transparent access to all user files in the first place. An app, even if successfully exploited, shouldn't compromise user data, and it would not be able to if it were run in a sandbox.


Sandboxing security is conceptually equivalent to running each app under its own "user account", where instead of only isolating users, the OS isolates individual applications as well.

Today:
user->security context

Sandbox model:
user->appgroup->security context
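This per-(user, app) identity mapping can be sketched in a few lines. The base value and offset scheme are invented for illustration (Android later shipped essentially this idea, giving every installed app its own Linux UID so the kernel's ordinary user isolation also separates apps):

```python
BASE_UID = 10000   # arbitrary illustrative base for app identities

apps = {}          # app name -> stable offset, assigned on first sight

def uid_for(user_id, app):
    """Map a (user, app) pair to a distinct numeric identity."""
    offset = apps.setdefault(app, len(apps))
    return BASE_UID + user_id * 1000 + offset

print(uid_for(0, "game"))     # 10000
print(uid_for(0, "browser"))  # 10001: same user, different app, new identity
print(uid_for(1, "game"))     # 11000: same app, different user
```

Because each app runs under a distinct identity, standard file ownership and permissions do the isolation work with no new kernel machinery.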

This way, I could download a game from an untrusted source and run it with high confidence that it would not do harm to the rest of my system/files, even if it contained malware.

Doing this today manually for each and every application by default is unmanageable - imagine the burden of new user accounts for each user*app combination. Even things like SELinux/AppArmor are very difficult to use and don't offer the new security primitives that would make this integrate more naturally into user workflow.


A genuine app sandbox model has other benefits too: it would be available to users without any root access, which would address my prior gripe about my university account. Also, my shared hosting web sites could be isolated from one another such that a script vulnerability on one would not threaten the integrity of all my other web sites.

Reply Parent Score: 2

Neolander Member since:
2010-03-08

Okay, now I have a real computer at hand so I can post more detailed replies!

{Re: Security != identity}

I believe that digital signing does improve security a bit, although we agree that it is not a vital mechanism: people can use signatures to make sure that an application has not been altered by its distributor from the version which the author released, simply by checking its signature against the author's key. This can in turn be used to make mirrors safer (not every developer has enough bandwidth to distribute his software by himself).
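A simplified stand-in for the mirror-safety idea: the author publishes a digest over a trusted channel, and users hash what the mirror served and compare. (Real code signing uses public-key signatures, which additionally bind the digest to the author's identity; plain hashing shown here only detects tampering.) The package bytes are made up for the example:

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest of a package, as the author would publish it."""
    return hashlib.sha256(data).hexdigest()

package = b"pretend this is the packaged application"
published = digest(package)           # announced on the author's own site

mirror_copy = package                 # what an untrusted mirror served
print(digest(mirror_copy) == published)   # unaltered: check passes

tampered = package + b" + malware"
print(digest(tampered) == published)      # modified in transit: check fails
```

This is exactly the "mirrors without extra trust" property: the mirror only needs bandwidth, not trustworthiness, because any alteration is detectable.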

We agree, however, that identity is not security. You know that software has not been altered since the author packaged it, but if the package itself was malicious to begin with, something better is needed. Hence the need for sandboxing ;)

"I hope this 'Windows Store' thing will fix that, but I fear that giving Microsoft a way to control the Windows software ecosystem will also bring its whole lot of problems... "

"I don't care if they have a store, but I worry whether it will be 100% optional or if they'll try to reduce the compatibility/functionality of Windows artificially for users/devs who want to sideload apps."

This is one of my worries too. Apple have opened up Pandora's box by introducing unavoidable software repositories on iOS, now other OS manufacturers might be tempted to introduce a similar system.

Edited 2011-10-12 17:06 UTC

Reply Parent Score: 1