Linked by Thom Holwerda on Sun 10th Jun 2012 22:36 UTC
So, Google has made it very hard to install Chrome extensions outside of the Chrome Web Store - out of security concerns. In addition, they sprang this on users and extension developers without much consultation or consideration for their concerns. As always - understandable to protect users, but the handling has an almost Apple-like bluntness to it. Next up: how to jailbreak your browser?
Thread beginning with comment 521531
Alfman
Member since:
2011-01-28

darknexus,

"Speaking from tech support experience, I'd say if a user doesn't know enough to google for that switch, they have no business side-loading. The more checkboxes you give users, the more they will check out of annoyance just to avoid the alert dialogs, and then your security becomes null and void."

The spread of malware happens because users lack the tools to make informed decisions. Often the choice is between "run" and "do not run", and the only information presented identifies the software. Even knowledgeable users will be at a complete loss to know whether something is harmful, so I fully agree that this type of security model is flawed. But I disagree very strongly with the "remedy" of a walled garden (even if more savvy users can disable it). It'd be both more open and more secure to add metadata about what the extension does and then enforce it in a sandbox. Given the right tools & information, users may be even more secure than simply trusting everything in Google's repository.
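This is the kind of metadata I mean. Chrome extensions already declare what they want in their manifest.json, and the browser shows those permissions at install time; a minimal example (field values are illustrative):

```json
{
  "name": "Example Extension",
  "version": "1.0",
  "manifest_version": 2,
  "permissions": ["storage", "tabs", "http://example.com/*"]
}
```

The missing piece is that these declarations need to be actually enforced by a sandbox at runtime, not merely displayed at install time.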

Reply Parent Score: 2

darknexus Member since:
2008-07-15

The spread of malware happens because users lack the tools to make informed decisions.


I call bs. Malware spreads because users treat their computer as a magic box. They expect their computer to protect them, to do their common sense thinking for them, and to be the always-on tool. They don't wish to make informed decisions. I speak from experience with a large number of users, some of whom have actually said this to me. I can't count the number of times I've heard "well, if that's not a safe program, I shouldn't be allowed to install it." I shit you not, I have heard these words.
If you're a carpenter, you don't expect your tools to maintain themselves. You don't expect your vehicle to keep itself going without maintenance, nor do you expect it to survive intact if you drive it straight into a tree. Yet, people expect their electronics to magically just work no matter what sort of crap they put on them. They want that? Fine, but that comes at a heavy cost. They don't want to think, so fine, we don't make them think. However, we have to allow those who still wish to think and employ their common sense to not be limited by the idiocy of those who do not wish to use their brains. Therefore, a compromise is needed.
I wish these sorts of things weren't necessary. I wish people would use their damn brains for something other than watching crap tv and window shopping. The sad reality is, however, that the greater part of the market decides these issues and we could easily find ourselves buried by them. Allowing us to enable this ability in an easy way that is, at the same time, not obvious to those who don't have the brains to search for it anyway seems to be the best way to keep both groups happy.
And before anyone mentions it: Yes, I know how elitist and arrogant I sound. That's what happens when you see the same mistakes repeated over and over and over again, and every time they ask: "Why didn't my computer protect me?" I don't know what the situation is in other countries, but in the USA that's what I get 95% of the time. I'm almost glad for lockdowns imposed on these types of people, if only so they stop bothering me with the same crap and stop spreading their malware. As long as those of us who do know our stuff can legally and uncomplicatedly bypass said lockdowns, I have no problem with it whatsoever, as that approach keeps both groups happy.

Reply Parent Score: 9

Alfman Member since:
2011-01-28

darknexus,

"I call bs."

There's really no need for sarcasm. Your opinion is that it's ok to submit users to third party control for safety's sake, which is fair enough. I hope you are at least aware that such philosophies, especially when taken collectively, tend to erode our freedoms over time.


"If you're a carpenter, you don't expect your tools to maintain themselves. You don't expect your vehicle to keep itself going without maintenance, nor do you expect it to survive intact if you drive it straight into a tree. Yet, people expect their electronics to magically just work no matter what sort of crap they put on them."

You are speaking metaphorically about how physical tools relate to software. I don't like using metaphors, since comparing different things as though they were the same is inherently flawed once the details are worked in. To be complete, the metaphor must also account for how end-user restrictions affect software. For example, your tools would need to refuse to work with unauthorised components that are nevertheless compatible. Artificially restricting tools would generally be considered a bad thing, even if the freedom to use the tools the wrong way may damage them.

"And before anyone mentions it: Yes, I know how elitist and arrogant I sound. That's what happens when you see the same mistakes repeated over and over and over again, and every time they ask: 'Why didn't my computer protect me?'"

To which I say, the goal should be addressing the lack of software sandboxing rather than having users acquire all their software from centralised sources.


"As long as those of us who do know our stuff can legally and uncomplicatedly bypass said lockdowns, I have no problem with it whatsoever, as that approach keeps both groups happy."

But you've completely overlooked that the walled garden approach (whether it can be disabled or not) doesn't directly solve any security problems on its own. For that you need additional vetting; otherwise there's nothing in place to stop covert distribution of malware through official channels. In fact, it creates a false sense of security that anything downloaded through official channels is safe. One may be happy under a false sense of security, but it's still not something to be happy about. At best this lockdown offers reactive security, which is better than nothing, but not as good as having the ability to run software in a security sandbox in the first place.

Reply Parent Score: 2

Laurence Member since:
2007-03-26

darknexus,

"Speaking from tech support experience, I'd say if a user doesn't know enough to google for that switch, they have no business side-loading. The more checkboxes you give users, the more they will check out of annoyance just to avoid the alert dialogs, and then your security becomes null and void."

The spread of malware happens because users lack the tools to make informed decisions. Often the choice is between "run" and "do not run", and the only information presented identifies the software. Even knowledgeable users will be at a complete loss to know whether something is harmful, so I fully agree that this type of security model is flawed. But I disagree very strongly with the "remedy" of a walled garden (even if more savvy users can disable it). It'd be both more open and more secure to add metadata about what the extension does and then enforce it in a sandbox. Given the right tools & information, users may be even more secure than simply trusting everything in Google's repository.


Metadata can be faked. This method ensures that only people tech-savvy enough to know how not to break their browser have enough control to break their browser.

Reply Parent Score: 2

Alfman Member since:
2011-01-28

Laurence,

"Metadata can be faked. This method ensures that only people tech-savvy enough to know how not to break their browser have enough control to break their browser."

Can be faked to do what? Any metadata can be faked. But if the requested permissions are enforced by the sandbox and the software attempts to escalate its access above what is specified in the metadata, then it should be killed automatically. Furthermore, the default maximum permissions should be restrictive enough that the user needs to explicitly approve dangerous calls before the software will run.
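A minimal sketch of what I mean, with entirely made-up names and permission strings (this is not any real browser API, just the enforcement logic):

```python
# Hypothetical sketch: permissions declared in extension metadata,
# enforced by a sandbox at runtime. All names and permission strings
# here are illustrative, not a real API.

DANGEROUS = {"filesystem.write", "network.raw", "history.read"}

class SandboxViolation(Exception):
    """Raised when an extension exceeds its declared permissions."""

class Sandbox:
    def __init__(self, declared, user_approved=frozenset()):
        # An extension may only use what its metadata declared, and
        # dangerous permissions additionally need explicit user approval.
        self.allowed = {p for p in declared
                        if p not in DANGEROUS or p in user_approved}

    def request(self, permission):
        # Any attempt to escalate beyond the allowed set is blocked;
        # a real sandbox would kill the extension at this point.
        if permission not in self.allowed:
            raise SandboxViolation(f"blocked: {permission}")
        return True

# An extension that declared two permissions; the user explicitly
# approved the dangerous one at install time.
sandbox = Sandbox(declared={"storage.local", "network.raw"},
                  user_approved={"network.raw"})
```

The point is that faked metadata buys the attacker nothing: the extension can only ever do what it declared, and anything it didn't declare (or the user didn't approve) is blocked.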

The sandbox gives us much more security than we normally have when running extensions on blind faith. Although this could improve security for all extensions, I'd be open to removing sandbox restrictions from extensions that have already been vetted by Google.

Reply Parent Score: 3