Linked by Thom Holwerda on Thu 21st Nov 2013 23:46 UTC
Internet & Networking

"We can end government censorship in a decade," Schmidt said during a speech in Washington. "The solution to government surveillance is to encrypt everything."

Setting aside the entertaining aspect of the source of said statement, I don't think encryption in and of itself is enough. Encryption performed by companies is useless, since we know by now that companies - US or otherwise - are more than eager to bend over backwards to please their governments.

What we need is encryption that we perform ourselves, so that neither governments nor companies are involved. I imagine some sort of box between your home network and the internet, that encrypts and decrypts everything, regardless of source or destination. This box obviously needs to run open source software, otherwise we'd be right back where we started.

Is something like that even possible?

RE: Comment by pcunite
by Kroc on Fri 22nd Nov 2013 09:09 UTC in reply to "Comment by pcunite"

This right here is one of my biggest bugbears.

Encryption != Identity.

Tying the trust of encryption to SSL CAs is the reason that even today most websites don't use HTTPS, instead broadcasting everything unencrypted over the web.

The browser vendors share the blame too. Had Firefox allowed 'untrusted' certificates from the beginning, HTTPS would be standard and on by default for all servers, everywhere. This is not a security problem -- trustworthiness of the host (identity) is the responsibility of EV certificates and the like, but that shouldn't force everybody else to have to run on plain HTTP!
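
To make Kroc's "Encryption != Identity" point concrete, here is a minimal sketch in Python of serving HTTPS with a self-signed certificate; the "cert.pem"/"key.pem" file names are assumptions (generated beforehand with a tool such as OpenSSL), not anything from the thread. The traffic is encrypted even though no CA vouches for who the server is:

    import http.server
    import ssl

    # Plain HTTP server, serving the current directory.
    httpd = http.server.HTTPServer(
        ("0.0.0.0", 8443), http.server.SimpleHTTPRequestHandler)

    # Wrap the listening socket in TLS using a self-signed certificate.
    # Encryption works regardless of whether any CA signed the cert.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(certfile="cert.pem", keyfile="key.pem")
    httpd.socket = ctx.wrap_socket(httpd.socket, server_side=True)
    httpd.serve_forever()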


RE[2]: Comment by pcunite
by moondevil on Fri 22nd Nov 2013 12:15 in reply to "RE: Comment by pcunite"

If you accept untrusted certificates, SSL becomes useless for preventing man-in-the-middle attacks.

MITM attacks are still possible with trusted certificates, but with untrusted ones they are a piece of cake.
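
A rough sketch of the client-side difference moondevil is describing, using Python's standard ssl module (the URL is just an example): with verification disabled, any certificate is accepted, so an on-path attacker can substitute their own key unnoticed.

    import ssl
    import urllib.request

    # Verifying client: rejects certs that don't chain to a trusted CA
    # or whose hostname doesn't match.
    safe_ctx = ssl.create_default_context()

    # Blindly trusting client: accepts any certificate whatsoever, so a
    # man-in-the-middle with a freshly minted cert goes undetected.
    unsafe_ctx = ssl.create_default_context()
    unsafe_ctx.check_hostname = False
    unsafe_ctx.verify_mode = ssl.CERT_NONE

    # This request would complete even if an attacker intercepted it.
    urllib.request.urlopen("https://example.com/", context=unsafe_ctx)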


RE[2]: Comment by pcunite
by Lennie on Sat 23rd Nov 2013 09:28 in reply to "RE: Comment by pcunite"

Actually, there are multiple reasons.

There is the one you mentioned:
- Cert signing takes time, knowledge, and effort to get done. Certs themselves are already free (!) or cheap (10 euros); you don't pay for the cert, you pay for the time and effort of dealing with a CA.

But don't dismiss:
- SNI for HTTPS: not all browsers support it, so unlike name-based virtual hosts over HTTP you need an IP address per website (think of how we are running out of IPv4 addresses, and of the administrative overhead of configuring the server). Here you pay in configuration overhead and in an IPv4 address.

Support for DNSSEC/DANE and SNI in browsers would help here.
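
For what it's worth, here is a hedged sketch of what server-side SNI looks like with Python's ssl module; the hostnames and .pem file names are invented for illustration. One IP address and one listening socket can serve a different certificate per hostname, which is exactly what removes the IP-per-site cost described above:

    import ssl

    # One TLS context (and certificate) per hostname; all made-up names.
    contexts = {}
    for host in ("www.example.com", "blog.example.com"):
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        ctx.load_cert_chain(f"{host}.pem")
        contexts[host] = ctx

    def pick_certificate(ssl_sock, server_name, default_ctx):
        # Called mid-handshake with the hostname the client sent via SNI;
        # swapping the context selects the matching certificate.
        if server_name in contexts:
            ssl_sock.context = contexts[server_name]

    default = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    default.load_cert_chain("www.example.com.pem")
    default.sni_callback = pick_certificate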



RE[3]: Comment by pcunite
by Alfman on Sun 24th Nov 2013 17:05 in reply to "RE[2]: Comment by pcunite"

Lennie,

Sounds like you've had a lot of experience navigating these muddied waters ;)

"- certs signing takes time, knowledge and effort to get done. Certs are actually already free (!) or cheap (10 euros). You don't pay for the cert. You pay for that time and effort to talk to a CA."

The thing is, they aren't all created equal. Many have poor support in browsers. And all the cheap CAs are of the automated variety, doing little more than contacting us via *insecure* email and HTTP connections. Pretty ironic, right?

Another major problem with the CA model is that *everyone's* security gets reduced to that of the weakest CA in the browser, since any trusted CA technically has the ability to forge signatures for any website, whether or not that site is even a customer of the CA.


"Support for DNSSEC/DANE and SNI in browsers would help here."

Issues with complexity aside, I agree this is the way forward. It eliminates the security problems of relying on third-party CAs and also entitles everyone to certificates without having to buy them (everyone wins except the CAs, who lose big time).
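
As a rough illustration of the DANE side of that, a TLSA lookup pins a host's certificate in (DNSSEC-signed) DNS instead of trusting a CA. This sketch assumes the third-party dnspython package and an invented hostname:

    import dns.resolver  # third-party "dnspython" package

    # TLSA records live at _<port>._<proto>.<hostname>.
    answers = dns.resolver.resolve("_443._tcp.www.example.com", "TLSA")
    for rdata in answers:
        # Certificate usage, selector, matching type, and the pinned
        # certificate association data a client would verify against.
        print(rdata.usage, rdata.selector, rdata.mtype, rdata.cert.hex())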

It's great in academic theory, but in the real world ISPs, network equipment, and existing software are major hurdles with no easy answers. Look at initiatives like IPv6 or jumbo frames: in each case we all firmly agree that the old standards are holding technology back, yet they're so deeply entrenched that we are barely any closer to deploying the replacements than we were 10 years ago.

I'm pretty convinced that the current internet will have to become completely unreliable before we will take migrations seriously.


RE[2]: Comment by pcunite
by Alfman on Sun 24th Nov 2013 16:16 in reply to "RE: Comment by pcunite"

Kroc,

"The browser vendors too should be blamed. Had Firefox allowed 'untrusted' certificates in the beginning then HTTPS would be standard and on by default for all servers, everywhere."

You are right. Mozilla has a long history of handling HTTPS certificates very poorly (starting with Firefox 3, as I recall, they made the unpopular shift from warning the user about unrecognized certificates to blocking the user almost completely). Their terrible support for self-signed certificates makes it a continuous pain to use HTTPS on embedded devices (where the CA model is completely broken anyway) and even on websites where we cannot justify buying certs.

From a policy point of view, HTTPS connections to unverified peers are no less secure than plain HTTP, and they would have the additional benefit of defeating passive surveillance techniques. Unfortunately, HTTPS implementations such as Mozilla's have precluded the possibility of enabling HTTPS _everywhere_; consequently, many websites that would have enabled HTTPS are left using plain-text HTTP, and we're all much worse off given the widespread instances of wiretapping.
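
One hedged sketch of what self-signed certs plus trust-on-first-use could look like with Python's standard ssl module (the hostname is invented): fetch the peer's certificate once without validating it, store a fingerprint, and compare against it on later connections.

    import hashlib
    import ssl

    # Retrieve the peer's certificate without CA validation, which
    # also works for self-signed certs on embedded devices.
    pem = ssl.get_server_certificate(("device.example.com", 443))
    der = ssl.PEM_cert_to_DER_cert(pem)

    # Pin this fingerprint; a later mismatch suggests a MITM or a
    # legitimately rotated certificate.
    print("sha256 pin:", hashlib.sha256(der).hexdigest())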


RE[3]: Comment by pcunite
by WereCatf on Sun 24th Nov 2013 16:31 in reply to "RE[2]: Comment by pcunite"

"Their terrible support for self-signed certificates makes it a continuous pain to use HTTPS on embedded devices"

I don't know what you're talking about; it works the same on my mobile as it does on the desktop: you get a screen that warns about a non-CA-signed certificate, and then you can either go away or allow that certificate.

"From a policy point of view, HTTPS connections to unverified peers are no less secure than plain HTTP, and they would have the additional benefit of defeating passive surveillance techniques. Unfortunately, HTTPS implementations such as Mozilla's have precluded the possibility of enabling HTTPS _everywhere_;"

I'm going to have to ask: what would you prefer, then? If browsers just automatically accepted all certificates, regardless of where or by whom they were signed, you would immediately render most of the reasons for using HTTPS in the first place moot, as it would be utterly, ridiculously easy to do a MITM and redirect the traffic elsewhere. We would still be wide open to surveillance, no better off than now.
