For the past several years, more than 90% of Chrome users’ navigations have been to HTTPS sites, across all major platforms. Thankfully, that means that most traffic is encrypted and authenticated, and thus safe from network attackers. However, a stubborn 5-10% of traffic has remained on HTTP, allowing attackers to eavesdrop on or change that data. Chrome shows a warning in the address bar when a connection to a site is not secure, but we believe this is insufficient: not only do many people not notice that warning, but by the time someone notices the warning, the damage may already have been done.
We believe that the web should be secure by default. HTTPS-First Mode lets Chrome deliver on exactly that promise, by getting explicit permission from you before connecting to a site insecurely. Our goal is to eventually enable this mode for everyone by default. While the web isn’t quite ready to universally enable HTTPS-First Mode today, we’re announcing several important stepping stones towards that goal.
It’s definitely going to be tough to get those last few percent converted to HTTPS, and given Chrome’s monopolistic influence on the web, any steps it takes will be felt by everyone.
I find Google’s aggressive push towards HTTPS everywhere to be disingenuous and dubiously motivated. Let’s be honest, for a banking website or online store HTTPS is absolutely required, but is it really necessary for a blog that reviews chicken sandwiches or a page about building birdfeeders? Worse still, ignorant users assume that the HTTPS padlock symbol is an integral certificate of safety and legitimacy, as if it applied to the content of the page and the moral character of the site owner. Sure, HTTPS prevents network-level snoopers and hackers from viewing or modifying the page while it’s in transit, but what if the most sinister individual in the whole process is at the creator’s end of the encrypted connection? There’s simply no automated way to protect naive or ignorant people against their own lack of common sense. Google’s heavy-handed insistence on HTTPS, even de-ranking non-compliant pages, smacks of a power grab, in line with Google’s typical corporate ethos of being the self-appointed gatekeeper of Truth and Right on the internet.
Google can’t solve people’s misunderstanding of HTTPS, but HTTPS makes everyone safer in a number of ways in addition to encryption and authentication. Especially with Let’s Encrypt, there are no valid arguments as to why sites should not be HTTPS.
I strongly disagree. HTTPS creates a not-insignificant amount of additional CPU load on busy webservers that might already be running on limited resources. Plus the non-trivial complexity of setting it up and the additional maintenance burden are simply unnecessary for some categories of websites. As I mentioned, there are no security requirements for a website about chicken sandwiches. But above all, the normalization of HTTPS everywhere and the availability of certificates without any validation of the website’s legitimacy make many users conflate encryption with security.
I don’t want to drag out all of the reasons that TLS is great beyond just encryption, but there are a lot of them. All of the arguments against it are pretty silly: the TLS load is insignificant on a sandwich website, and configuration is trivial.
I also believe the CPU overhead has become insignificant. The additional handshaking does add delay over high-latency connections, so maybe one extra round trip’s worth of time to connect. Naturally you’ll also need more packets to account for the TLS framing overhead, but again, not very significant.
The bigger problem I see is HTTPS on appliances. Even those that support HTTPS often get blocked by modern browsers. These devices generally sit on private IPs and cannot use normal certificate authorities, and even if they did, the signed certificates would inevitably expire. Some web appliances let users install their own TLS/SSL certificates, but it’s a painful process for normal users to go through. So even though encryption is warranted for IoT, IMHO the process to get HTTPS working is still too complicated.
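To illustrate why that process defeats normal users: here is roughly what “install your own certificate” means for a device on a private IP (the hostname and 192.168.1.50 address are hypothetical). Modern browsers require a subjectAltName entry, not just a CN, and the result still has to be manually trusted on every client.

```shell
# Hypothetical appliance at 192.168.1.50: generate a self-signed cert
# whose subjectAltName covers both a local name and the private IP
# (browsers ignore the CN field and require a SAN).
openssl req -x509 -newkey rsa:2048 -nodes -days 825 \
  -keyout appliance.key -out appliance.crt \
  -subj "/CN=appliance.local" \
  -addext "subjectAltName=DNS:appliance.local,IP:192.168.1.50"

# Remaining steps are manual, per device AND per client:
#   1. Upload appliance.key / appliance.crt via the device's admin UI.
#   2. Import appliance.crt into the trust store of every OS/browser
#      that will connect -- the part most non-technical users never do.
```

Note that `-addext` requires OpenSSL 1.1.1 or later; older releases need a config-file workaround, which only adds to the pain being described.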
True, but they could certainly do a better job of not actively perpetuating/outright causing that misunderstanding. I do support for a hosting provider, and have lost count of the number of times I’ve had to respond to panicky EMail from customers, and explain that no, that scary-looking “Not secure” warning from Chrome doesn’t actually mean that their site has been compromised, or is even at any serious impending risk of being compromised – and, at MOST, what Chrome is alerting them to is a potential/hypothetical issue, that could result in problems under fairly narrow, specific circumstances (such as the site owner logging into their CMS backend over HTTP on public WiFi). As tiresome, repetitive support requests go, that’s probably second ONLY to reassuring people who received those idiotic “bitcoin sextortion” scam EMails.
Just off the top of my head, I can think of at least one exception/a few variations of the same exception. First, there are a non-trivial number of sites (at least among the ones I host) that were developed before Let’s Encrypt, back in the days when having an SSL certificate usually meant a 4-5x increase in annual hosting costs for a typical small-medium site – or even further back, from the days when SSL was pretty much only used for online shopping & banking sites. Even if you can easily just enable Let’s Encrypt for the site’s hosting, chances are there are going to be “mixed content” warnings – and that’s certainly not limited to older sites; I’ve dealt with sites built in the past 5-10 years with those issues, due to being built by semi-amateur developers who don’t understand basic stuff like the difference between relative & absolute paths/URLs and when to use one vs. the other, etc.
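For anyone unfamiliar with the mixed-content problem being described: it usually comes down to hard-coded `http://` URLs in `src=`/`href=` attributes. A crude sketch of the kind of bulk rewrite a host might do as a stopgap (the filenames are hypothetical, and a real fix should edit the markup or CMS settings rather than regex over HTML):

```shell
# Rewrite hard-coded http:// asset URLs in src= and href= attributes
# to https://, leaving plain-text occurrences of http:// alone.
# This is a blunt instrument, not a substitute for fixing the site.
sed -E 's#(src|href)=("|\x27)http://#\1=\2https://#g' index.html > index.fixed.html
```

A page using relative URLs (`/images/logo.png`) from the start would never have had the problem, which is the “basic stuff” point above.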
And at least in my experience, the organizations finding themselves in that position tend to be the same ones with the least ability/fewest resources to fix that issue. E.g. small non-profits whose websites were built 6 or 7 years ago by a summer student intern, who has since moved halfway across the country, or taken a full-time job in another field, or can’t be reached because they treated their EMail address as disposable and just stopped checking [email protected] when they forgot their password (or all of the above), etc etc.
I agree with you in principle, *all else being equal* there’s no reason for a website not to use HTTPS today (and certainly no excuse not to have “works properly on HTTPS” as a requirement for any new site developed in at least the past 5 years). But what about the situations where all else is *not* equal? Do you seriously think it’s important enough that you’d be willing to go to the local Easter Seals or Meals On Wheels branch, and tell them “sorry, while I know that you rely on having a website, yours doesn’t work on HTTPS and you don’t have the resources to fix it – and even though there’s no real reason your particular site actually needs it, the lack of it offends the sensibilities of some techies, so no online presence for you”?
rahim123,
These are separate issues. Even innocuous websites could leak personal information like email, passwords, etc. So even if it’s less valuable than banking information, there’s still a case to be made for encrypting it.
As for the moral character of website owners, well I’m not sure how a registry for that would even work. There are organizations like the BBB that try to give organizations a grade, and that’s the closest thing I can think of, but things like that can be very subjective.
You’re right that Google’s dominant market position has made them the arbiter of the internet, which has significant consequences. However, in this particular case I think it’s a stretch to call HTTPS by default a “power grab”.
Definitely, that wouldn’t work. There’s no fixing stupid. I don’t think that protecting users from general online threats should be Google’s role unless it’s a service they own and control. If they want to create a browser then make it display the darn webpage, and leave the technical and moral implications to the individual website creators and users.
Crypto by default may have merit, but we need to be frank about where the real barrier lies. Hint: it was never the browsers, but the websites. IMHO that’s where our attention and solutions need to be. It should be possible for websites to be encrypted right out of the box by default, and the rest will follow.
Unfortunately the 3rd-party SSL certificate industry has a financial agenda that has held us back in the long run, and we continue to get roped into this bad model. They have a purpose when it comes to validating actual “identity”, but for “domain validation” we’d certainly be better off without them. Here we are though. Ideally everyone with a domain name would self-publish their own certificates via the domain name system. We could have standard tools that do this securely and by default. It would be even more secure than the domain-validated certificates we have today, a model that adds more parties, more steps, and more expense to the process.
There is Let’s Encrypt, which removes some financial burdens.
https://letsencrypt.org/
Unfortunately though, it’s forced to work within the old (i.e. current) CA system. “Domain validation”, as used by certificate authorities including Let’s Encrypt, is completely unnecessary. We just need better standards for domain owners to prove control without 3rd parties. Crypto isn’t the barrier, pushing new standards is. And of course the CA industry is against anything that makes its business redundant.
TLS certs aren’t expensive, and Let’s Encrypt is pretty simple to set up. Could there be a better system? Sure, but this is what we have right now and it works. Something new will have to be significantly better in order to replace the established incumbents. There really is no conspiracy here, any more than McDonald’s is plotting to shut down Shake Shack. The TLS providers are pretty tame and haven’t thrown their weight around much at all. They’re pretty much all slow, non-innovative companies at their core.
Bill Shooter of Bul
Certificate authorities made sense when they verified the legal identities of corporations and people. But using them for domain verification is really a stupid approach. There was no reason to involve 3rd parties at all; adding parties actually reduces security and increases costs and complexity (whether it’s you or someone else who pays for it). I stand by my point, we’d be better off without domain-validating CAs. Ideally browsers would validate the domain information directly using self-published certificates instead. The crypto works, but we have a standards problem. I could build a better system, but it would take the resources of an entity like Google to actually spur uptake. It would be nice if the tech giants made a push to fix it, but it is what it is I suppose.
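For what it’s worth, a standard along these lines already exists: DANE (RFC 6698) lets a domain owner publish a certificate hash in DNS as a TLSA record, protected by DNSSEC, so clients can validate the certificate with no CA involved. Mainstream browsers never adopted it, which rather proves the uptake point. A rough sketch of what self-publishing would look like, using a throwaway self-signed certificate and example.com as a placeholder:

```shell
# Generate a throwaway self-signed certificate for the hypothetical domain.
openssl req -x509 -newkey rsa:2048 -nodes -days 30 \
  -keyout key.pem -out cert.pem -subj "/CN=example.com"

# Compute the SHA-256 hash of the certificate's public key, i.e. the
# "3 1 1" form of a TLSA record (DANE-EE, SPKI, SHA-256).
HASH=$(openssl x509 -in cert.pem -noout -pubkey \
  | openssl pkey -pubin -outform der \
  | sha256sum | cut -d' ' -f1)

# The DNS record the domain owner would self-publish for HTTPS on port 443:
echo "_443._tcp.example.com. IN TLSA 3 1 1 $HASH"
```

With DNSSEC signing the zone, that record is exactly the “prove control without 3rd parties” mechanism described above; today it sees real use mostly for SMTP, not the web.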
I respectfully disagree. Even if the system is objectively better from a technology standpoint, it doesn’t mean the incumbents will accept it. They are more concerned about their bottom lines than fixing the system, especially when the “fix” eliminates them as unnecessary, which would be the case here.
Sure, but to the extent they’re holding us back I’d say it is a problem.
A good half or more of original websites remain online, if generally abandoned. Firefox has already made using these old sites a pain, requiring me to dig through settings to turn off HTTPS defaults. Safari followed.
Look, HTTPS has its use case for 1% of web use. Banking, taxes, government, etc.
Most anywhere else, it’s useless unless you are freaked out about privacy.