There’s been a lot of concern recently about the Web Environment Integrity proposal, developed by a selection of authors from Google, and apparently being prototyped in Chromium.
There’s good reason for anger here (though I’m not sure yelling at people on GitHub is necessarily the best outlet). This proposal amounts to attestation on the web, limiting access to features or entire sites based on whether the client is approved by a trusted issuer. In practice, that will mean Apple, Microsoft & Google.
Of course, Google isn’t the first to think of this; in fact, they’re not even the first to ship it. Apple already developed and deployed an extremely similar system last year, now integrated into macOS 13, iOS 16, and Safari, called “Private Access Tokens”.
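For the curious, this kind of exchange is visible at the HTTP layer. The sketch below is a minimal, hypothetical parser for the challenge header, assuming the Privacy Pass “PrivateToken” authentication scheme that Private Access Tokens build on; the header values are made-up placeholders, not real tokens.

```python
# Hypothetical sketch of a Private Access Tokens exchange at the HTTP layer.
# A server that wants attestation responds 401 with a WWW-Authenticate
# challenge; the browser asks the OS/attester to mint a token and retries
# with: Authorization: PrivateToken token="<signed token>"

def parse_private_token_challenge(www_authenticate: str) -> dict:
    """Parse a 'PrivateToken' challenge header into its parameters."""
    scheme, _, params = www_authenticate.partition(" ")
    if scheme != "PrivateToken":
        raise ValueError("not a PrivateToken challenge")
    out = {}
    for part in params.split(","):
        key, _, value = part.strip().partition("=")
        out[key] = value.strip('"')
    return out

# Placeholder values for illustration only:
header = 'PrivateToken challenge="abc123", token-key="def456"'
print(parse_private_token_challenge(header))
```

The point is that an ordinary-looking header quietly gates access on whether your device and browser can produce a token from an approved attester.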
Ten bucks says this bad thing Apple is already shipping will get far less attention than a proposal by Google.
Reading the technical document that evil Apple posted doesn’t take long, so don’t take that bet; it’s a waste of money.
The mere existence of an API won’t call any attention to itself. It’s only when prominent websites start requiring it that the public will notice the change. This is bad for alternative/customized browsers, since they could be blocked over time. Sure, hypothetically, if websites ignore these APIs forever, then users will never notice a difference, but once sites like YouTube, Gmail, etc. start requiring them for access, alternatives could be in real trouble.
BTW, I’m not sure if anyone else has noticed this, but Google makes captchas more difficult, and sometimes impossible, for Firefox users. Sometimes when I open websites with a difficult captcha setting, I am forced to give up solving the captcha altogether under Firefox, yet when I switch to Chrome I consistently get in straight away. Google is abusing its control over websites via reCAPTCHA to make the experience worse for Firefox users. Newegg was one such site, and as a customer of theirs blocked from making purchases, I complained vehemently until they fixed it. Of course it suits Google just fine that users have a better experience under Chrome, but I personally think that Google’s self-dealing should be punished with fines under antitrust law. It’s a blatant abuse of their monopoly power.
Alfman,
The CAPTCHA thing might be a side effect. As far as I know, some systems, including Google’s, offer clickless verification:
https://www.engadget.com/2018-10-31-googles-new-recaptcha-doesnt-require-a-click.html
That probably requires browser support, and … surprise … Firefox does not come with the required level of support, so it falls back to a single click, or worse.
(I find all of these bad for the future of the Web, though. Its design was supposed to be independent of any particular visual user agent from day one. In fact, the semantic tags (HTML 4?) were meant to make non-visual parsing/browsing work better. How can a speech-only or text-only interface work if everything requires a JavaScript UI to verify “humanness”?)
sukru,
It’s just upsetting that they applied different security standards to Chrome and Firefox. You could technically be right that it’s an unintentional bug or side effect, although personally I believe it’s intentional, because Google knows what they are doing. Still, because it’s proprietary, I concede that I have no way to prove it.
I mostly avoid sites using reCAPTCHA, but next time, if I remember, I’ll try switching Firefox’s user agent to Chrome’s. Believe it or not, I’ve seen this hack fix access problems before, including on some federal government websites. That’s total BS, because it proves that they were whitelisting and blacklisting certain browsers. This is not only discriminatory against legitimate users, it offers zero protection from actual hackers, who send whatever user agent they feel like.
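That last point is easy to demonstrate: the User-Agent header is whatever the client chooses to send. Here is a minimal sketch using Python’s standard library; the UA string is an illustrative Chrome-style value, not tied to any real browser version.

```python
# Sketch of the user-agent trick described above: any client can present a
# Chrome-looking UA string, so UA-based allow/deny lists prove nothing.
import urllib.request

# Illustrative Chrome-style user-agent string (placeholder version numbers).
CHROME_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
             "AppleWebKit/537.36 (KHTML, like Gecko) "
             "Chrome/115.0.0.0 Safari/537.36")

req = urllib.request.Request(
    "https://example.com/",
    headers={"User-Agent": CHROME_UA},  # spoofed; the server cannot verify it
)
print(req.get_header("User-agent"))
```

A site filtering on this header sees exactly the same string a real Chrome install would send, which is why browser allow-listing via User-Agent blocks legitimate users while stopping nobody malicious.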
I think reCAPTCHA’s days are numbered. AI has already won; sophisticated bots are coming, and nobody can stop them.
With this in mind I feel that the goal should no longer be to block arbitrarily sophisticated bots (and humans that fail to pass a captcha), but instead to block bad behavior. Alas this is still hard to accomplish, but at least the web will be better for it.
Alfman,
I too would prefer an open web; however, it could go either way. One path is going back to a pure HTML model, where AI agents help us (text to speech, summarization, use in the car while driving, etc.).
The other is full-on DRM with attestation, starting with secure boot and allowing access only from signed browsers, like what Netflix does today.
sukru,
Well, you are talking about an open web versus closed web. I too would prefer it be open.
However, when I said that reCAPTCHA’s days are numbered, it wasn’t because of ideological preferences, but simply because “completely automated public Turing tests to tell computers and humans apart” no longer work given the advancement of computers. The objective of captcha mechanisms is no longer feasible, as computers have become better than real humans at solving them. Our personal preferences don’t really come into the equation; the technology is broken either way. Ironically, reCAPTCHA itself was being used to train AI to solve classification problems better than a human, a milestone I suspect they reached several years ago. It’s been of dubious value ever since.
As far as DRM-restricted browsers and devices go, I am strongly of the opinion that owners should control their hardware rather than the other way around. However, I believe that more restrictions are coming for consumer electronics. It will always be done in the name of security, while conveniently yielding ever more control to the corporations building these devices. Just like with Netflix, it seems unlikely to me that DRM will ever be 100% effective against hackers. But it could remain effective against the masses, and that might be all the justification they need.