In 2018, we announced the deprecation and transition of Google URL Shortener because of the changes we’ve seen in how people find content on the internet, and the number of new popular URL shortening services that emerged in that time. This meant that we no longer accepted new URLs to shorten but that we would continue serving existing URLs.
Today, the time has come to turn off the serving portion of Google URL Shortener. Please read on below to understand more about how this will impact you if you’re using Google URL Shortener.
↫ Sumit Chandel and Eldhose Mathokkil Babu
It should cost Google nothing to keep this running for as long as Google exists, and yet, this, too, has to be killed off and buried in the Google Graveyard. We’ll be running into non-resolving Google URL Shortener links for decades to come, both on large, popular websites as well as on obscure forums and small websites. You’ll find a solution to some obscure problem a decade from now, but the links you need will be useless, and you’ll rightfully curse Google for being so utterly petty.
Relying on anything Google that isn’t directly serving its main business – ads – is a recipe for disaster, and will cause headaches down the line. Things like Gmail, YouTube, and Android are most likely fine, but anything else consumer-focused is really a lottery.
goo.gl is also used for generating shared links in Maps and Photos; presumably Google will still use it internally, and only the public shortening service will close. If that is the case, it is a good thing: it increases the security of goo.gl links if they are only generated by Google apps.
Hmm, interesting take. It would make more sense that way because manually updating all their own links would be a massive project. So basically they keep it turned on only for pages that direct to an actual Google (or Google-owned) domain, and it has the side (branding) “benefit” of making it clear that you’re going to a Google-owned page when you click on a goo.gl link.
…so basically what they did with Google Code, keeping the infrastructure around when things like Chrome development depended on it.
Just had a look, and sharing something from Google Maps generates a link on https://maps.app.goo.gl – so probably a lot of that infrastructure is still in use.
Thom’s take is the right take, IMHO. There’s just no reason to get rid of the service – and I have no doubt this will have ripple effects, breaking links in Google’s own blog posts, YouTube video descriptions, and documentation pages, unless they plan to crawl all of that and replace each instance of their own goo.gl links, which would itself be a massive project considering there are any number of backends involved.
It feels like Google these days is one of the worst offenders of losing sight of their customers in the interest of catering to advertisers and “balancing the budget” (sure, this is gonna make that much difference). The strange thing is that normally Google is all about anything that’s a source of analytics – and seeing as URL shorteners act as conduits for all kinds of links between webpages, Google should be receiving valuable data about who is heading where from where (at least assuming the browser is configured to send the Referer header cross-site).
The only clue they give to a possible “real” motivation is the last sentence:
This could be referring to “new and exciting” expansion of existing tools like Google Analytics, with which they have already managed to ensnare over half of the web (source: https://w3techs.com/technologies/details/ta-googleanalytics ). Or it could be referring to some in-development browser API and/or JavaScript tools they have up their sleeve that somehow help them gather more data than the URL shortener ever did (probably under the guise of optimizing the browsing experience by preloading popular links or some such nonsense).
If you can’t tell, I don’t trust Google to have users’ interests at heart anymore. They were once a company I respected and admired, but now they’ve even managed to break their core product and the freaking beating heart of the web itself for the past two decades, Google Search, in the name of more revenue from advertisers, which to me is the last straw. I’m not quite ready to de-Google-ify my life, but at this point it’s less a matter of if than when.
As I thought, they did have some tracking-enhanced alternative tech up their sleeve – in the original blog post they try to push “FDL” (Firebase Dynamic Links) as a replacement. But that’s already deprecated now too, LOL. https://firebase.google.com/support/dynamic-links-faq
This sort of thing is a secondary reason why one of the steps in the CAPTCHA-less spam prefilter on my contact form is a blacklist of known URL shorteners and pastebins.
As hinted at by a Robot9000-related comic that answered “But what if some humans get caught by this?” with “Mission f***ing accomplished”, you can get surprisingly effective at rejecting spam without punishing humans just by making the X in “spell check, grammar check, and X” stand for “stuff I wouldn’t want from a human either”… at least as long as one of those checks is “This is a plaintext form. Reject link markup.”, so the rest of the filters are mostly dealing with spam that’s trying to sell scams to site admins rather than game SEO.
It has worked so surprisingly well that I’ve been able to procrastinate for years on making it stateful and implementing “if a URL from a site not on the ‘trusted to self-moderate well enough to be useless for spam’ list is present, show an hCaptcha”.
…but yeah. “Please use the full, un-shortened forms of URLs.” messages for the win.
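For illustration, a stripped-down sketch of that kind of prefilter step might look something like the following Python (the domain list, patterns, and names here are placeholders I’m making up for this comment, not the actual filter):

```python
import re

# Illustrative blacklist only; a real deployment would maintain a much longer,
# regularly updated list of known URL shorteners and pastebins.
SHORTENER_AND_PASTEBIN_DOMAINS = {
    "goo.gl", "bit.ly", "t.co", "tinyurl.com",
    "pastebin.com", "hastebin.com",
}

# Link markup that has no business in a plaintext contact form.
MARKUP_PATTERNS = [
    re.compile(r"<a\s+[^>]*href=", re.IGNORECASE),   # HTML links
    re.compile(r"\[url[=\]]", re.IGNORECASE),        # BBCode links
]

URL_PATTERN = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)


def prefilter(message: str) -> str:
    """Return 'reject', or 'ok' if the message passes this prefilter step."""
    # A plaintext form should never contain link markup.
    if any(p.search(message) for p in MARKUP_PATTERNS):
        return "reject"

    # Reject messages that lean on shorteners/pastebins to hide their payload.
    # (The stateful version described above would instead return a third
    # verdict, e.g. "captcha", for URLs from sites outside a trusted list.)
    for host in URL_PATTERN.findall(message):
        domain = host.lower().split(":")[0]
        if any(domain == d or domain.endswith("." + d)
               for d in SHORTENER_AND_PASTEBIN_DOMAINS):
            return "reject"

    return "ok"


if __name__ == "__main__":
    print(prefilter("Check out https://goo.gl/abc123"))           # reject
    print(prefilter("[url=http://example.com]buy now[/url]"))     # reject
    print(prefilter("See https://example.com/long/unshortened"))  # ok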
ssokolow (Hey, OSNews, U2F/WebAuthn is broken on Firefox!),
I don’t run public forums, but I agree that URL shorteners are bad practice. People reading a forum should know where a link goes before clicking; obfuscation of links is an antifeature. The reason they were created in the first place was to work around the technical limitations of Twitter and text messages discouraging links. In that context, URL shorteners were a necessary evil.
Regarding pastebins, I would agree they too should be avoided, but the problem is that most forums lack sufficient upload capabilities and are very deficient. Platforms including WordPress leave a whole lot to be desired… barely acceptable at times. If you have anything that needs better formatting or images, then for better or worse pastebins and image uploaders may be the only option. Here on OSNews I personally link to these services.
I can agree this is also bad practice, because it means the links can/will go bad eventually. The problem, though, is that banning them without providing an alternative would place severe constraints on the quality of information that can be conveyed. No images/charts/spreadsheets/code… this limitation is pretty bad for some contexts.
In the end, it’s all about “by definition, there cannot be a turnkey/drop-in anti-spam solution”.
I design my anti-spam around the situation and, in the cases I’ve done, the rule is “It’s gonna be sent to me as a plaintext e-mail either way, so I might as well block spam at the same time as preventing humans from bombarding my eyes with un-interpreted raw HTML or BBCode.”
My rule for pastebins is “If I observe a spammer using it as free hosting to hide the actual body of their spam from the spam filter and the spam is thus made minimal enough that you’d need the spam filter to have human-level intelligence to reject it without the risk of false positives, then it goes on the blacklist”.
Bear in mind that my decision to use the term “spam prefilter” was not made lightly. I designed it as something which will never have a false positive in the use-cases I designed it for, and which can be put in front of some other spam-blocking solution if I run into the limits of what can be achieved with that approach.
Part of that is that it’s meant for contact forms, which means that all the limitations fall away if the sender convinces me that it’s worth it to reply. They can link to whatever they want once the conversation has shifted to e-mail. (So it’s less a general-purpose spam prefilter and more a “cold-calling filter” optimized for the assumption that people may be too shy to check the “allow recipient to reply” box and provide an e-mail address.)
ssokolow (Hey, OSNews, U2F/WebAuthn is broken on Firefox!),
That’s fine if you have authority over the senders and/or you can accept some false positives. You’re referring to the spam problem as it exists specifically for forums. Captchas seem to be the dominant solution for forums. Sufficiently motivated senders can jump through hoops, but it still produces false positives and false negatives. It is what it is.
For me spam is a bigger problem when it comes to emails and phone calls, especially because business contacts can be wrongly flagged by anti-spam features.
Actually, I’m referring to it as it exists for contact forms and blog comments, and the whole point is that I don’t want to punish humans for what bots do. Thus, avoiding the need for CAPTCHAs as much as possible.
For e-mail, I hand out a different forwarding address to each legitimate sender, disable Bayesian filtering, and configure my inbox to send everything received directly to trash. (In essence, instead of having one e-mail address, I have a bunch of revocable API keys which, if I can ever get around to writing a milter and Firefox extension, can be extended to lock themselves to a whitelist of the first domain they receive a message from.)
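If I ever do write that milter, the core of the idea would look something like this Python sketch (the state file, names, and addresses are all made up for illustration; a real milter would hook into the MTA rather than run standalone):

```python
import json
from pathlib import Path

# Illustrative state file mapping each forwarding alias to the first sender
# domain seen using it.
STATE_FILE = Path("alias_locks.json")


def load_locks() -> dict:
    return json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}


def save_locks(locks: dict) -> None:
    STATE_FILE.write_text(json.dumps(locks, indent=2))


def should_accept(alias: str, sender: str) -> bool:
    """Accept mail to `alias` only from the domain that first used it."""
    sender_domain = sender.rsplit("@", 1)[-1].lower()
    locks = load_locks()

    if alias not in locks:
        # First message to this alias: lock it to the sender's domain.
        locks[alias] = sender_domain
        save_locks(locks)
        return True

    return locks[alias] == sender_domain


if __name__ == "__main__":
    print(should_accept("shop-foo@example.net", "orders@shop.example"))    # True, locks the alias
    print(should_accept("shop-foo@example.net", "spam@elsewhere.example"))  # False, leaked alias
```

(Locking on the domain rather than the exact sender address would keep things working when a company mails from several addresses, while still flagging aliases that leak to third parties.)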
ssokolow (Hey, OSNews, U2F/WebAuthn is broken on Firefox!),
Oh, I hate captchas too. And even more so as a paying customer. But given how pervasive spammers are, it’s just one of those antifeatures you sometimes have to tolerate if you want to participate. :(
I use different custom addresses too; it does help with tracking. It’s not perfect, because humans tend to CC emails, sometimes to very large groups. And if you apply to jobs on a job board, they sell resumes to employers (that’s the whole point of their existence), but spammers can/do buy the same data. If you don’t want to block the legitimate emails, you’re kind of stuck not blacklisting the addresses that legitimate senders might use.
Google thrived on the open web, and in the past used its resources to make sure it stayed open.
Some of the projects I remember:
– AMP: “Accelerated Mobile Pages”. When mobile news sites were crawling to a halt under ads, they released a modern JavaScript-based ecosystem to get them back on track
– Web Cache: When websites went down, Google search allowed users to look at the latest retrieved, “cached” versions so that the information would continue to be accessible.
– Chrome: As Internet Explorer became a monoculture and web rendering was slow, they released the fastest web browser as open source
– URL Shortener: Many free shortener services kept shutting down, which meant more broken links on the open Internet. Google offered one to end them all, as a permanent solution.
And others…
And except for Chrome, all the other initiatives have either been shut down, discontinued, or fallen out of favor.
I am not sure having an open and freely accessible Web is a high priority for Google anymore.
Don’t use shorteners. Simple as that.