Neil McAllister raises questions regarding the Web now that it no longer resembles Tim Berners-Lee’s early vision: Is the Web still the Web if you can’t navigate directly to specific content? If the content can’t be indexed and searched? If you can’t view source? In other words, McAllister writes, if today’s RIAs no longer resemble the ‘Web,’ then should we be shoehorning them into the Web’s infrastructure, or is the problem that the client platforms simply aren’t evolving fast enough to meet our needs?
I think Neil McAllister’s article is wrong on so many fronts, and such a disservice to all the innovation on the internet and all the work that goes into extending the web’s capabilities, that I feel compelled to tear it apart bit by bit.
“When Tim Berners-Lee first envisioned the Web in the 1980s, he saw it as primarily an information storage and retrieval system, based on the concept of hypertext.”
Good thing he can speak for what Tim Berners-Lee originally envisioned. *rollseyes* He seems to be claiming implicitly that since TBL’s vision, there have been no good contributions to the development of the web. *bzzt* The great thing is that all these emerging web technologies are complementary, or at the very least do not conflict. I know not everyone’s ideas of the future of the internet are compatible, but those whose ideas are incompatible are largely big, monied interests trying to increase their bottom line.
The great thing about the internet in TBL’s vision, to cite the man himself, is that its openness and equal access in fact engender many novel uses of technology. TBL would not want to see the internet restricted to a few “approved” media types; he believes there is room for all kinds of novelty on the internet. And I quote: “I want to see the explosion of innovations happening out there on the Web, so diverse and so exciting, continue unabated.”
see: http://dig.csail.mit.edu/breadcrumbs/node/144
“The static HTML document is largely a thing of the past. In its place is a diverse range of technologies, each of which falls somewhere along a continuum that spans from the flexibility and openness of Web 1.0, all the way to a closed, binary-only paradigm that’s more akin to traditional desktop software.”
This makes no sense whatsoever. What imaginary “flexibility and openness” of Web 1.0 technology existed then that does not exist now? Also, there is no continuum as Mr. McAllister claims. The reality is that there are a lot of open standards, with a few proprietary ones mixed in, and only for the latter do his claims make a modicum of sense. Honestly, though, Flash is the only mainstream proprietary technology that I can think of. Silverlight might fall into that in-between category of openness and proprietary technology, but for goodness’ sake, Microsoft is assisting the open-source community with an F/OSS implementation.
With regard to Flash: except for web pages that are nothing but a .swf file with no containing HTML document, that HTML file serves as a pointer indicating the contents of the SWF file, and now that Flash is directly indexable, there’s really NO mainstream technology that can’t be indexed by a major search engine.
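To make the “HTML as pointer” idea concrete, here is a generic sketch of such a containing page (the file name and text are invented for illustration, not taken from any real site): the page embeds the SWF, and the fallback content inside the element is plain HTML that a crawler can index even if it never executes the Flash.

```html
<!-- Hypothetical containing page: the <object> element points at the
     SWF, and the fallback markup inside it is plain, indexable HTML. -->
<object type="application/x-shockwave-flash" data="gallery.swf"
        width="640" height="480">
  <param name="movie" value="gallery.swf" />
  <!-- A crawler that doesn't execute Flash still sees this text: -->
  <p>Photo gallery: vacation photos, summer 2008.</p>
</object>
```

This is the same graceful-degradation mechanism that serves text-only browsers: the open HTML wrapper describes what the opaque binary contains.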
Mr. McAllister seems to lament later in the article that some newer web technologies are not device-neutral. I can’t fathom the reasoning behind his complaint. I understand the beef with proprietary technologies like Flash, but the majority of web technologies are open standards and can be implemented on any device that has the hardware to display them. No matter how much you wish, some devices are just going to be impractical for viewing certain types of media. It’s important to allow content creators to find the best medium for expressing what they wish to say, and the web enables all sorts of novel media which couldn’t really exist outside the browser.
For example, you can’t (practically) view PNG files or interact with complex AJAX websites on an Apple IIe. Just ain’t gonna happen. Technology moves on, and no amount of wishing is going to make the web text-only again. Complaining that the web has moved beyond mere text-based pages is akin to complaining that you have to buy a television in order to watch Everybody Loves Raymond, and claiming that the producers of that show are wrong because they opted to present the story in moving-talking-picture form rather than in hardback book form.
The good thing is that thanks to the openness of most web technologies, companies ARE devising ways of indexing what is relevant in a page despite the increasing complexity and web-2.0-iness of newer web sites. Take the example of adding video to the collection of media on the web, and compare it to adding a shelf of movies to a library. The movies are, in fact, indexable by the library without issue, because someone did the work to figure out the best way to make a usable index of movies. Search engines for hypertext didn’t exist at one time either, but as the need arose, they were duly created. I have no doubt that whatever future changes the web brings, there will be a search engine on the heels of those changes to help people find what they’re looking for.
As for the other issue, namely whether you’ll be able to run a program or view an article/movie/picture/other form of media on a given computer, the web, contrary to what Mr. McAllister claims, has vastly improved the situation. Rich Internet Applications should run regardless of operating system or hardware platform, whereas with traditional applications you have to invest in different hardware or a different operating system to run software that wasn’t targeted for your platform. Such a distinction does not (ideally) exist on the internet today, as long as the physical constraints of the hardware allow the program to run well.
The unindexable binary blob is theoretically a huge problem, agreed, but we’ve moved toward more open technologies over the last 10 years as open technologies have caught up with Flash. Any new proprietary RIA environment used in lieu of an open one would have to be far ahead of the open ones, and will eventually become irrelevant as the open technologies catch up. The potential for real damage is small, and there are almost invariably workarounds.
It’s way too late at night to be ranting like this, and I’m going to go to bed to avoid ranting further, so please forgive me if I leave some thoughts unfinished, or some sentences incomplete. Neil McAllister has not convinced me yet of his mediocrity, but if he continues writing articles like this, InfoWorld will no longer have the pleasure of ad revenue from my page views, as long as I know that Mr. McAllister wrote the article.
And on a technical note, he claims that search engines could not extract textual information from PNG or JPEG files. This is patently untrue. Google would be free to implement an OCR engine if they so desired, and the semantic processing of images to enable a computer to recognize what is going on, though still primitive, is improving constantly.
Edited 2008-07-05 09:16 UTC
*claps*
That was more insightful than the actual article.
On a side note, Googlebot started indexing Flash just the other day.
Oh noes… I hope it shows what results were found in gash… I mean, flash animations.
Welcome to a whole new world of Google spam!
Had the W3C not put pressure on browser makers to comply with Web standards, we might still be seeing a lot of “designed for Internet Explorer” websites. To some extent, a fine open format such as Portable Network Graphics never quite picked up because companies kept on using GIF, simply because Internet Explorer would not handle PNG properly.
I definitely agree with the concept of having an independent, non-profit-driven authority set the standards. When profit-driven companies lobby to get “their” technology endorsed, we can never be sure that consumers will end up with the best product.
To me the web is HTTP delivered over TCP. When that changes, I’ll agree that the web has changed. The changes to the *content* of the web are just the natural evolution created by the free market, entrepreneurs, and demand. No technology-based industry stays still for long, and that’s the way I like it.
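That definition is easy to demonstrate. Below is a minimal, self-contained Python sketch (standard library only; the server, port choice, and page content are invented for illustration) that starts a tiny local HTTP server, then speaks HTTP to it by hand over a raw TCP socket, with no browser or HTTP client library involved:

```python
# "The web is HTTP delivered over TCP": start a tiny local HTTP server,
# then talk to it with a hand-written HTTP request over a plain socket.
import socket
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>hello, web</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example's output quiet

server = HTTPServer(("127.0.0.1", 0), Hello)  # port 0 = pick a free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The TCP part: a raw socket. The web part: an HTTP request typed by hand.
with socket.create_connection(("127.0.0.1", port)) as sock:
    sock.sendall(b"GET / HTTP/1.1\r\nHost: 127.0.0.1\r\nConnection: close\r\n\r\n")
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

server.shutdown()
status_line = response.partition(b"\r\n")[0]
print(status_line.decode())  # e.g. "HTTP/1.0 200 OK"
```

Everything above the content layer really is just lines of text over a TCP connection, which is exactly why the underlying web hasn’t changed even as the content has.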
… would be awesome still if the average IQ wasn’t 65 and it wasn’t commercialized out the wazoo.