Linked by mufasa on Mon 10th Aug 2009 12:25 UTC
Web 2.0: The web browser has been the dominant thin client, now rich client, for almost two decades, but can it compete with a new thin client that makes better technical choices and avoids the glacial standards process? I don't think so, as the current web technology stack of HTML/Javascript/Flash has accumulated so many bad decisions over the years that it's ripe for a clean-sheet redesign to wipe it out.
Thread beginning with comment 377921
Missing the point
by ba1l on Mon 10th Aug 2009 16:58 UTC
ba1l
Member since:
2007-09-08

I think the author is missing the point in a few places. For example:

A binary encoding would greatly increase network efficiency, minimizing much of the wasteful uncompressed text sent over the network since I estimate that HTML makes up approximately 5% of network traffic.


We already have an alternative to sending uncompressed text over the network. Compress it. All web browsers (and most other HTTP clients) support gzip and deflate compression, and virtually all web servers support it.

As long as you're encoding the same information, there isn't going to be much difference between a binary format and a compressed text format. Markup compresses extremely well.
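
For illustration, here is a minimal Python sketch (page.html is just a placeholder for any saved web page) showing how much the same gzip algorithm that HTTP already negotiates shrinks markup:

    # Compress a saved HTML page with gzip, the same algorithm browsers and
    # servers negotiate via the Accept-Encoding / Content-Encoding headers.
    import gzip

    with open("page.html", "rb") as f:   # placeholder: any saved HTML page
        raw = f.read()

    compressed = gzip.compress(raw)
    print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes "
          f"({100 * len(compressed) / len(raw):.0f}% of the original)")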

Graphic designers would use GUI tools exclusively to work with this binary format, which works out perfectly as nobody wants to muck around with a markup language like HTML anyway.


Except that writing HTML by hand, especially with a good editor, is extremely fast. Any GUI editor capable of dealing with everything HTML (and XML, for that matter) can do would need to be complex, and any less capable editor isn't really useful.

Just a note - Silverlight uses XAML, which is XML. It has a UI designer, but it's almost useless. It's far simpler, and more productive, to edit the XML directly.

Certain parts, like templates, are easier to edit using tools (like Expression), and designers would be using those, just as designers working on web pages would use a good CSS editor rather than editing the CSS directly.

Flex uses much the same approach - the UI is defined in XML and can be styled separately, using an extended version of CSS, in fact.

Yes, lots of web applications are badly written: they mix functionality, presentation, appearance, and data freely, and are structured in such a way that you can't use the better tools that are available. That doesn't mean they have to be.

The web, as composed of HTTP/HTML/Javascript/Flash today, is a highly inefficient and insecure internet application platform.


Insecure? Only because of legacy crap left over from the days when web browsers didn't bother with security at all. These days, browsers themselves are actually very secure, and they're adding new functionality to make web applications more secure as well.

The reason there are so many security vulnerabilities in most web browsers is that they represent the primary attack surface. Any replacement would have the same problem.

Compare the number of vulnerabilities found in major web browsers to, for example, Flash. Or Acrobat.

Reply Score: 5

RE: Missing the point
by happe on Tue 11th Aug 2009 01:37 in reply to "Missing the point"
happe Member since:
2009-06-09

We already have an alternative to sending uncompressed text over the network. Compress it. All web browsers (and most other HTTP clients) support gzip and deflate compression, and virtually all web servers support it.

As long as you're encoding the same information, there isn't going to be much difference between a binary format and a compressed text format. Markup compresses extremely well.


It's not only a network latency/bandwidth problem. Compression and text parsing also cost CPU time - computers will happily do the work, but not very fast.

Think about what happens when sending compressed HTML/XML:

Sender:
1. Create text in buffer (memory and text creation overhead).
2. Compress the text (cycles and more memory touched).
3. Send data.

Receiver:
4. Receive data.
5. Decompress data.
6. Parse text.

With a binary format you skip steps 2 and 5, and the remaining steps carry less overhead.
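
A toy Python sketch of that pipeline, using a made-up XML document (the numbers it prints are illustrative, not a benchmark of any real web stack):

    # Steps 1-6 above for a synthetic XML document: build the text, compress it,
    # pretend to send it, decompress it, and parse it, timing each stage.
    import gzip
    import time
    import xml.etree.ElementTree as ET

    # 1. Create text in a buffer.
    xml_text = "<items>" + "".join(
        f"<item id='{i}'>value {i}</item>" for i in range(10000)) + "</items>"
    data = xml_text.encode("utf-8")

    # 2. Compress the text (sender).
    t0 = time.perf_counter()
    compressed = gzip.compress(data)
    t1 = time.perf_counter()

    # 3./4. Send and receive (skipped here).

    # 5. Decompress the data (receiver).
    restored = gzip.decompress(compressed)
    t2 = time.perf_counter()

    # 6. Parse the text.
    root = ET.fromstring(restored)
    t3 = time.perf_counter()

    print(f"compress: {t1 - t0:.4f}s  decompress: {t2 - t1:.4f}s  "
          f"parse: {t3 - t2:.4f}s  ({len(root)} elements)")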

Then people shout "interoperability" and "human-readable"!

Interoperability: When computers talk to each other they don't care about the encoding as long as they agree on the format.

Human-readable: This is only needed for debugging, so use a debugging tool! You just need a tool that can transform the binary into a human-readable format.

Imagine if machine code and the TCP/IP headers were text. Would you defend such a design?
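
As a sketch of what such a tool looks like, here are a few lines of Python that decode a made-up 8-byte binary record back into human-readable fields, much as tcpdump decodes binary TCP/IP headers for humans (the record layout is invented for the example):

    # Pack a made-up binary record (2-byte version, 2-byte type, 4-byte timestamp),
    # then "transform the binary into a human-readable format" for debugging.
    import struct

    record = struct.pack("!HHI", 1, 42, 1249951200)
    version, rec_type, timestamp = struct.unpack("!HHI", record)
    print(f"version={version} type={rec_type} timestamp={timestamp}")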

Edited 2009-08-11 01:41 UTC

Reply Parent Score: 1

RE[2]: Missing the point
by ba1l on Tue 11th Aug 2009 02:08 in reply to "RE: Missing the point"
ba1l Member since:
2007-09-08

Why would you not compress the binary format?

A typical HTML page is almost entirely text. Not markup - text. Content. That wouldn't change just because you're using a binary format; the binary format would still contain large amounts of text.

The solution? Compress it!

Then you end up with a compressed text format and a compressed binary format that are about equal in size. Both are easily machine-readable; only one is human-readable. Simply using a binary format is not going to make the tools magically better either, so you're really not going to get better tools than we have now.
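
A rough Python sketch of that comparison, using an invented record layout for the binary side; the point is only that once the shared content is compressed, the two encodings land in the same ballpark:

    # Encode the same made-up records as text markup and as a simple binary
    # format, then gzip both and compare the sizes.
    import gzip
    import struct

    records = [(i, f"value number {i}") for i in range(5000)]

    text = "".join(f"<item id='{i}'>{s}</item>" for i, s in records).encode("utf-8")
    binary = b"".join(struct.pack("!I", i) + s.encode("utf-8") for i, s in records)

    print("gzipped text:  ", len(gzip.compress(text)), "bytes")
    print("gzipped binary:", len(gzip.compress(binary)), "bytes")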

Reply Parent Score: 3