Linked by mufasa on Mon 10th Aug 2009 12:25 UTC
Web 2.0: The web browser has been the dominant thin client, and now rich client, for almost two decades, but can it compete with a new thin client that makes better technical choices and avoids the glacial standards process? I don't think so: the current web technology stack of HTML/JavaScript/Flash has accumulated so many bad decisions over the years that it's ripe for a clean-sheet redesign to wipe it out.
RE: Missing the point
by happe on Tue 11th Aug 2009 01:37 UTC in reply to "Missing the point"

Quoting "Missing the point":

"We already have an alternative to sending uncompressed text over the network. Compress it. All web browsers (and most other HTTP clients) support gzip and deflate compression, and virtually all web servers support it.

As long as you're encoding the same information, there isn't going to be much difference between a binary format and a compressed text format. Markup compresses extremely well."
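The compression claim is easy to check with a short sketch (the markup snippet is made up for illustration; repetitive tag names are exactly what deflate-style compressors eat up):

```python
import gzip

# Hypothetical snippet of repetitive markup: tag and attribute names repeat heavily.
html = ("<ul>" + "<li class='item'>entry</li>" * 100 + "</ul>").encode("utf-8")

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes ({ratio:.1%})")
```

On repetitive markup like this, gzip routinely shrinks the payload by an order of magnitude, so on the wire the size argument for binary largely disappears.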

It's not only a network latency/bandwidth problem: compression and text parsing cost CPU cycles and memory bandwidth. Computers will happily do the work, but not very fast.

Think about what happens when sending compressed HTML/XML:

Sender:
1. Create the text in a buffer (memory and text-creation overhead).
2. Compress the text (cycles and more memory touched).
3. Send the data.

Receiver:
4. Receive the data.
5. Decompress the data.
6. Parse the text.

With binary you skip steps 2 and 5 entirely, and the remaining steps carry less overhead.
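The two pipelines above can be compared in a few lines. This is a sketch, not a real protocol: the record (three 32-bit integers) and the XML shape are hypothetical, chosen only to show that the binary path needs no compress/decompress or parse step:

```python
import gzip
import struct

# Hypothetical record to transmit: three 32-bit integers.
record = (1024, 768, 32)

# Text pipeline: build markup (step 1), then compress it (step 2).
text = "<size><w>%d</w><h>%d</h><depth>%d</depth></size>" % record
payload_text = gzip.compress(text.encode("utf-8"))

# Binary pipeline: a fixed network-order layout, nothing to compress.
payload_bin = struct.pack("!3i", *record)

# Receiver side: the binary payload decodes with one call,
# with no decompression (step 5) and no text parsing (step 6).
assert struct.unpack("!3i", payload_bin) == record

print(len(payload_text), "bytes (compressed text) vs", len(payload_bin), "bytes (binary)")
```

For small payloads like this the compressed text is actually *larger* than the 12-byte binary record (gzip's header and trailer alone cost more), on top of the extra CPU work at both ends.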

Then people shout "interoperability" and "human-readable"!

Interoperability: when computers talk to each other, they don't care whether the encoding is text or binary, as long as both sides agree on the format.

Human-readable: this is only needed for debugging, so use a debugger! You just need a tool that can transform the binary into a human-readable format.
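Such a tool is typically a few lines. A sketch, assuming a hypothetical wire record of three network-order 32-bit integers (width, height, colour depth):

```python
import struct

def dump(payload: bytes) -> str:
    """Render the hypothetical 3-int binary record in human-readable form."""
    w, h, depth = struct.unpack("!3i", payload)
    return f"size: {w}x{h}, depth: {depth}"

# A debugger or packet sniffer would call this on captured bytes.
print(dump(struct.pack("!3i", 1024, 768, 32)))
```

This is exactly what tools like Wireshark do for TCP/IP headers: the wire format stays binary, and readability is a presentation-layer concern.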

Imagine if machine code and the TCP/IP headers were text. Would you defend such a design?

Edited 2009-08-11 01:41 UTC
