Linked by Kroc Camen on Thu 12th Nov 2009 19:30 UTC
Google has created a new HTTP-based protocol, "SPDY" (pronounced "Speedy"), to attack the problem of client-server latency in HTTP. "We want to continue building on the web's tradition of experimentation and optimization, to further support the evolution of websites and browsers. So over the last few months, a few of us here at Google have been experimenting with new ways for web browsers and servers to speak to each other, resulting in a prototype web server and Google Chrome client with SPDY support."
Thread beginning with comment 394432
Applaud and boo, all in one
by deathshadow on Fri 13th Nov 2009 09:25 UTC
deathshadow
Member since:
2005-07-12

In a way I applaud the idea of addressing latency. Handshaking, the round-trip cost of requesting a file, is one of the biggest bottlenecks remaining on the internet, and it can make even the fastest connections seem slow.

To slightly restate and correct what Kroc said, every time you request a file it takes the equivalent of two (or more!) pings to and from the server before you even start receiving data. In the real world that's 200-400ms if you have what's considered a low-latency connection; if you are making a lot of hops between point A and point B, or worse have a connection like dialup or satellite, or are just talking to a server overtaxed with requests, that can climb to a full SECOND per file, regardless of how fast the throughput of your connection is.

Most browsers try to alleviate this by opening multiple concurrent connections to each server - the usual default is eight. Since the file sizes differ there is also some overlap across those eight connections, but if the server is overtaxed many of those requests can be rejected, leaving the browser to wait and retry. As a rule of thumb, the best way to estimate the overhead is: subtract eight (for the parallel connections), reduce to 75% (to account for overlap), and multiply by 200ms for the low estimate or one second for the high.

Take the home page of OSNews for example - 5 documents, 26 images, 2 objects, 17 scripts (what the?!? Lemme guess, jquery ****otry?) and one stylesheet... That's 51 files, so (51 - 8) * 0.75 = 32.25, which we'll round down to 32. 32 * 200ms = 6.4 seconds of handshake overhead on first load on a good day, or 32 * 1s = 32 seconds on a bad day (subsequent pages will be faster thanks to caching).
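That rule of thumb can be sketched in a few lines of JavaScript. This is just the comment's own rough numbers turned into code (the function name is invented, and the fixed constants of eight connections and 75% overlap are the comment's estimates, not a real latency model):

```javascript
// Rough handshake-overhead estimate, per the rule of thumb above:
// subtract the 8 parallel connections, keep 75% (overlap), then
// multiply the remaining serialised requests by the round-trip time.
function handshakeOverheadMs(fileCount, roundTripMs) {
  const serialised = Math.max(0, (fileCount - 8) * 0.75);
  return Math.floor(serialised) * roundTripMs;
}

// The OSNews home page's 51 files:
console.log(handshakeOverheadMs(51, 200));  // good day: 6400 ms
console.log(handshakeOverheadMs(51, 1000)); // bad day: 32000 ms
```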

So these types of optimizations are a good idea... BUT

More of the blame goes in the lap of web developers many of whom frankly are blissfully unaware of this situation, don't give a **** about it, or are just sleazing out websites any old way. Even more blame goes on the recent spate of 'jquery can solve anything' asshattery and the embracing of other scripting and CSS frameworks that do NOT make pages simpler, leaner, or easier to maintain even when they claim to. Jquery, Mootools, YUI, Grid960 - Complete rubbish that bloat out pages, make them HARDER to maintain than if you just took the time to learn to do them PROPERLY, and often defeat the point of even using scripting or CSS in the first place. CSS frameworks are the worst offenders on that, encouraging the use of presentational classes and non-semantic tags - at which point you are using CSS why?

I'm going to use OSNews as an example - no offense, but fair is fair and the majority of websites have these types of issues.

First we have the 26 images - for WHAT? Well, a lot of them are just little .gif icons. Since they are not actually content images, and they degrade badly with CSS off, I'd move them into the CSS and use what's called a sliding-background or sprite system, reducing about fifteen of those images to a single file. (In fact it would reduce some 40 or so images to a single file.) That file would probably be smaller than the current files' combined size, since things like the palette would be shared and you may see better encoding runs. Researching some of the other images, about 22 of those 26 should probably be just one or two images total. Let's say two: that's 20 handshakes removed, aka three to fifteen seconds shaved off first load.
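A minimal sketch of that sliding-background/sprite idea, assuming a hypothetical combined icons.png of 16x16 icons (the class names and offsets are made up for illustration):

```css
/* One hypothetical icons.png holds every little icon; each class
   just slides the shared background image into view. */
.icon {
	display: inline-block;
	width: 16px;
	height: 16px;
	background: url(icons.png) no-repeat;
}
.icon-rss   { background-position:  0     0; }
.icon-print { background-position: -16px  0; }
.icon-email { background-position: -32px  0; }
```

One file, one handshake, one shared palette - however many icons you need.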

As for the scripts, about half of them are advertising (wow, there's advertising here? Sorry - Opera user, I don't see it!), so there's not much optimization to be done there, EXCEPT that it's five or six separate adverts. If people aren't clicking on one, they aren't gonna click on SIX.

But the rest of the scripts? First, take my advice and swing a giant axe at that jquery nonsense. If you are blowing 19k compressed (54k uncompressed) on a scripting library before you even do anything USEFUL with it, you are probably ****ing up. Google analytics? What, you don't have webalizer installed? 90% of the same information can be gleaned from your server logs, and the rest isn't so important that you should slow the page load to a crawl with an extra off-server request and 23k of scripting! There's a ****load of 'scripting for nothing' in there. Hell, apart from the adverts, the only thing I see on the entire site that warrants the use of javascript is the characters-left counter on the post page! (Lemme guess, bought into that 'ajax for reducing bandwidth' asshattery?) Be wary of 'gee ain't it neat' bullshit.

... and on top of all that you come to the file sizes. 209k compressed / 347k uncompressed is probably TWICE as large as the home page needs to be, especially when you've got 23k of CSS. 61k of markup (served as 15k compressed) for only 13k of content, with no content images (they're all presentational) and most of that content being flat text, is a sure sign the markup is fat, bloated, poorly written rubbish - likely with more of 1997 to it than 2009. No offense, I still love the site, even with its poorly thought out fixed metric fonts and fixed-width layout - which I override with an Opera user.js.

You peek under the hood and it becomes fairly obvious where the markup bloat is. An ID on body (since a document can only have one body, what the **** are you using an ID for?), unnecessary spans inside the legend, an unnecessary name on the h1 (you need to point to top and you've got #header RIGHT before it!), an OL nested inside a UL for no good reason (for a dropdown menu I've never seen - lemme guess, scripted and doesn't work in Opera?), an unnecessary wrapping div around the menu and the side section (which honestly I don't think should be a separate UL), those stupid bloated AJAX tabs with no scripting-off degradation, and sidebar lists doped to the gills with unnecessary spans and classes. Just as George Carlin said "Not every ejaculation deserves a name", not every element needs a class.
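For contrast, here is a hypothetical stripped-down version of that kind of menu - one list, no wrapper div, no per-item spans or classes (the links are invented for illustration, not OSNews's actual markup):

```html
<ul id="menu">
	<li><a href="/">News</a></li>
	<li><a href="/forums">Forums</a></li>
	<li><a href="/submit">Submit</a></li>
</ul>
```

Style the UL itself and target the LIs and anchors with descendant selectors (#menu li, #menu a) - no classes required.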

Using MODERN coding techniques and axing a bunch of code that isn't actually doing anything, it should be possible to reduce the total file sizes to about half of what they are now and eliminate almost 75% of the file requests in the process... quadrupling the load speed of the site (and similarly easing the burden on the server!).

So really, do we need a new technology, or do we need better education on how to write a website and less "gee ain't it neat" bullshit? (Like scripting for nothing or using AJAX to "speed things up by doing the exact opposite")

Edited 2009-11-13 09:27 UTC

Reply Score: 4

RE: Applaud and boo, all in one
by kaiwai on Fri 13th Nov 2009 10:09 in reply to "Applaud and boo, all in one"
kaiwai Member since:
2005-07-06

I think what pisses me off the most is that I've made websites where I want, for example, geometric shapes, but I can't do it without resorting to a weird combination of CSS and gif files. Why can't the W3C add even some of the most basic features which would allow one to get rid of large amounts of crap? Heck, if they had a geometric tag which allowed me to create a box with curved corners, I wouldn't need to use the Frankenstein code I use today.

What would be so hard to create:

<shape type="quad" fill-color="#000000" corners="curved" />

Or something like that. There are many things people bolt on with CSS that shouldn't need to be there if the W3C got their act together - yet W3C members have done nothing to improve the situation in the last 5 years except drag their feet on every single advancement put forward, because some jerk-off at a mobile phone company can't be figged upping the specifications in their products to handle the new features. Believe me, I've seen the conversations, and it is amazing how features are held up because of a few nosy wankers holding sway in the meetings.

Reply Parent Score: 2

Kroc Member since:
2005-11-10

"SVG 1.0 became a W3C Recommendation on September 4, 2001" -- Wikipedia.

Reply Parent Score: 1

ba1l Member since:
2007-09-08

While it's hardly simple, SVG was actually intended for exactly this kind of thing. The problem is that only Webkit allows you to use SVG anywhere you'd use an image.

Gecko and Opera allow you to use SVG for the contents of an element only. Internet Explorer doesn't support SVG at all, but allows VML (an ancestor of SVG) to be used in the same way you can use SVG in Gecko and Opera.

So the functionality is there (in the standards) and has been there since 2001. We just aren't able to use it unless we only want to support one browser. Cool if you're writing an iPhone application, but frustrating otherwise.
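For what it's worth, the curved-corner black box asked for above is nearly a one-liner as SVG (the dimensions and corner radius here are arbitrary; rx/ry are the standard rounded-corner attributes on rect):

```xml
<svg xmlns="http://www.w3.org/2000/svg" width="120" height="60">
  <!-- rx/ry give the curved corners; fill is the requested #000000 -->
  <rect width="120" height="60" rx="10" ry="10" fill="#000000"/>
</svg>
```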

As for your specific example, you can do that with CSS, using border-radius. Something like this:

-moz-border-radius: 10px;
-webkit-border-radius: 10px;
border-radius: 10px;

Of course, as with everything added to CSS or HTML since 1999, it doesn't work in Internet Explorer.

Blaming the W3C for everything hardly seems fair, considering that these specs were published almost a decade ago, and remain unimplemented. Besides, there are plenty of other things to blame the W3C for. Not having actually produced any new specs in almost a decade, for example.

Reply Parent Score: 3

cerbie Member since:
2006-01-02

.

Edited 2009-11-13 21:15 UTC

Reply Parent Score: 2

RE: Applaud and boo, all in one
by Kroc on Fri 13th Nov 2009 13:00 in reply to "Applaud and boo, all in one"
Kroc Member since:
2005-11-10

I agree absolutely.

Since Adam already spilled the beans in one of the Conversations, I may as well come out and state what is probably already obvious: There is a new site in the works, I'm coding the front end.

_All_ of your concerns will be addressed.

The OSnews front end code is abysmally bad. Slow, bloated and the CSS is a deathtrap to maintain (the back end (all the database stuff) is very good and easily up to the task).

Whilst we may not see eye to eye on HTML5/CSS3, I too am opposed to wasted resources, unnecessary JavaScript and plain crap coding. My own site adheres to those ideals. Let me state clearly that OSn5 will be _better_ than camendesign.com. I may even be able to impress you (though I doubt that ;) )

Reply Parent Score: 1

RE: Applaud and boo, all in one
by edmnc on Fri 13th Nov 2009 13:33 in reply to "Applaud and boo, all in one"
edmnc Member since:
2006-02-21

Google analytics? What, you don't have webalizer installed? 90% of the same information can be gleaned from your server logs


That there just means you don't use Google Analytics (or don't know how to use it). It is a very powerful piece of software that can't be replaced by analog, webalizer or digging through logfiles.

Reply Parent Score: 1

deathshadow Member since:
2005-07-12

That there just means you don't use Google Analytics (or don't know how to use it). It is a very powerful piece of software that can't be replaced by analog, webalizer or digging through logfiles.

No, it's just that the extra handful of minor bits of information it presents is only of use to people obsessing over tracking instead of concentrating on building content of value - usually making such information only of REAL use to the asshats building websites whose sole purpose is click-through advertising bullshit, or who are participating in glorified marketing scams like affiliate programs... such things having all the business legitimacy of Vector Knives or Amway.

Edited 2009-11-14 15:18 UTC

Reply Parent Score: 3