Developers should reduce the amount of JavaScript on their sites if they want to improve performance, according to one web expert. Some leading websites, including Wikipedia, score poorly on performance and should be overhauled, said Steve Souders, Google's chief performance engineer.

Souders, who was speaking at the Microsoft developer conference TechEd, said that while developers were aware of the need to improve performance, they were tackling the wrong things. They were cutting the number of requests to the web server, shrinking JPEG sizes, or employing a content delivery network vendor like Akamai Technologies, all of which had minimal effect, Souders said.

"We used to tear apart the Apache [web server] code to figure out what Yahoo was doing," said Souders, who was previously at Yahoo.

But after performing a detailed analysis, Souders discovered something startling: Only 10 percent to 20 percent of the time it took to load a website could be attributed to the web server.

In fact, said Souders, the offending code was usually JavaScript, not because JavaScript files on a web page are large but because of the way browsers treat JavaScript.

"The first generation of web browsers decided that because they had to execute all of the JavaScript files in order, we might as well execute one while stopping all other downloads," he said. While a script executed, the browser also prevented any other code from running and kept the rest of the page from rendering.

That may have made sense a decade ago, but in today's era of PCs powered by dual and quad-core CPUs, it doesn't. And the cost of the delays created can be high.
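A minimal HTML sketch of the blocking behavior described above (the file names are hypothetical):

```html
<img src="header.png">          <!-- downloads right away -->
<script src="app.js"></script>  <!-- while app.js downloads and runs,
                                     the browser stops all other
                                     downloads and rendering -->
<img src="photo.png">           <!-- stalled until app.js finishes -->
```

Every script tag placed this way adds its full download-plus-execute time to the point at which the content below it can appear.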

Google has found that a 500-millisecond delay results in a 20 percent decrease in web traffic, while Amazon has seen a 100-millisecond delay cut its sales by 1 percent, Souders said.

New and upcoming web browsers are able to keep downloading files while executing JavaScript. Internet Explorer 8, released last month, has this feature, Souders said, as do the upcoming Firefox 3.5 and Chrome 2.0.

But barring an overhaul of a site's JavaScript, the performance boost will stay small, Souders said.

To fix such problems, Souders first recommends a free tool he created called YSlow, which analyzes a web page and grades how well it is designed for maximum speed. Originally developed for Internet Explorer, YSlow 2.0 is an add-on for Firefox, integrated with the Firebug web development tool.

Using YSlow, users can see how much JavaScript is being loaded up front, creating a bottleneck. Users can then split their JavaScript files, loading only the necessary JavaScript at the start and deferring the rest until after the text and images are already up, he said.
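The deferral technique described above can be sketched in a few lines of JavaScript: inject non-critical scripts only after the page has rendered, so they never block the initial download and render. This is only an illustration, not YSlow's output; the file name is hypothetical, and the `doc` parameter stands in for the browser's `document` so the sketch is self-contained.

```javascript
// Sketch of deferred script loading: a script injected this way
// does not block the downloads or rendering of the initial page.
// `doc` stands in for the browser's `document` object.
function loadDeferredScript(doc, url) {
  var script = doc.createElement('script');
  script.src = url;
  script.async = true; // execute whenever ready, without blocking
  doc.body.appendChild(script);
  return script;
}

// In a browser, this would run from the window's load event,
// after the text and images are already up:
//
//   window.addEventListener('load', function () {
//     loadDeferredScript(document, 'extras.js'); // hypothetical file
//   });
```

The split itself is a judgment call: only the code needed to render the initial page stays in a blocking script tag, and everything else moves behind a loader like this.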

YSlow analyzes 22 criteria in all, and it is unsparing in its grading. Popular websites such as Wikipedia received a "C" from YSlow, while other popular sites earned an even worse "E."

"When I look at it, I feel like the teacher who hands out very severe grades," he said. Search engines with minimal content on the page, such as Google's and Microsoft's, are among the rare sites that get an "A" from YSlow.

There are other tools besides Yslow for diagnosing performance bottlenecks. Microsoft offers the Visual Roundtrip Organizer, while AOL developed a now-open-source tool called PageTest.

All of these tools judge website performance against a set of rules, though none of them checks as many as YSlow's 22 criteria.

Souders' PowerPoint presentation is available online.