Loading a web page is surprisingly complex. Whenever you click a link or type in a URL, the browser has to gather a host of objects: HTML files, JavaScript, images, video files, and more.
Each object is evaluated and then added to the page on the screen. Evaluating an object can force the browser to retrieve other, dependent objects, but the browser does not learn what those dependencies are until the first object has been retrieved. If it knew them ahead of time, it could pull more files in a single retrieval step, reducing the back-and-forth across the network and, in turn, the time it takes to load a page.
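To make the problem concrete, here is a simplified sketch in TypeScript. The file names and helper functions are invented for illustration and are not taken from the researchers' code; the point is only that discovering dependencies as you go costs one round trip per level of the dependency tree, while a browser handed the full list up front can request everything at once:

```typescript
type Url = string;

// Simulated page: each object lists the objects it pulls in.
const deps: Record<Url, Url[]> = {
  "/index.html": ["/app.js", "/style.css"],
  "/app.js": ["/data.json"],
  "/style.css": ["/font.woff"],
  "/data.json": [],
  "/font.woff": [],
};

// Pretend each fetch is one network round trip.
async function fetchObject(url: Url): Promise<Url[]> {
  console.log(`round trip: ${url}`);
  return deps[url] ?? [];
}

// Naive loading: each level of dependencies is discovered only after its
// parent arrives, so a tree of depth N costs N sequential waits.
async function loadByDiscovery(root: Url): Promise<void> {
  let frontier: Url[] = [root];
  while (frontier.length > 0) {
    const results = await Promise.all(frontier.map(fetchObject));
    frontier = results.flat(); // only now do we learn the next level
  }
}

// With the full object list known up front, every request can be issued
// at once, collapsing the sequential levels into a single batch.
async function loadByManifest(allUrls: Url[]): Promise<void> {
  await Promise.all(allUrls.map(fetchObject));
}

// Discovery needs three sequential waits (html, then js/css, then json/font);
// the manifest version issues all five requests at once.
loadByDiscovery("/index.html");
```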
A team from MIT’s Computer Science and Artificial Intelligence Laboratory believes it has created a program that solves this predicament.
“As pages increase in complexity, they often require multiple trips that create delays that really add up,” explains Ravi Netravali, one of the researchers. “Our approach minimizes the number of round trips so that we can substantially speed up a page’s load-time.”
Referred to as “Polaris”, the program logs all the dependencies and interdependencies on a web page. It then compiles this information into a dependency graph that a browser can use to download page elements far more efficiently.
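The press release does not describe Polaris's internal format, but the general idea can be sketched: given a precomputed dependency graph, a scheduler can group downloads into parallel "waves" so that each object waits only on the objects it actually needs. The graph shape and names below are illustrative assumptions, not Polaris's actual code:

```typescript
// Given a precomputed graph (object -> objects it needs first), group
// fetches into parallel waves using Kahn's-algorithm-style scheduling.
function scheduleWaves(graph: Map<string, string[]>): string[][] {
  // Count unresolved prerequisites for each object.
  const pending = new Map<string, number>();
  for (const [node, needs] of graph) pending.set(node, needs.length);

  const waves: string[][] = [];
  while (pending.size > 0) {
    // Everything with no unmet prerequisites can be fetched concurrently.
    const wave = [...pending.keys()].filter((n) => pending.get(n) === 0);
    if (wave.length === 0) throw new Error("cycle in dependency graph");
    waves.push(wave);
    for (const done of wave) {
      pending.delete(done);
      for (const [node, needs] of graph) {
        if (pending.has(node) && needs.includes(done)) {
          pending.set(node, pending.get(node)! - 1);
        }
      }
    }
  }
  return waves;
}

// Illustrative graph: style.css and app.js both need index.html first.
const graph = new Map<string, string[]>([
  ["index.html", []],
  ["style.css", ["index.html"]],
  ["app.js", ["index.html"]],
  ["data.json", ["app.js"]],
]);
console.log(scheduleWaves(graph));
// -> [["index.html"], ["style.css", "app.js"], ["data.json"]]
```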
In a press release announcing the breakthrough, the researchers drew parallels between the program and the work of a travelling salesperson:
When you visit one city, you sometimes discover more cities you have to visit before going home. If someone gave you the entire list of cities ahead of time, you could plan the fastest possible route. Without the list, though, you have to discover new cities as you go, which results in unnecessary zig-zagging between far-away cities…
For a web browser, loading all of a page’s objects is like visiting all of the cities. Polaris effectively gives you a list of all the cities before your trip actually begins.
The code was tested on 200 different websites, including ESPN, Weather.com, and Wikipedia, some of the most complex pages on the web today. On average, it loaded pages 34% faster than a standard browser.
Worth noting: Polaris is written in JavaScript, which means it can be deployed on any website and used with unmodified browsers. It needs only to run on the server hosting the page, where it launches automatically for every page load.
In the near term, the researchers will present their findings at the USENIX Symposium on Networked Systems Design and Implementation. Down the road, they hope to see the code integrated into browsers across the board, where it could “enable additional optimizations that can further accelerate page loads.”
Via MIT