The days of dial-up modems may be gone, but some websites are still surprisingly sluggish. The issue has gained urgency since Google recently changed its search algorithm to reward sites that load faster. The search company says that users spend less time interacting with slower sites, and the problem is compounded by users visiting sites from mobile devices over spotty connections.
Aptimize, a startup based in Wellington, New Zealand, that launches its service in the United States today, says it can cut website load times, delivering speedups of 200 to 400 percent in some cases. It says it achieves these improvements entirely in software.
Ed Robinson, cofounder and CEO, says that companies often improve the speed of a website by throwing hardware at the problem. He contends that the fundamental problem is often the structure of the website itself.
To illustrate this point, Robinson talks about the way a website is loaded. When a user enters a URL, the browser has to carry out a series of tasks to load the page. First, it looks up the server it needs to visit, and then it contacts that server and retrieves the code that describes how the page should look. The browser has to follow the instructions in this code, often contacting the server several more times in order to load resources such as ads or images.
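The sequence Robinson describes can be sketched as a toy model. The timings and resource counts below are hypothetical, chosen only to show why each extra trip to the server adds up; real browsers parallelize and pipeline many of these fetches:

```python
# Toy model of the page-load sequence described above. All numbers
# are hypothetical; real browsers fetch many resources in parallel.
RTT = 0.08  # assumed round-trip time to the server, in seconds

def page_load_round_trips(num_resources):
    """Count the server round trips to load a page: one DNS lookup,
    one fetch of the HTML, then one fetch per embedded resource."""
    dns_lookup = 1
    html_fetch = 1
    return dns_lookup + html_fetch + num_resources

def naive_load_time(num_resources, rtt=RTT):
    """Worst-case serial load time if every fetch waits for the last."""
    return page_load_round_trips(num_resources) * rtt

print(page_load_round_trips(20))  # 22 round trips for a 20-resource page
print(naive_load_time(20))        # about 1.76 seconds at 80 ms per trip
```

Even this crude model shows the leverage point Aptimize targets: cutting the number of round trips matters more than shaving milliseconds off any single one.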
Robinson says that those round-trips to collect resources are a major culprit for slowing down websites. Even over a fast connection with a fast server, these steps take time, and any delays only exacerbate the problem. While it’s possible to design a website in a way that avoids these problems, he says the reality is that many aren’t written that way. Aptimize’s software optimizes for speed without requiring a customer to change anything about how the site is coded, either when it’s installed or in the future.
The software gets into the middle of the page-processing pipeline and makes it more efficient. It combines resources so they only have to be downloaded once. For example, it stitches any images that appear on the page into a mosaic, and sends just one image file to the browser, instructing the browser about how to slice up and display the mosaic.
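The mosaic technique is widely known as CSS spriting. A rough sketch of the bookkeeping involved might look like the following; the image names and sizes are invented for illustration, and this is not Aptimize's actual code:

```python
# Minimal sketch of the image-mosaic idea: pack images side by side
# into one horizontal strip and record the offset of each slice, so
# the page can display the right region of the single combined file.
# A real spriter would also write out the combined image itself.
def sprite_layout(images):
    """images: list of (name, width, height) tuples. Returns a dict
    mapping each image name to the (x, y) offset of its slice."""
    offsets = {}
    x = 0
    for name, width, height in images:
        offsets[name] = (x, 0)
        x += width  # next image starts where this one ends
    return offsets

icons = [("logo.png", 120, 40), ("cart.png", 24, 24), ("star.png", 16, 16)]
layout = sprite_layout(icons)
print(layout["cart.png"])  # (120, 0): cart.png starts 120px into the strip

# In CSS, each element then shows its slice with a negative offset, e.g.:
#   .cart { background: url(sprite.png) -120px 0; width: 24px; height: 24px; }
```

Three images become one download, and the browser does the slicing locally, which is exactly the round-trip savings Robinson describes.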
Websites are often coded so that browsers load the same resources multiple times, for example each time a user returns to the home page. Aptimize’s software identifies website resources that rarely change and tells the browser to cache them for longer than normal (about a year), which also speeds up loading times. And the software has a way of alerting the browser if one of these resources does happen to change. It also performs other optimizations, such as compressing files.
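One common way to make year-long caching safe is to embed a hash of a file's content in its URL, so that any change to the file produces a URL the browser has never cached. The article does not confirm that this is Aptimize's exact mechanism, but the sketch below shows why the approach works:

```python
import hashlib

# Sketch of content-hash versioning (an assumed mechanism, not
# confirmed as Aptimize's): the resource URL embeds a short hash of
# the file's bytes, so the browser can cache it for a year, yet a
# changed file automatically gets a fresh, never-cached URL.
ONE_YEAR = 365 * 24 * 60 * 60  # Cache-Control max-age, in seconds

def versioned_url(path, content):
    digest = hashlib.sha256(content).hexdigest()[:8]
    return f"{path}?v={digest}"

def cache_headers():
    return {"Cache-Control": f"public, max-age={ONE_YEAR}"}

old = versioned_url("/css/site.css", b"body { color: black }")
new = versioned_url("/css/site.css", b"body { color: navy }")
print(old != new)  # True: changed content yields a new URL
```

The server never has to "expire" anything: unchanged files stay cached, and changed files are simply requested under a new name.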
Robinson says that the company currently has about 120 customers, including Microsoft.com.
In upcoming months, he adds, Aptimize plans to focus on doing more to optimize sites for mobile devices. The company will add features such as the ability to detect a mobile device and adjust for its lower screen resolution by sending less information.
Speeding up websites isn’t just important for improving search rankings and performance on mobile devices, says Joe Skorupa, research vice president for data center transformation and security at Gartner Research. In many cases, even a few seconds of delay can cost businesses money. On an e-commerce site, he notes, impatient customers will often leave without completing a sale if they feel a page is taking too long to load. Even for internal corporate websites, lost time can mean lost productivity, he says. If, for example, a Web application used in a call center takes 30 unnecessary seconds per call, that can force a company to hire additional staff.
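Skorupa's call-center example reduces to simple arithmetic; the call volume below is hypothetical, chosen only to show the scale of the waste:

```python
# Back-of-envelope version of the call-center example above.
# The call volume is hypothetical; only the 30-second delay per
# call comes from the example in the text.
calls_per_day = 2000
wasted_seconds_per_call = 30
shift_seconds = 8 * 60 * 60  # one agent's eight-hour shift

wasted_hours = calls_per_day * wasted_seconds_per_call / 3600
extra_agents = calls_per_day * wasted_seconds_per_call / shift_seconds
print(wasted_hours)  # about 16.7 hours of agent time lost per day
print(extra_agents)  # about 2 extra full-time agents needed to absorb it
```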
Skorupa says that the code that’s slowing down websites is often produced by the toolkits that developers use to speed up the coding process. Many companies can’t afford to spend enough time or money on optimizing their own code. So Skorupa believes there’s a lot of opportunity for companies like Aptimize to help optimize applications in new ways. “There is no shortage of bad code,” he says. “We see, unfortunately, little risk that the code coming in the future will be dramatically better than what we have today.”