If you’re in the IT industry, you most likely have a favorite news site or a blog you read regularly. And who could blame you? These sites are a great way to stay on top of the latest industry trends and breaking news. Since we depend so much on these sites, it’s important that they not only stay available but perform well for readers all over the globe.
We took a look at the top 20 IT news sites (according to Alexa) and evaluated them against web performance best practices. Every site’s homepage was monitored with the Constellix Sonar Lite Chrome extension, using its waterfall check to measure page load times. We also used the Chrome developer console to determine page size and number of requests.
You may remember that just a few weeks ago, we analyzed the top shopping websites. This study helped us create a list of the best and worst practices for all kinds of online businesses, not just online retailers. This week, we will be evaluating the top IT news sites for adherence to our list of web performance best practices.
At the top of the list, ZDnet had the fastest page load times and was also the #1 site in Alexa’s top 500 IT news sites. That was unexpected: in our previous study, the fastest website ranked only #40 in popularity. The difference was so surprising that we decided to take an in-depth look at how ZDnet pulled off its chart-topping speeds.
Our first study showed a direct correlation between faster load times and smaller page sizes. ZDnet, however, seemed to be an outlier, with a page a whole MB larger than the average and six times larger than the smallest site we measured.
But that wasn’t the only thing working against ZDnet: the site needed a large number of requests to render the page, 422 in all. That’s well over the average of 291 requests and put ZDnet in fifth place for most requests. The site also took a penalty from YSlow, a web performance extension, which gave it an F for spreading components across four or more domains (each requiring its own DNS lookup) and another F for the number of external requests.
So how did ZDnet come out on top with a large page and over 400 requests? Maybe the number of requests didn’t affect load times at all? We compared page load times to the number of requests, but couldn’t find any trend linking the two. It seemed ZDnet was able to overcome both a heavy page and a high request count by using a CDN.
A CDN, or content delivery network, serves cached copies of a website from dozens of locations around the world, typically using Anycast routing to direct each visitor to the nearest one. The majority of the sites we analyzed used CDNs, which likely explains why load times stayed low even when page sizes were high.
But CDNs can only do so much. Most websites depend on dozens, even hundreds, of requests for third-party hosted resources. Think of it this way: every widget or web font you use is another request, and every new domain you request from requires a DNS lookup to find the server hosting the resource. While each lookup may take only a fraction of a second, they add up when you have hundreds of requests.
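To make that arithmetic concrete, here is a minimal Python sketch of how per-domain DNS lookups accumulate. The domains and timings below are invented for illustration; they are not measured values from our study.

```python
# Hypothetical DNS lookup times (ms) for a page's third-party domains.
# Each unique domain costs one lookup before its resources can load.
dns_lookups_ms = {
    "fonts.example-cdn.com": 28,
    "widgets.example.net": 35,
    "ads.example.org": 41,
    "analytics.example.io": 22,
}

total_ms = sum(dns_lookups_ms.values())
# Each lookup is a fraction of a second, but they add up.
print(f"{len(dns_lookups_ms)} lookups add {total_ms} ms of overhead")
```

Multiply that by the dozens of third-party domains a typical news homepage touches, and DNS overhead alone can approach a meaningful share of total load time.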
Most of the time, we found that sites with more requests tended to have larger pages. But this still didn’t seem to affect page load times (as you can see in the graph above). As we saw earlier, page size seemed to affect load times only about half of the time.
We then looked at request times: the time it took for all of a site’s requests to third-party resources to complete. We saw some pretty high request times, peaking at over one second for Technewsworld. That’s four times longer than the average request time and 152 times longer than the fastest.
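The spread is easier to see with numbers. Below is a short Python sketch computing the same peak-to-average and peak-to-fastest ratios. The request times are illustrative values chosen to mirror the ratios reported above, not our raw measurements.

```python
# Illustrative per-site request times in milliseconds (total time
# spent loading third-party resources), chosen to mirror the ratios
# in the text, not actual study data.
request_times_ms = [7, 90, 130, 160, 190, 210, 250, 293, 1064]

peak = max(request_times_ms)      # 1064 ms, i.e. over one second
fastest = min(request_times_ms)
average = sum(request_times_ms) / len(request_times_ms)

print(f"peak is {peak / average:.0f}x the average")   # 4x
print(f"peak is {peak / fastest:.0f}x the fastest")   # 152x
```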
Nearly a quarter of the domains had unusually high request times, coming in at about half a second. Since our average page load time was about 4 seconds, we figured half a second should have a pretty substantial effect on overall load time.
We found that the websites with longer request times were all over the map, from one of our fastest sites to some of the slowest. Long request times, it seemed, could afflict any website, regardless of how many requests it made or how many best practices it followed.
In almost every case, a single request was responsible for inflating the request time. For Internet.com, a request for a WordPress widget was delayed and took nearly half a second longer to load. That may not seem like much, but it left the site with the second-longest request time and cost it the first-place spot for overall page load speed.
The Big Picture
When it comes to requests, fewer is always better. Think of it this way: every request you add is another risk. If you have the option to host the content on your own web servers, you should.
Limit images on your homepage, including background images. Minimalist design has taken over, and the trend isn’t popular just because these designs look pretty: minimalist websites outperform nearly every other kind of design hands-down. They use almost no images, leaving the visual appeal to white space and lines.
Minify everything you can. One website used an SVG logo at the top of its page that took over one second to load, holding up the rest of the site and accounting for nearly half of the DOM load time. SVGs can be compressed, which can cut load times in half. Another site had a GIF that took 8.3 seconds to load. GIFs may be a great way to show animated content, but they can ruin your web performance; in some cases, embedding a video offers better performance. If you’re determined to use a GIF, lower the frame rate and compress it (free online tools can help).
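As a quick illustration of why compressing SVGs pays off, here is a Python sketch using the standard library’s gzip module. The logo markup is invented; in practice the compression would be applied by your build pipeline or served via the web server’s Content-Encoding header, but the size savings are similar.

```python
import gzip

# A made-up SVG "logo": repeated shapes, like most real vector art.
svg = '<svg xmlns="http://www.w3.org/2000/svg" width="200" height="60">'
for i in range(20):
    svg += f'<circle cx="{10 * i}" cy="30" r="8" fill="#1a73e8"/>'
svg += '</svg>'

raw = svg.encode("utf-8")
compressed = gzip.compress(raw)

# Plain-text XML compresses very well; expect a large reduction.
print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
```

Because SVG is repetitive plain-text XML, gzip routinely shrinks it to well under half its original size, which is bytes your visitors never have to download.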
Limit interactive content on the homepage. Carousels may be a great way to show a slideshow of your breaking news stories, but they can cost you half a second or more of DOM interactive load time. Three of the bottom ten websites we looked at used carousels that impacted load time.
Ads. They are a great way to monetize your site, but they can be extremely costly to your web performance. Among the IT news sites we looked at, we saw a strong correlation between the number of dynamic ads and the number of requests, which in turn increased the risk of longer load times.