It seems like the demand for websites to be performant and load instantly increases every year, and for business owners who need to deliver a fast experience, the stress is real.
The factors that go into a website’s speed are abundant and can get pretty technical, and understanding them is not always as simple as one might hope. A substantial amount of our work is helping clients rehab their website performance, so we thought it would be valuable to break down some of these concepts in detail and explain what they are and how they work.
If you research this topic further, you will find that there are actually hundreds of factors related to site speed, but we are going to cover the five most impactful ones. If you can optimize them for your website, you’ll be off to an excellent start.
Image Compression

Whenever we run initial scans on a slow website to begin diagnosing what issues lie underneath, we almost always immediately see the browser struggling to load large image files.
Image compression refers to reducing an image’s file size, measured in kilobytes or megabytes. This is not to be confused with the dimensions of an image (its spatial size on a screen). You might assume that the larger an image’s dimensions are, the larger its file size is as well, but that isn’t always the case.
Good image compression is the practice of crunching images down in file size to be as small as possible without losing image quality. The smaller an image’s file size is, the more easily (and quickly) it can be downloaded by a web browser. Shaving a few milliseconds off of each image’s load time across an entire website or page can drastically speed up performance.
In order to optimize image file sizes, you will need an image compressor. If you do not have access to design tools like Photoshop, fear not. Our favorite tool for “crunching” and optimizing images is a free web tool called TinyPNG Optimizer, which works with not only .png images, but .jpeg images as well.
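Before compressing anything, it helps to know which files are heavy. The short Python sketch below walks a folder and flags images above a size threshold; the 200 KB cutoff and the extension list are our own assumptions, so tune them for your site.

```python
import os

# Flag image files larger than a threshold so you know what to compress.
# The 200 KB cutoff is an arbitrary assumption; adjust it for your site.
THRESHOLD_BYTES = 200 * 1024
IMAGE_EXTENSIONS = {".png", ".jpg", ".jpeg", ".gif", ".webp"}

def find_heavy_images(root):
    """Return (path, size_in_bytes) pairs for images above the threshold,
    largest first."""
    heavy = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in IMAGE_EXTENSIONS:
                path = os.path.join(dirpath, name)
                size = os.path.getsize(path)
                if size > THRESHOLD_BYTES:
                    heavy.append((path, size))
    return sorted(heavy, key=lambda item: item[1], reverse=True)
```

Run it against your site’s media or uploads directory, then feed the worst offenders through a compressor like TinyPNG.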
Serving Scaled Images
Serving scaled images is a practice that Google PageSpeed Insights favors in its scoring matrix when you scan a site, and for good reason: Google ranks mobile-responsive websites higher in search engine results pages (SERPs). For mobile users, images that are served “at scale” on a website can save many bytes of data, especially for those in poor coverage areas.
The idea behind serving scaled images is pretty straightforward.
Images should be resized to the true dimensions at which they will render in a particular page layout. This is preferable to manipulating an image’s size with HTML size attributes. For example, suppose you have an image that displays at 250px wide on your “About Us” page. Since it renders at that size, its actual size should also be 250px. Our example image would not be considered “scaled” if its true size was 1050px wide and it was being forcefully sized down in HTML like this:
<img src="https://website.com/image_file.jpg" width="250" />
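When one page layout needs several sizes of the same image, the `srcset` and `sizes` attributes let the browser pick the smallest file that fits the layout. The file names and pixel widths below are illustrative examples, not real assets:

```html
<!-- The browser downloads only the smallest candidate that satisfies
     the rendered width; the widths and file names here are examples. -->
<img
  src="https://website.com/image_file-250.jpg"
  srcset="https://website.com/image_file-250.jpg 250w,
          https://website.com/image_file-500.jpg 500w,
          https://website.com/image_file-1050.jpg 1050w"
  sizes="(max-width: 600px) 100vw, 250px"
  alt="About Us photo" />
```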
Managing JavaScript

The two most effective methods of managing JS are minifying it and limiting its presence in above-the-fold content.
Content that is “above the fold” is what an end user sees when a website first loads, before scrolling down or interacting with the page. If large amounts of JS are crammed into this area of a site, web browsers have to load it completely before loading the remainder of the page, which can cause significant drag. A common method we use to avoid this is programmatically deferring JS to the footer of the website.
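In plain HTML, keeping scripts out of the critical rendering path can be as simple as the `defer` attribute; the script URL below is a placeholder:

```html
<!-- "defer" downloads the script in parallel with parsing but runs it
     only after the document has been parsed, so it no longer blocks
     above-the-fold content. -->
<script src="https://website.com/js/site.js" defer></script>

<!-- Alternatively, move render-blocking scripts to just before </body>
     so the visible content is parsed first. -->
```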
Caching

Caching is a computer’s ability to store the output of a computation for a fixed period of time and retrieve that output without re-computing it.
Websites can be cached so that they can be served much faster to visitors while saving server processing power. There are many types of caching methods for websites, but the two most popular are browser caching and server-side caching.
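The core idea can be sketched in a few lines of Python: a toy in-memory cache that stores a computed value for a fixed time-to-live and skips the expensive work on a hit. The class and method names here are illustrative, not from any particular caching library.

```python
import time

class TTLCache:
    """Toy cache: store a computed value for a fixed number of seconds."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def get_or_compute(self, key, compute):
        entry = self.store.get(key)
        now = time.time()
        if entry is not None and entry[1] > now:
            return entry[0]  # cache hit: skip the computation entirely
        value = compute()    # cache miss or expired: do the expensive work
        self.store[key] = (value, now + self.ttl)
        return value
```

Real browser and server caches add eviction, invalidation, and shared storage on top, but the hit/miss logic is the same.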
Browser Caching

Since most people visit their favorite websites with regular frequency, modern browsers can cache previously downloaded pages and resources and serve them to the user.
Normally, when someone visits a web page, the browser sends a request to the server to retrieve the page. The server then gathers all of the components needed to render the page and sends them to the browser to be parsed and rendered. This all takes place in the span of a few seconds, but it still takes up valuable time.
Browser caching skips much of that work. The browser saves a local copy of the page and its resources and serves them to the user almost instantly, without re-downloading everything from the server.
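You tell browsers how long to keep each asset by sending expiry headers. On Apache, one common approach is the `mod_expires` module in an `.htaccess` file; the lifetimes below are illustrative assumptions, so match them to how often each asset type actually changes:

```apache
# Example .htaccess rules using Apache's mod_expires module.
# Lifetimes here are illustrative; long-lived for images, shorter for
# assets that change more often.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png  "access plus 1 year"
  ExpiresByType text/css   "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```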
Server-Side Caching

This method of caching is similar to browser caching, except that the files and resources are cached (stored) on the server instead of in the user’s browser. Instead of the server doing the work of assembling a page’s components every time a request comes in, it can serve a pre-assembled copy of the page instead.
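As one concrete flavor of server-side caching, an nginx reverse proxy can store assembled responses and serve them without hitting the application server. The paths, zone name, upstream address, and lifetimes below are assumptions to adapt, not a drop-in configuration:

```nginx
# Illustrative nginx reverse-proxy cache; paths, zone names, and
# lifetimes are assumptions to adapt for your setup.
proxy_cache_path /var/cache/nginx keys_zone=pagecache:10m max_size=1g;

server {
    listen 80;
    location / {
        proxy_cache pagecache;
        proxy_cache_valid 200 10m;         # keep cached copies of OK pages for 10 minutes
        proxy_pass http://127.0.0.1:8080;  # the app server that builds pages
    }
}
```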
There are many more factors beyond these few that can be tuned to improve a website’s speed. The ones we’ve shared here, though, have proven again and again to deliver the largest improvements on the websites we’ve optimized.
Ever tried to optimize your site with no luck? Tell us how it went.