Find Out All the Ways to Speed Up Site Loading

Website performance is a relatively young but actively developing branch of web development. The importance of site speed for online business is already obvious: it has become one of the factors of competition. That is why it is worth optimizing your site's speed and investing in this area.



Everyone knows that a slow site is bad. A sluggish site creates serious problems for everyday tasks, and sometimes it is simply annoying. Often, slowness amounts to an outage, a denial of service: people do not wait for the page to load and leave. This applies to cases of severe slowdown, for example, when page rendering starts 8-10 seconds after the click.

Even in a relatively favorable situation (fast loading over wired Internet on a modern computer), loading delays can cause audience loss and lower conversion rates. For example, Amazon ran an experiment which showed that every 100 ms (0.1 s) of delay reduces sales by 1%.

Moreover, more than half of today's Internet audience accesses sites from mobile devices, which often means slow network channels and weak processors doing the loading.

The third reason site speed matters is technical. Slow sites typically consume more hosting resources, which leads to extra costs. A slow server side also reduces the site's ability to survive problematic peak loads.

Therefore, site speed should be addressed from both a technical and an economic point of view. In this article we concentrate on the technical side of website acceleration.

 

Website speed: the main components

 

Site speed has two sides: client and server. Today each part contributes comparably to the final result, but each has its own characteristics.

To understand what makes up the loading time of a page, let's walk through the process. This will show where the opportunities for server and client optimization lie.

The full process of loading a site (on a first visit) looks like this:

  • DNS query for the site name.
  • Connecting to the server by IP (TCP connection).
  • Establishing a secure connection when using HTTPS (TLS connection).
  • Requesting the HTML page by URL and waiting for the server (HTTP request).
  • Loading the HTML.
  • Parsing the HTML document in the browser and building the request queue for the document's resources.
  • Loading and parsing CSS styles.
  • Loading and executing JS code.
  • Start of page rendering, with further JS execution.
  • Loading web fonts.
  • Loading images and other elements.
  • Finishing page rendering and executing deferred JS code.

Some of these steps happen in parallel and some may swap places, but the essence remains the same.

Server optimization covers steps 1 through 4 inclusive; steps 5 through 12 belong to client optimization. The time spent at each stage is individual to every site, so you need to collect metrics and identify the main source of problems. That brings us to the question of how to obtain these metrics and how to interpret them.

 

Measuring website speed

 

The main question is what to measure. There are many site speed metrics, but only a few basic ones.

First, time to first byte (TTFB): the time from the start of the loading process to the arrival of the first portion of data from the server. This is the main metric for server optimization.
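
A quick way to sample TTFB from the command line is curl's timing variables. A minimal sketch (example.com is a placeholder):

# time_starttransfer approximates TTFB; the other variables break down the earlier stages.
curl -s -o /dev/null \
     -w 'dns: %{time_namelookup}s  tls: %{time_appconnect}s  ttfb: %{time_starttransfer}s\n' \
     https://example.com/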

Second, the start of page rendering (start render, first paint). This metric shows when the “white screen” period in the browser ends and the page begins to draw.

Third, the loading of the main page elements (load time). This covers loading and interpreting all the resources needed to work with the page; after this mark the page load indicator stops spinning.

Fourth, the full load of the page: the time until the browser's main activity ends, when all primary and deferred resources have loaded.

These basic metrics are measured in seconds. For the third and fourth metrics it is also useful to estimate the amount of traffic, since traffic determines how strongly connection speed affects load time.

Now let's see how to test speed. There are many services and tools for measuring site loading metrics, each better suited to its own task.

One of the most powerful tools is the browser's developer panel, and Chrome's is the most advanced. On the Network tab you can get load-time metrics for every element, including the HTML document itself. Hovering over an item shows how much time was spent on each step of obtaining that resource. To see the full picture of the page load process, use the Performance tab, which gives details down to the time spent decoding images.

If you need to assess site speed without that level of granularity, it is useful to run a site audit (Audits tab), which is carried out by the built-in Lighthouse tool. The report gives a speed assessment for mobile devices (both an integral score in points and our basic metrics) along with several other reports.
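
Lighthouse is also available outside the browser as a command-line tool installed from npm, which is convenient for scripted checks. A minimal sketch (example.com is a placeholder):

npm install -g lighthouse
lighthouse https://example.com/ --only-categories=performance --output=html --output-path=./report.html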

To quickly evaluate client optimization, you can use the Google PageSpeed Insights service or Sitechecker (we use the Google PageSpeed Insights API). Finally, it is useful to analyze load times for real users; the web analytics systems Yandex.Metrica and Google Analytics have special reports for this.

Reasonable targets for load time are: start of rendering at about 1 second, and page load within 3-5 seconds. Within this range users will not complain about speed, and load time will not limit the site's effectiveness. These figures should be achieved for real users, even under the difficult conditions of mobile connections and outdated devices.

 

Server optimization

 

Let's move on to actually speeding up the site. Optimizing the server side is the most familiar and obvious measure for site developers. First, the server side is easy for system administrators to monitor and control. Second, when server response time is seriously degraded, the slowdown is noticeable to everyone, regardless of connection speed or device.

While the causes of server-side slowness can be very diverse, there are typical places to look.

 

Hosting (server resources)

This is the number one cause of slowness for small sites: the hosting simply does not have enough resources (usually CPU and disk speed) for the site's current load. If you can quickly increase these resources, it is worth a try; in some cases that solves the problem. If additional resources start to cost more than optimization work would, move on to the methods below.

 

DBMS (database server)

Here we turn to the source of the problem: slow program code. A web application often spends most of its time on database queries. This is logical, because the job of a web application is to collect data and transform it according to a certain template.

Solving the problem of slow database responses usually has two stages: tuning the DBMS, and optimizing queries and data schemas. Tuning the DBMS (for example, MySQL) can speed things up severalfold if no tuning has been done before; fine tuning beyond that yields gains on the order of ten percent.

Optimizing queries and data schemas is the radical way to accelerate: it can yield speedups of several orders of magnitude. Changes to the database structure can sometimes be made without touching the site's program code, but query optimization will require such intervention.

To identify slow queries, collect statistics on database load over a fairly long period, then analyze the log and pick out candidates for optimization.
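
In MySQL, for example, this starts with the slow query log; the log can then be aggregated with tools such as mysqldumpslow or pt-query-digest. A minimal sketch (the one-second threshold and the orders query are illustrative):

-- Log queries that run longer than one second (tune the threshold to your site).
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;

-- Inspect a suspect query's execution plan to spot missing indexes.
EXPLAIN SELECT * FROM orders WHERE customer_id = 42;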

 

Effect of CMS and program code

It is widely believed that site speed depends only on the CMS (the “engine”), and site owners often try to divide CMSs into fast and slow ones. In reality this is not true.

Of course, server load depends on the code of the CMS in use. However, most popular systems are optimized for speed and should not cause fatal performance problems on their own.

In addition to the core CMS code, however, a site may contain extra modules (plug-ins), extensions, and modifications made by the site's developers, and this code can hurt the site's speed.

Speed problems also arise when a system is misused: for example, a blogging system is used to build a store, or a system meant for small sites is used to develop a portal.

 

Caching

Caching is traditionally the most powerful and universal means of increasing server speed. Here we mean server-side caching, not caching headers. If computing a result (assembling a page or a block) requires significant resources, put the result in a cache and update it periodically. The idea is simple and complex at the same time: caching systems are built into programming languages, content management systems, and web servers.

Typically, page caching reduces page generation time to tens of milliseconds, and a server with cached pages easily survives traffic peaks. There are two problems here: not everything can be cached, and the cache must be invalidated correctly. If these problems are solved, caching can be recommended as an effective means of server acceleration.
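
As one illustration, Nginx can cache whole pages coming from a backend via proxy_cache. A minimal sketch, assuming the site is proxied to a backend on port 8080 and logged-in sessions are carried in a cookie named session:

proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=pages:10m max_size=1g inactive=60m;

server {
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_cache pages;
        proxy_cache_valid 200 5m;           # keep successful pages for five minutes
        proxy_cache_bypass $cookie_session; # never serve cached pages to logged-in users
    }
}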

 

Optimization of TCP, TLS, HTTP/2

In this part we group the low-level network optimizations that speed up the server side. The effect is not as large as with the other methods, but it is achieved purely through configuration, that is, for free.

TCP tuning today is needed only for large projects and servers with 10G+ connections; the main thing to remember is that the network subsystem improves with each new Linux kernel release, so it is worth keeping the kernel updated. Correctly configuring TLS (HTTPS) gives you both a high level of security and a minimal time to establish the secure connection; Mozilla publishes good recommendations.
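
A minimal sketch of such a TLS setup in Nginx, in the spirit of Mozilla's recommendations (the certificate paths are placeholders):

ssl_certificate     /etc/ssl/example.com.crt;
ssl_certificate_key /etc/ssl/example.com.key;
ssl_protocols       TLSv1.2 TLSv1.3;  # drop legacy protocol versions
ssl_session_cache   shared:SSL:10m;   # resume sessions instead of re-handshaking
ssl_session_timeout 1d;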

The new version of the HTTP protocol, HTTP/2, is designed to speed up site loading. The protocol appeared recently and is already actively used (a share of about 20% among websites). HTTP/2 does build in acceleration mechanisms, the main one being request multiplexing, which reduces the effect of network latency on page load time. But acceleration from HTTP/2 is not always achieved, so do not rely on this protocol alone.
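
That said, enabling it costs little. In Nginx it is a one-word change on the HTTPS listener; a minimal sketch (example.com is a placeholder):

server {
    listen 443 ssl http2;
    server_name example.com;
    # ...the ssl_* directives shown above go here
}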

 

Client optimization

 

Unlike server optimization, client optimization targets everything that happens in the user's browser. This makes control harder (devices and browsers differ) and opens up many directions for optimization. We will look at the most effective and universal methods, applicable to almost any project.

 

Critical path optimization: CSS, JS

The critical rendering path is the set of resources needed to start rendering the page in the browser. Typically this list includes the HTML document itself, CSS styles, web fonts, and JS code.

Our task as speed optimizers is to shorten this path both in time (accounting for network latency) and in traffic (accounting for slow connections).

The easiest way to determine the critical path is to run an audit in Chrome (in the developer panel): Lighthouse reports its composition and load time under a simulated slow connection.

The main technique for shrinking the critical path is to remove everything that is not strictly needed or that can be postponed until after the page loads. For example, most JS code can be deferred: place the script call at the end of the HTML document, or use the async (or defer) attribute, as sketched below.
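
A minimal sketch of both variants (the script path is a placeholder):

<!-- async: downloads in parallel and runs as soon as it arrives -->
<script src="/js/app.js" async></script>

<!-- defer: downloads in parallel and runs only after the document is parsed -->
<script src="/js/app.js" defer></script>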

For delayed loading of CSS, you can attach stylesheets dynamically through JS (after waiting for the DOMContentLoaded event).
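
A minimal sketch of that approach (the stylesheet path is a placeholder):

document.addEventListener('DOMContentLoaded', function () {
  // Attach non-critical styles only after the document has been parsed.
  var link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = '/css/non-critical.css';
  document.head.appendChild(link);
});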

 

Optimizing Web Fonts

Using web fonts has become almost a design standard today. Unfortunately, they hurt page rendering speed: web fonts are extra resources that must be fetched before text can be drawn.

The situation is made worse by the fact that pointers to font files are usually buried inside a CSS file, which itself does not arrive instantly. Many developers like to use public web font services (for example, Google Fonts), which adds even more delay (extra connections, an extra CSS file).

The optimization rules boil down to reducing web font traffic and fetching the fonts as early as possible.

To reduce traffic, use modern formats: WOFF2 for modern browsers, WOFF for compatibility. In addition, include only the character sets actually used on the site (for example, Latin and Cyrillic).

To get web fonts displayed quickly, you can use the new link rel="preload" specification and the CSS font-display property. Preload tells the browser as early as possible that a font file will be needed, while font-display gives flexible control over how the browser behaves if the file is delayed (wait, draw a fallback, or give up on the font after three seconds).
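
A minimal sketch combining both techniques (the font name and paths are placeholders):

<link rel="preload" href="/fonts/brand.woff2" as="font" type="font/woff2" crossorigin>

<style>
  @font-face {
    font-family: "Brand";
    src: url("/fonts/brand.woff2") format("woff2"),
         url("/fonts/brand.woff") format("woff");
    font-display: swap; /* draw text in a fallback font until the web font arrives */
  }
</style>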

 

Optimizing images

Images account for most of the weight of a modern site. They are not as critical to the page as CSS and JS code, but for many sites images are an essential part of the content: think of any product card in an online store.

The main technique for optimizing images is to reduce their size. To do this, use the correct format and compression tools:

  • PNG for images with transparency and text;
  • JPEG for photos and complex images;
  • SVG for vector graphics.

In addition to these formats, new ones are being developed: for example, WebP from Google. This format can cover the use cases of both PNG and JPEG: it supports lossy and lossless compression, transparency, and even animation. To use it, create WebP copies of your images and serve them to the browsers that support the format.
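
One way to do the serving is the picture element, which lets the browser pick the first source it supports; a minimal sketch (file names are placeholders):

<picture>
  <source srcset="/img/product.webp" type="image/webp">
  <!-- Browsers without WebP support fall back to the JPEG below -->
  <img src="/img/product.jpg" alt="Product photo">
</picture>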

For PNG there are many optimization utilities that reduce file size, for example OptiPNG, PNGout, and others. Internal recompression of the image data can also be done with ZopfliPNG. The main idea of such software is to pick optimal compression parameters and strip unneeded data from the file. Be careful here: some utilities have a lossy mode, which may not suit you (if you expect the output image to be identical).

JPEG optimization is likewise divided into two types: lossy and lossless. In general, the mozjpeg package from Mozilla can be recommended, as it is designed specifically for better compression of this format. For lossless optimization use jpegtran; for lossy, cjpeg.
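
A minimal sketch of how these utilities are typically invoked (file names are placeholders; double-check which mode is lossy before batch runs):

optipng -o5 image.png                                        # lossless PNG recompression
jpegtran -copy none -optimize -progressive in.jpg > out.jpg  # lossless JPEG optimization
cjpeg -quality 80 -outfile out.jpg in.ppm                    # lossy JPEG encode from a raw PPM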

 

Caching headers

This is the simplest method of client optimization: having the browser cache rarely changing resources, such as images, CSS and JS files, fonts, and sometimes even the HTML document itself. As a result, each resource is requested from the server only once.

If you are using Nginx, just add the directive:

add_header Cache-Control "max-age=31536000, immutable";

From then on, the browser may cache resources for up to a year (which is almost forever). The newer “immutable” parameter tells the browser that the resource will not change at all.

Of course, the question arises: what if we need to change a cached resource? The answer is simple: change its address (URL), for example by adding a version to the file name. This method also works for HTML documents, although there a much shorter caching period is normally used (say, one minute or one hour).
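
In practice the long-lived header is usually scoped to static assets while versions are baked into URLs; a minimal sketch in Nginx (the extension list is illustrative):

location ~* \.(css|js|png|jpg|woff2)$ {
    add_header Cache-Control "max-age=31536000, immutable";
}

A stylesheet is then referenced as, say, /css/app.v42.css; bumping the version changes the URL and forces a fresh download.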

 

Data Compression

Compressing all text data transferred from the server to the browser is mandatory practice. Most web servers have a gzip implementation for compressing responses. However, simply enabling compression is not enough.

First, the compression level is adjustable and should be set close to the maximum.

Second, you can use static compression: pre-compress files and put them on disk, so that the web server finds the compressed version and serves it immediately. Third, you can use more efficient compression algorithms: zopfli (gzip-compatible) and brotli (a new compression algorithm; browsers accept it only over HTTPS). Since these algorithms (especially zopfli) are expensive at compression time, they are always used in the static variant.

To maximize the effect of compression, files are first minified: unnecessary line breaks, spaces, and other redundant characters are cleaned out. This process is specific to each format. You should also take care to compress the site's other text data.
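
A minimal sketch of such a setup in Nginx (gzip_static requires the standard ngx_http_gzip_static_module; brotli needs the third-party ngx_brotli module):

gzip on;
gzip_comp_level 6;   # raise toward 9 if CPU headroom allows
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_static on;      # serve pre-compressed .gz files, e.g. produced with zopfli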

 

Using CDN

 

Using a CDN (content delivery network) for website acceleration is a heavily advertised measure, with a lot of marketing noise surrounding the essence of the technology.

 

Theory: why

CDNs were originally designed to offload the network channels of streaming media sites. For example, several thousand viewers of a live video put a very heavy load on the server's bandwidth. Moreover, ensuring uninterrupted quality over a large distance between client and server is extremely difficult (because of network latency and instability).

The solution was the CDN: a distributed network that clients (for example, viewers) connect to, while the network's hosts in turn connect to the server (the origin). The number of connections to the origin drops to one (or a few), while the number of connections to the CDN can reach millions, thanks to the network caching the content.

Today, most CDNs position themselves as a means of speeding up websites, primarily by reducing the distance from the content to the client (the site visitor).

 

Possible effects

So how can a CDN speed up a site?

First, the user typically connects to the nearest (by access time) network server and gets a fast TCP and TLS connection setup. Then, if the content is already on the CDN server, the user receives it quickly. The load on our own server is reduced as well.

Second, the CDN can do more than pass content through unchanged: it can optimize it on its side and deliver it in a more compact form, compressing images, applying compression to text, and so on. Such optimizations can shorten download time further.

 

Disadvantages of using CDN

The disadvantages, as usual, are the flip side of the advantages: an object may not be in the CDN node's cache, either because it has not been requested yet or because it cannot be cached (an HTML document, say). In that case we incur extra delays between the CDN node and our server.

Although CDNs are designed to speed up access to a site, there are situations where the network route through the CDN turns out less optimal than the direct one.

Finally, content delivery networks are very complex systems in which crashes, instability, and other problems are possible, as anywhere. By using a CDN, we add one more level of complexity.

 

Locking in the result

 

Suppose you have managed to achieve good site speed, and users and owners of the resource are happy. Can you now forget about the issue of speed? Of course not. To keep the site working at consistent quality, you must constantly maintain and monitor it.

 

Acceleration Support

Any live web project is updated regularly: changes happen both in shared templates (design themes, interfaces) and in content, and the program code (client and server alike) changes actively as well.

Each change can affect site speed. To monitor this impact, introduce synthetic speed monitoring at the development stage; that way, speed problems can be intercepted before users notice them.

Optimizing incoming content requires integrating optimization routines into the content management system. First of all, this concerns image processing.

Site acceleration is a very dynamic area: new standards emerge, and browser support for them changes. It is therefore important to regularly audit the project's technology, processes, and software.

 

Monitoring real user speed

Synthetic testing under ideal laboratory conditions is very useful for assessing changes in the code, but it is not enough. In the end, we want the site to be fast for real users. Collecting that data is the job of real user monitoring (RUM), which measures speed on the user's side.

To set up RUM, it is enough to connect one of the web analytics systems (Yandex.Metrica, Google Analytics) and look at its page load time reports. For more detailed and accurate data, you can use specialized speed monitoring services.
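
For a sense of what such monitoring does under the hood, here is a minimal sketch using the browser's Navigation Timing API (the /rum collection endpoint is hypothetical):

window.addEventListener('load', function () {
  // PerformanceNavigationTiming reports millisecond marks relative to navigation start.
  var nav = performance.getEntriesByType('navigation')[0];
  navigator.sendBeacon('/rum', JSON.stringify({
    ttfb: nav.responseStart,
    domContentLoaded: nav.domContentLoadedEventEnd,
    fullLoad: nav.loadEventEnd
  }));
});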

 

Conclusions

Site speed is an extensive topic that touches many aspects of developing and supporting a web application, from server code to content. This means that good results are impossible without involving the development team.

Most important of all: remember the users and account for the varied conditions in which they use the site. Site acceleration is a process that continues, with varying intensity, throughout the project's life cycle.