Beating latency for better performance


The loading speed of websites and many applications is partly determined by a factor called “latency”. In content delivery, the distance between the server and the end user plays a major role: the farther data has to travel, the higher the latency. However, you can take certain measures to lower it. Read on to find out how to deal with latency effectively.
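Distance alone puts a hard floor under latency. As a back-of-the-envelope illustration (assuming signals travel through optical fiber at roughly two thirds of the speed of light, a common rule of thumb rather than an exact figure), a short Python sketch:

```python
# Rough lower bound on round-trip latency from distance alone.
# Assumption: light moves through fiber at about 2/3 of its
# speed in a vacuum -- a rule of thumb, not an exact figure.

SPEED_OF_LIGHT_KM_S = 300_000          # speed of light in vacuum, km/s
FIBER_FACTOR = 2 / 3                   # typical slowdown in optical fiber

def min_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time in ms over a direct fiber path."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000        # there and back, in milliseconds

# New York to London is roughly 5,600 km as the crow flies.
print(f"{min_rtt_ms(5600):.0f} ms")    # ~56 ms before any processing at all
```

Real-world round trips are always slower than this bound, since routing is never a straight line and every hop adds processing time.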

What is latency?

Latency is the time a request needs to travel from the client to a server, be processed, and return. In other words, it is the round-trip time from browser to server. It depends mainly on the physical distance between the user and the server, and partly on network quality. Other factors that contribute to latency are storage delays, router hops, and signal propagation.
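This round trip can be timed directly: send a tiny request, wait for the reply, and measure the elapsed time. A minimal sketch follows; to stay self-contained it talks to a local TCP echo service, but in practice you would point it at a real remote host.

```python
# Sketch: measure round-trip time as (request sent -> reply received).
# The "server" here is a local echo service so the example runs anywhere.
import socket
import threading
import time

def run_echo_server(sock: socket.socket) -> None:
    """Accept one connection and echo whatever it receives."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

def measure_rtt_ms(host: str, port: int) -> float:
    """Time one request/response round trip, in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port)) as conn:
        conn.sendall(b"ping")
        conn.recv(1024)                 # block until the reply arrives
    return (time.perf_counter() - start) * 1000

# Start the echo server on an OS-assigned port, then measure.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=run_echo_server, args=(server,), daemon=True).start()

rtt = measure_rtt_ms("127.0.0.1", port)
print(f"round trip took {rtt:.2f} ms")  # loopback, so typically well under 1 ms
```

Over loopback the result is tiny; against a server on another continent the same measurement would be tens of milliseconds or more.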

How to measure it?

Latency is generally measured in milliseconds, but there are different ways to calculate it. For example, ping shows how long data takes to travel across the network and back to the user. A lower ping is better, because it indicates that the connection is fast and stable. Traceroute shows the time of every hop as data travels to its destination. MTR combines both tools and produces a detailed report that identifies the percentage of packet loss, average latency, and more.
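The kind of per-hop summary MTR prints (loss percentage plus average latency) is simple to compute once you have the raw samples. A sketch with made-up sample data, where `None` stands for a lost packet:

```python
# Sketch of an MTR-style per-hop summary. The hop names and latency
# samples below are invented for illustration; None means the probe
# for that round was lost.

def summarize_hop(samples):
    """Return (loss_percent, avg_latency_ms) for one hop's samples."""
    received = [s for s in samples if s is not None]
    loss = 100 * (len(samples) - len(received)) / len(samples)
    avg = sum(received) / len(received) if received else None
    return loss, avg

hops = {
    "gateway":  [1.2, 1.1, 1.3, 1.2],
    "isp-core": [8.5, None, 9.1, 8.8],      # one lost probe out of four
    "target":   [24.0, 25.2, None, 24.6],
}

for name, samples in hops.items():
    loss, avg = summarize_hop(samples)
    print(f"{name:10s} loss {loss:5.1f}%  avg {avg:.1f} ms")
```

In a real MTR run the loss and latency at the final hop are what matter most; loss at an intermediate hop often just means that router deprioritizes probe replies.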

Latency can’t be eliminated entirely, but, fortunately, you can keep it under control with the following measures:

  1. A CDN service gives the most considerable effect. It caches web content on edge servers, reducing the distance between the server and the end user. A CDN is ideal for those who need better coverage for their customers and quick access from different regions.
  2. HTTP/2 lowers latency by decreasing the number of round trips between receiver and sender thanks to parallelized transfers. Note that browsers support HTTP/2 only over HTTPS, so your website must serve traffic over HTTPS.
  3. Reduce the number of external HTTP requests. If you do link to third-party resources, make sure their infrastructure responds quickly.
  4. Browser caching reduces latency considerably, because it eliminates the need for the browser to make extra requests to the server.
  5. DNS prefetching performs DNS lookups for resources on a page in the background while the user is browsing the website.
  6. Preconnect lets the browser set up a connection (DNS lookup, TCP handshake, TLS negotiation) early, before an HTTP request is sent to the server.
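Items 4–6 above can be sketched together on the server side: a page that carries dns-prefetch and preconnect hints in its markup, served with a long-lived `Cache-Control` header. Everything below (the host name `cdn.example.com`, the one-day `max-age`) is illustrative, not a recommendation:

```python
# Sketch: a tiny HTTP server whose page carries resource hints
# (dns-prefetch, preconnect) and is served with caching headers.
# Host names and max-age are made-up values for illustration.
import http.server
import threading
import urllib.request

PAGE = b"""<!doctype html>
<html><head>
  <link rel="dns-prefetch" href="//cdn.example.com">
  <link rel="preconnect" href="https://cdn.example.com">
</head><body>Hello</body></html>"""

class HintedHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        # Let the browser reuse this response for a day (browser caching).
        self.send_header("Cache-Control", "public, max-age=86400")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):      # keep the demo output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), HintedHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/"
with urllib.request.urlopen(url) as resp:
    cache_header = resp.headers["Cache-Control"]
    body = resp.read()
server.shutdown()

print(cache_header)                    # the caching policy the browser sees
```

With headers like these, a repeat visitor's browser skips the request entirely, and the hints let it resolve and connect to the CDN host before the first asset is even requested.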

With these methods you will reduce latency and provide a better user experience for your visitors regardless of their location.



