What is Latency

Latency Defined

Many people have likely heard the term latency before, but what is latency exactly? Network latency can be defined as the time it takes for a request to travel from the sender to the receiver and for the receiver to process that request. In other words, it is the round trip time from the browser to the server. Ideally, this time stays as close to zero as possible; however, a few factors can prevent your website’s latency from remaining low.

Causes of Network Latency

Now that the question of what latency is has been answered, where does latency come from? There are four main causes that can affect network latency times, listed below (a rough sketch of how these delays add up follows the list):

  • Transmission mediums such as WAN links or fibre optic cables all have limitations and can affect latency simply due to their nature.
  • Propagation is the amount of time it takes for a packet to travel from one source to another (at roughly the speed of light in the medium).
  • Routers take time to analyze the header information of a packet as well as, in some cases, add additional information. Each hop a packet takes from router to router increases the latency time.
  • Storage Delays can occur when a packet is stored or accessed resulting in a delay caused by intermediate devices like switches and bridges.
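
To make these four components a bit more concrete, here is a rough back-of-the-envelope sketch in Python. Every input (packet size, link speed, distance, hop count, and the per-hop processing and queuing figures) is an illustrative assumption rather than a measured value.

```python
# Rough one-way latency estimate built from the four classic delay components.
# Every input below is an illustrative assumption, not a measured value.

SPEED_IN_FIBRE_KM_PER_S = 204_000  # roughly two thirds of the speed of light in a vacuum

def one_way_latency_ms(packet_bytes: int, link_mbps: float, distance_km: float,
                       hops: int, per_hop_processing_ms: float,
                       per_hop_queuing_ms: float) -> float:
    transmission = packet_bytes * 8 / (link_mbps * 1_000_000) * 1000  # serializing bits onto the medium
    propagation = distance_km / SPEED_IN_FIBRE_KM_PER_S * 1000        # travel time through the medium
    processing = hops * per_hop_processing_ms                         # routers inspecting packet headers
    queuing = hops * per_hop_queuing_ms                               # time spent waiting in device buffers
    return transmission + propagation + processing + queuing

# Example: a 1500 byte packet on a 100 Mbps link travelling 2000 km across 12 hops.
print(round(one_way_latency_ms(1500, 100, 2000, 12, 0.05, 0.2), 2), "ms")
```

With these example numbers, propagation dominates, which is why distance to the server matters so much in practice.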

Ways to Reduce Latency

Latency can be reduced using a few different techniques, as described below. Reducing server latency will help your web resources load faster, thus improving the overall page load time for your visitors.

  1. HTTP/2: The use of HTTP/2 is a great way to help minimize latency. HTTP/2 reduces server latency by minimizing the number of round trips from the sender to the receiver and by parallelizing transfers over a single connection. KeyCDN proudly offers HTTP/2 support to customers across all of our edge servers.
  2. Fewer External HTTP requests: Reducing the number of HTTP requests not only applies to images but also to other external resources such as CSS or JS files. If you are referencing information from a server other than your own, you are making an external HTTP request, which can greatly increase website latency depending on the speed and quality of the third party server.
  3. Using a CDN: Using a CDN helps bring resources closer to the user by caching them in multiple locations around the world. Once those resources are cached, a user’s request only needs to travel to the closest Point of Presence to retrieve that data instead of going back to the origin server each time.
  4. Using Prefetching Methods: Prefetching web resources doesn’t necessarily reduce the amount of latency per se; however, it improves your website’s perceived performance. With prefetching implemented, latency-intensive processes take place in the background while the user is browsing a particular page. Therefore, when they click on a subsequent page, jobs such as DNS lookups have already taken place, thus loading the page faster.
  5. Browser Caching: Another type of caching that can be used to reduce latency is browser caching. Browsers will cache certain resources of a website locally in order to help improve latency times and decrease the number of requests back to the server (a minimal sketch of setting cache headers follows this list). Read more about browser caching and the various directives that exist in our cache-control article.
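
As a companion to the browser caching point above, the following is a minimal sketch of a server emitting a Cache-Control header using Python’s built-in http.server module. The one-day max-age value and the response body are purely illustrative; real sites typically configure these headers in the web server or CDN rather than in application code.

```python
# Minimal sketch: serve a static asset with a Cache-Control header so browsers
# can reuse it locally instead of repeating the request to the server.
from http.server import BaseHTTPRequestHandler, HTTPServer

class CachingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"/* pretend this is a static CSS file */"
        self.send_response(200)
        self.send_header("Content-Type", "text/css")
        # Allow any cache to store the response and reuse it for one day.
        self.send_header("Cache-Control", "public, max-age=86400")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Illustrative local server; production sites usually set these headers
    # in the web server or CDN configuration rather than in application code.
    HTTPServer(("localhost", 8080), CachingHandler).serve_forever()
```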

Other Types of Latency

Latency occurs in many different environments including audio, networks, operations, etc. The following describes two additional scenarios where latency is also prevalent.

Fibre Optic Latency

Latency in the case of data transfer through fibre optic cables can’t be fully explained without first discussing the speed of light and how it relates to latency. Based on the speed of light alone (299,792,458 meters/second), there is a latency of roughly 3.33 microseconds (a microsecond being one millionth of a second) for every kilometer of path covered. Light travels more slowly in a cable, which means the latency of light traveling through a fibre optic cable is around 4.9 microseconds per kilometer.

Based on how far a packet must travel, the amount of latency can quickly add up. Cable imperfections can also degrade the connection and increase the amount of latency incurred by a fibre optic cable.
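
As a quick worked example, the per-kilometer figure above can be turned into a rough round trip estimate. The cable length used below is an illustrative assumption; real cable paths are longer than the straight-line distance and add delays from repeaters and other equipment.

```python
# Rough fibre propagation estimate using the ~4.9 microseconds-per-kilometer figure.
FIBRE_MICROSECONDS_PER_KM = 4.9  # light in glass, ignoring repeaters and routing gear

def fibre_rtt_ms(cable_km: float) -> float:
    one_way_ms = cable_km * FIBRE_MICROSECONDS_PER_KM / 1000
    return 2 * one_way_ms  # a round trip covers the path twice

# Example: an assumed 6,000 km cable path costs roughly 58.8 ms in propagation alone.
print(round(fibre_rtt_ms(6000), 1), "ms")
```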

Audio Latency

This form of latency is the time difference between a sound being created and being heard. The speed of sound plays a role in this form of latency and can vary based on the environment it travels through, e.g. solids vs. liquids. In technology, audio latency can come from various sources including analog-to-digital conversion, signal processing, and the hardware/software used.

How to Test for Network Latency?

Network latency can be tested using Ping, Traceroute, or MTR (essentially a combination of Ping and Traceroute). Each of these tools is able to determine specific latency times, with MTR providing the most detail.

The use of MTR allows a user to generate a report that will list each hop in a network that was required for a packet to travel from point A to point B. The report will include details such as Loss%, Average latency, etc. See our traceroute command article to learn more about MTR and traceroute.
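
For a quick programmatic check alongside those tools, the sketch below times a TCP handshake from Python. This only approximates round trip time and is not a substitute for Ping or MTR; the target host and port are illustrative assumptions.

```python
# Rough round trip estimate by timing a TCP handshake; an approximation only,
# not a replacement for Ping or MTR.
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, attempts: int = 5) -> float:
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass  # the handshake completing is our rough round trip marker
        samples.append((time.perf_counter() - start) * 1000)
    return min(samples)  # the minimum sample is least affected by transient queuing

if __name__ == "__main__":
    # Illustrative target host; replace with the server you want to measure.
    print(f"~{tcp_rtt_ms('example.com'):.1f} ms to example.com:443")
```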

What is Latency – In Summary

This article has hopefully answered the question of what latency is and given readers a better understanding of what causes it. Latency is an inevitable part of today’s networking ecosystem and is something we can minimize, but not completely eliminate. However, the suggestions mentioned above are important steps in reducing your website’s latency and helping to improve page load times for your users. After all, in today’s internet age, website speed comes down to milliseconds and can be worth millions of dollars in gained or lost profits.
