What is Latency?

High latency is bad news for your network

The term latency refers to several kinds of delays typically incurred in the processing of network data. A low-latency network connection experiences small delay times, while a high-latency connection experiences long delays.

Besides propagation delay (the time a signal takes to cross the physical distance of a link), latency also includes transmission delay (a function of packet size and link bandwidth) and processing delay (time spent at devices along the path, such as proxy servers and the routers handling each network hop on the internet).
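As a rough illustration (the link speed, distance, and per-hop figures below are assumed for the example, not taken from the article), the three components can simply be added up for one packet crossing a hypothetical link:

```python
# Sketch: summing the delay components for one packet over a hypothetical link.
PACKET_BITS = 1500 * 8            # a full-size Ethernet frame
LINK_BPS = 100_000_000            # assumed 100 Mbps link
DISTANCE_M = 1_000_000            # assumed 1,000 km of fiber
SIGNAL_SPEED_M_S = 200_000_000    # ~2/3 the speed of light, typical in glass

transmission_ms = PACKET_BITS / LINK_BPS * 1000        # time to put bits on the wire
propagation_ms = DISTANCE_M / SIGNAL_SPEED_M_S * 1000  # time for the signal to travel
processing_ms = 0.5               # assumed total delay at routers along the path

total_ms = transmission_ms + propagation_ms + processing_ms
print(round(total_ms, 2))  # 5.62
```

Note that on this hypothetical link, propagation dominates: faster hardware can shrink the other terms, but not the distance.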

Latency and Network Speed

Network speed and performance are usually understood in terms of bandwidth, but latency is the other key element. The average person is more familiar with bandwidth because that's the metric manufacturers of network equipment typically advertise. Still, latency matters equally to the end-user experience. In slang terms, the word lag often refers to poor, high-latency performance on a network.

Latency Versus Throughput

On DSL and cable internet connections, latencies of less than 100 milliseconds (ms) are typical, and less than 25 ms is often possible. With satellite internet connections, on the other hand, typical latencies can be 500 ms or higher.

Excessive latency creates bottlenecks that prevent data from filling the network pipe, thus decreasing throughput and limiting the maximum effective bandwidth of a connection. The impact of latency on network throughput can be temporary (lasting a few seconds) or persistent (constant), depending on the source of the delays.

Although the theoretical peak bandwidth of a network connection is fixed according to the technology used, the actual amount of data that flows over the network (called throughput) varies over time and is affected by higher and lower latencies.

Latency of Internet Services

An internet service rated at 100 Mbps can perform noticeably worse than a service rated at 20 Mbps if it is running with high latency.
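To see why, consider a single TCP connection with a fixed window of unacknowledged data in flight: it can send at most one window per round trip, so the round-trip time directly caps throughput no matter what the line is rated. The sketch below is an illustration with an assumed 64 KiB window (a common default without window scaling), not a figure from the article:

```python
# Illustrative sketch: how round-trip latency caps single-connection TCP throughput.
WINDOW_BYTES = 64 * 1024  # assumed 64 KiB of unacknowledged data in flight

def max_throughput_mbps(rtt_ms: float) -> float:
    """Upper bound on throughput: one full window delivered per round trip."""
    rtt_s = rtt_ms / 1000
    return WINDOW_BYTES * 8 / rtt_s / 1_000_000  # bits per second -> Mbps

# A 25 ms cable connection vs. a 500 ms satellite connection:
print(round(max_throughput_mbps(25), 1))   # ~21.0 Mbps
print(round(max_throughput_mbps(500), 1))  # ~1.0 Mbps
```

Under these assumptions, the 500 ms link tops out near 1 Mbps for that connection even if the service is rated at 100 Mbps.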

Satellite internet service illustrates the difference between latency and bandwidth on computer networks. Satellite possesses both high bandwidth and high latency. When loading a web page, for example, most satellite users observe a noticeable delay from when they enter the address to the time the page begins loading.

This high latency is due primarily to propagation delay as the request message travels at the speed of light to the distant satellite station and back to the home network. However, once the messages arrive on Earth, the page loads quickly, like on other high-bandwidth internet connections (such as DSL and cable internet).
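That propagation delay can be roughed out from first principles. The sketch below assumes a geostationary satellite at roughly 35,786 km altitude and signals moving at the speed of light; the request climbs to the satellite and back down, and the reply does the same, for four legs in total:

```python
# Back-of-the-envelope propagation delay for geostationary satellite internet.
SPEED_OF_LIGHT_KM_S = 299_792   # km per second (vacuum)
GEO_ALTITUDE_KM = 35_786        # approximate geostationary orbit altitude

# Request: user -> satellite -> ground station; reply: the reverse. Four legs.
legs = 4
one_leg_s = GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S
round_trip_ms = legs * one_leg_s * 1000
print(f"{round_trip_ms:.0f} ms")  # roughly 477 ms before any other delays
```

Adding transmission and processing delays on top of this floor is what pushes typical satellite latencies to 500 ms or higher.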

Software and Device Latency

WAN latency occurs when the network is busy handling so much traffic that its hardware can't service every request at full speed, so the remaining requests are delayed. This affects the wired local network, too, because the whole network operates together as one system.

An error or other problem with the hardware can increase the time it takes for the hardware to read the data, which is another reason for latency. This may be the case for the network hardware or the device hardware, like a slow hard drive that takes time to store or retrieve data.

The software running on the system can cause latency, too. Some antivirus programs inspect all the data that flows in and out of the computer, which is why some protected computers are slower than their unprotected counterparts. The inspected data is often disassembled and scanned before it's passed along.

Measuring Network Latency

Network tools such as ping tests and traceroute measure latency by determining the time it takes a given network packet to travel from source to destination and back, called the round-trip time. Round-trip time is the most common measurement of latency. Quality of service (QoS) features of home and business networks are designed to manage both bandwidth and latency to provide more consistent performance.
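The summary line that ping prints at the end of a run (minimum, average, and maximum round-trip time, plus the variation between samples, often called jitter) is just simple statistics over the measured round trips. The sketch below computes those figures from a handful of invented sample values:

```python
# Sketch: summarizing round-trip times the way ping's closing report does.
from statistics import mean, stdev

rtt_ms = [24.1, 25.3, 23.8, 26.0, 24.7]  # hypothetical samples from five pings

summary = {
    "min": min(rtt_ms),
    "avg": round(mean(rtt_ms), 2),
    "max": max(rtt_ms),
    "jitter": round(stdev(rtt_ms), 2),  # variation between samples
}
print(summary)
```

A connection with a low average but high jitter can still feel laggy for real-time uses such as gaming and video calls, which is why both numbers matter.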
