Technology·2 min·Updated Mar 9, 2026

What is Latency?

Latency in Networking

Quick Answer

Latency is the time it takes for data to travel from one point to another in a network. It is usually measured in milliseconds and affects how quickly information is received and processed.

Overview

Latency refers to the delay before a transfer of data begins following an instruction for its transfer. In networking, it is the time it takes for a data packet to travel from the sender to the receiver. Latency can be caused by various factors, including the distance data must travel, the speed of the network, and the processing time at each device along the route.

Understanding how latency works is crucial for optimizing network performance. For instance, when you play an online game, high latency can cause delays between your actions and the game's response, leading to a frustrating experience. This delay can be influenced by the type of connection you have, such as fiber optics versus traditional copper wires, as well as the number of devices using the network at the same time.

Latency matters because it directly impacts the quality of online activities, from video streaming to video calls. Lower latency means a smoother experience, while higher latency can result in buffering or lag. For example, a video call with a latency of 20 milliseconds will feel much more natural than one with a latency of 200 milliseconds, where pauses and delays can disrupt the conversation.
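One rough way to observe latency yourself is to time how long a TCP connection takes to open to a server. The sketch below, using only Python's standard library, measures that connect time in milliseconds; the function name and the default port are illustrative choices, not part of any particular tool.

```python
import socket
import time

def tcp_connect_latency(host: str, port: int = 443, samples: int = 5) -> float:
    """Average TCP connect time to (host, port) in milliseconds.

    Connect time is a rough proxy for network latency: it includes the
    round trip needed to complete the TCP handshake.
    """
    times_ms = []
    for _ in range(samples):
        start = time.perf_counter()
        # Open and immediately close a connection, timing only the setup.
        with socket.create_connection((host, port), timeout=3):
            pass
        times_ms.append((time.perf_counter() - start) * 1000)
    return sum(times_ms) / len(times_ms)

# Example (requires network access):
# print(f"{tcp_connect_latency('example.com'):.1f} ms")
```

Note that this measures the handshake round trip, not one-way delay, and results will vary with network congestion, so averaging several samples gives a steadier number.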


Frequently Asked Questions

What causes high latency?

High latency can be caused by several factors, including long distances between devices, network congestion, and slow routing of data. The type of internet connection and the hardware used can also contribute to increased latency.

How can I reduce latency?

To reduce latency, try using a wired connection instead of Wi-Fi, which often has lower latency. Upgrading your internet plan, optimizing your router settings, and minimizing the number of devices connected to the network can also help.

Is latency the same as bandwidth?

No, latency and bandwidth are different concepts. Latency refers to the delay in data transmission, while bandwidth is the maximum amount of data that can be transmitted over a network in a given time. High bandwidth can help transfer data faster, but if latency is high, it can still slow down the overall experience.
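The latency-versus-bandwidth distinction can be made concrete with a little arithmetic: total transfer time is roughly the one-way latency plus the time needed to push the bits through the link. The sketch below models that; the function name and the example numbers are illustrative assumptions.

```python
def transfer_time_ms(payload_bytes: float, bandwidth_mbps: float, latency_ms: float) -> float:
    """Approximate one-way transfer time: latency plus serialization time.

    Serialization time is how long it takes to push the payload's bits
    onto the link at the given bandwidth.
    """
    serialization_ms = payload_bytes * 8 / (bandwidth_mbps * 1_000_000) * 1000
    return latency_ms + serialization_ms

# A small 10 KB request is dominated by latency, not bandwidth:
# at 100 Mbps with 200 ms latency, it takes about 200.8 ms;
# at the same 100 Mbps with 20 ms latency, only about 20.8 ms.
```

This is why a high-bandwidth connection with poor latency can still feel sluggish for small, chatty workloads like gaming or video calls: shaving latency helps far more than adding bandwidth.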