Internet Latency
Internet latency is the time it takes for a client, such as a mobile phone, to get an initial response from an internet service such as a website. Internet latency impacts the speed of messaging, voice over IP, email, websites and some mobile apps.
Low Latency
Low latency describes a fast response time. Generally speaking, a latency of less than 10 milliseconds is considered low.

Internet Backbone
An internet backbone is a high-capacity data route on the internet that connects different networks. Generally speaking, you will have lower latency if you are close to one or more internet backbones. This is a common consideration in the selection of locations for data centers.
Packet Loss
Packet loss occurs when a message between you and a host is lost in transit. Networks experiencing problems may drop some or all of your messages. This may appear to the user as high latency: software often retransmits failed packets automatically, but things become slow.
Ping
Ping is a tool available on many computers to test your latency to a host. For example:
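A minimal invocation on a Unix-like system (the host name here is just a placeholder; Windows uses -n instead of -c to set the count):

```shell
# Send four echo requests to the host and report the round-trip
# time of each; the summary line shows min/avg/max latency in ms.
ping -c 4 example.com
```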
Traceroute
Traceroute is another common tool for testing latency that provides more detail than ping. It lists the hops that a message takes from client to server.
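As with ping, the exact tool and flags vary by platform (Windows ships the equivalent as tracert); a typical Unix invocation looks like:

```shell
# Print each router (hop) on the path to the host, along with
# the round-trip time measured to that hop.
traceroute example.com
```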
Wired vs Wireless
Generally speaking, a fiber optic connection provides lower latency than wireless communications. Signals in a fiber optic cable travel at roughly two-thirds the speed of light in a vacuum, or about 124,000 miles per second, because the glass slows the light down.
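As a rough sketch of what signal speed means for latency, assuming a commonly cited figure of about 124,000 miles per second for light in glass fiber and a straight-line New York to London distance of roughly 3,500 miles:

```shell
# Best-case round-trip propagation time in milliseconds:
# distance there and back, divided by the signal speed.
# Real routes add distance and equipment delay on top of this.
echo $(( 3500 * 2 * 1000 / 124000 ))   # → 56
```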
Latency vs Bandwidth
Bandwidth is the total amount of data that can theoretically be transferred in a second based on the technologies and network you are using. Bandwidth is important for downloading large files quickly. Latency is important for transferring small amounts of data and getting a quick response back. For example, a connection needs both high bandwidth and low latency to stream an HD video feed of a live event in real time. However, if the event isn't live, latency doesn't matter much because the video can be buffered before it begins to play. In other words, on a high-bandwidth, high-latency connection a video might be slow to start but should work well once it begins to play. Note that "high bandwidth" is good while "high latency" is bad.
Bandwidth vs Throughput
Throughput is the data rate a connection actually achieves at a point in time. For example, a connection may have a bandwidth of 100 Mbps but an actual throughput of 5 Mbps during peak usage hours.
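The gap between rated bandwidth and actual throughput is easy to see with back-of-envelope arithmetic, assuming a 1 GB file is roughly 8,000 megabits:

```shell
# Seconds to transfer 8,000 megabits at the rated 100 Mbps...
echo $(( 8000 / 100 ))   # → 80
# ...versus at an actual throughput of 5 Mbps during peak hours.
echo $(( 8000 / 5 ))     # → 1600
```

The same download goes from under a minute and a half to nearly half an hour, which is why throughput rather than rated bandwidth determines the user experience.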
Proximity
The best way to reduce your latency is to have a high-speed fiber connection that is physically close to the service you want to use. For example, a bank might lease data center space a block away from a major stock exchange to achieve low-latency trading. This typically requires that the data center have a direct link to the exchange.

Edge Computing
Edge computing is the practice of deploying a service to multiple data centers to perform computation close to users. For example, a mobile app might deploy its backend services to dozens of data centers so that a user in Dubai is served from a data center in Dubai. This greatly reduces latency.

Content Delivery Network
A content delivery network is a platform for delivering static files such as videos, pictures and HTML from data centers that are physically close to each user, in order to reduce latency.
Latency: The response time of a technology.