Of Lag, Throughput and Jitter

Over the last few years quite a number of people have asked me to explain the difference between latency, throughput and jitter. I was surprised at how many of them were programmers <cough, errr software engineers>, but thinking things through, that's probably not surprising because most programmers don't usually end up at the sharp end of networking. So, here for your edification is the Alan Lenton guide to latency, throughput and, as a special bonus, jitter!

Perhaps the best way to explain it is using an analogy. Suppose we have a wedding party. The bride and groom have exchanged their vows, the photographs have been taken and it's time to travel to the reception. There are 24 limos to take the party to the reception. Everyone gets in. The cars line up behind the happy couple's car and it's time to move off.

Now, as it happens there is a single lane highway that goes from where the party is now to the hall where the reception is being held. As luck would have it, the highway is empty and the cars are able to proceed single file at top legal speed (70 miles/hour where I come from) to the reception. The time it takes each car to get to the reception (let's say 16 minutes) is the latency.

Of course, getting the whole party there takes longer than that, because the cars are spaced out for safety, so let's say it takes another 12 minutes between the time the first car arrives and the last one discharges its passengers. This is the throughput - two cars/minute. So in this scenario it takes a total of 28 minutes from the time the first car departs to the time the last car pulls in, the passengers get out and we can toast the happy couple with Veuve Clicquot NV champagne.

OK - now let's look at a scenario where the road-making machines have been busy during the wedding ceremony and have widened the highway to a two lane affair (fast machines and slow grooms). This time the cars line up two abreast and set off. It still takes the same time for the first cars to get there, 16 minutes, but since they are arriving two at a time it only takes half the time, six minutes, between the first and last car's arrival - giving us four cars/minute. That makes a total of only 22 minutes for the whole caboodle.

Of course, if we had a six lane highway we could get all the cars through in two minutes bringing the total time down to 18 minutes. That's probably as much increase in throughput as is worth doing at this stage since the time cost of the latency is now much greater than that of the throughput.

So how would we reduce the latency? Well, we could try and reduce the distance travelled - take out all the bends in the highway and have it made into a straight line between the start and the finish. That's probably a little excessive, so the most obvious way to reduce the latency is to speed up the cars. So let's assume that there are no highway patrols and no speed cameras so we can double the speed. That means that it only takes eight minutes for the first cars to arrive - if you happen to have a six lane highway, the total time comes down to ten minutes from start to finish. That's not bad!
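For the programmers in the audience, all of the arithmetic above boils down to one formula: total time is the latency of the first car plus the time for the rest of the convoy to trickle through. Here's a back-of-the-envelope Python sketch - the two-cars-per-lane-per-minute rate is my simplification of the safety spacing, and I've assumed the safety gap is a fixed number of seconds, which is why doubling the speed halves the latency but leaves the cars-per-minute alone:

```python
def total_time(latency_min, cars, lanes, cars_per_lane_per_min=2):
    """Minutes from first departure until the last car has discharged:
    the first car's journey, plus the tail of cars arriving behind it."""
    return latency_min + cars / (lanes * cars_per_lane_per_min)

# The scenarios from the analogy: 24 limos, a 16-minute drive at legal speed.
print(total_time(16, 24, lanes=1))  # 28.0 - single lane highway
print(total_time(16, 24, lanes=2))  # 22.0 - two lanes
print(total_time(16, 24, lanes=6))  # 18.0 - six lanes
print(total_time(8, 24, lanes=6))   # 10.0 - six lanes, double the speed
```

Notice how quickly the extra lanes stop paying off: once the tail shrinks to a couple of minutes, only attacking the latency itself helps.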

And jitter? Now this is interesting. In this case, imagine everyone has been sloppy about setting their cruise controls so that the speeds of the individual cars are slightly different. Not a lot different, just a tiny difference. This means that by the time the cars arrive they are just a little bit out of formation and the time between the arrival of each car is slightly different. We call this difference jitter. If the jitter is really bad, some cars might arrive out of order, and those of you who have attended weddings will know that this is serious, and can cause undying feuds...

So we now have latency, throughput and jitter. How do these apply to the Internet?

Well, modern networks divide the stuff you are sending (or receiving) into packets of data (the cars) and send them off to their destination. Network latency is considered to be the time it takes for you to send a packet and to receive an answer (a round trip, rather than the one way trip we used), throughput is considered to be the amount of data you send each second - usually measured in bits/second - and jitter is a measure of the difference in time between the arrival of packets.
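If you want to see how throughput and jitter fall out of raw packet timings, here's a toy Python sketch. The timestamps are entirely made up, and note that latency can't be read off one-way arrival times alone - you need the sending time or a round trip, which is what tools like ping measure:

```python
# Made-up one-way arrival timestamps (seconds) for five 1500-byte packets.
arrivals = [0.000, 0.021, 0.039, 0.062, 0.080]
packet_bits = 1500 * 8

# Gaps between consecutive arrivals.
gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
mean_gap = sum(gaps) / len(gaps)

throughput_bps = packet_bits / mean_gap                      # bits/second
jitter_s = sum(abs(g - mean_gap) for g in gaps) / len(gaps)  # mean deviation

print(round(throughput_bps))  # 600000 - i.e. 600 kbit/s
print(round(jitter_s, 3))     # 0.002 - about 2 ms of jitter
```

The mean deviation used here is just the simplest measure of spread; real protocols use a smoothed running estimate instead.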

And how does it affect you?

Well, if you are playing an online game the most important factor is latency (lag in the common game parlance). The longer it takes for you to get a response to your command from the game, the more difficult and frustrating it is to play. Of course, the acceptable lag/latency is different for different types of games. For a turn-based game a lag of several seconds will not make any appreciable difference, but for an on-line shoot 'em up, even a few tens of milliseconds can be too much. A lagged combat flight sim will have you shooting at opponents who are no longer actually in sight - even though most flight sims are slowed down by a factor of at least four!

The normal way to handle this is to use a probabilistic method of determining whether the bullet has hit. The easiest way to visualise it is to think of a cone of probability of bullets hitting, rather than a stream of single bullets. Unless of course you are lazy, in which case you offload the work to the client, instead of doing it at the server. You will then find yourself in an arms race with your players, who will repeatedly hack the client to give themselves an advantage.

If you are downloading a large file, the important factor is throughput. Large files have hundreds of thousands of packets. A several second initial lag when you ask for the file is nothing compared to the tens of minutes between the arrival of the first packet and the arrival of the last. The more packets you can get through the pipe at the same time (the more lanes), the shorter the download.
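To put some (invented, but plausible) numbers on that, consider a 1 GB file over a 10 Mbit/s link - the start-up lag turns out to be a rounding error:

```python
def download_seconds(startup_lag_s, size_bytes, throughput_bps):
    """Total download time: the initial lag plus the time to push the bits."""
    return startup_lag_s + size_bytes * 8 / throughput_bps

# A 1 GB file over 10 Mbit/s, with a generous 2-second start-up lag.
print(download_seconds(2.0, 10**9, 10**7))  # 802.0 seconds - over 13 minutes,
                                            # of which the lag is just 2
```

Double the throughput and you save nearly seven minutes; halve the lag and you save one second. That's why downloads care about pipe width, not latency.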

Finally, jitter matters if you are receiving streaming video and audio. In video, if the packets are arriving at slightly different intervals, and you are watching them in real time, then the picture will appear jerky. In audio, the timing variations distort the voice or music.

Partly this is caused by congestion, and partly it's in the nature of the internet, which you should think of as being like a fishing net with computers and routers at each knot, rather than two tin cans with a piece of string stretched between them. On the internet different packets can take different routes, some of which are longer than others, and as we all know from school physics, even light travels at a finite speed, so longer routes will take a longer time.

There are solutions, of course, and the most common is buffering - you don't start playing the video until you have a chunk of it already downloaded, so you are never in a state where the next packet isn't already there, and each packet can be displayed at the correct time. Unfortunately, you can't delay people's phone calls in this way, which is why Voice over Internet Protocol (VoIP) can still be a little on the ropey side.
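The buffering trick is easy to sketch: delay playout of the first packet a little, then play everything on a fixed schedule. As long as the buffer delay exceeds the worst of the jitter, no packet misses its slot. A toy Python illustration with invented timings:

```python
arrivals = [0.000, 0.024, 0.037, 0.065, 0.081]  # jittery arrivals (seconds)
interval = 0.020                                # nominal packet spacing
buffer_delay = 0.010                            # absorbs up to 10 ms of jitter

# Packet i plays at: first arrival + buffer delay + i * nominal interval.
playout = [arrivals[0] + buffer_delay + i * interval
           for i in range(len(arrivals))]
late = [i for i, (a, p) in enumerate(zip(arrivals, playout)) if a > p]
print(late)  # [] - every packet is in the buffer before its slot comes up
```

Shrink buffer_delay far enough and late packets start appearing in that list - which is exactly the jerkiness you see on a bad connection. The phone-call problem is that a big buffer_delay is itself latency, and nobody enjoys a conversation with a half-second pause in it.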

So now you have more than you ever wanted to know about latency, throughput, and jitter. Think of it as a little something for you to use to demonstrate your superiority at both dinner parties and wedding receptions - especially if you are the best man!


Confession - this started life as a short piece in my weekly newsletter, Winding Down. I've rewritten and extended it for this rag. Normally, I write fresh stuff for CVu, but this issue I plead a stinking cold striking just as the deadline came up, and exhaustion from having to actually work and commute - yes, surprise, surprise, I finally got a job!

Alan Lenton


This article originally started life as a slide in a talk about network programming at an ACCU Spring Conference. It was then fleshed out a little as an information piece in my 'Winding Down' weekly e-mail newsletter. The current, extended, version appears in the ACCU magazine 'CVu' for March 2010.

