Low Latency and the Need for Speed

The phrase low latency is thrown around a lot in the streaming industry, but few people actually know what it means. Here is a breakdown of what we mean by latency and why it’s important to you.

What is latency?

When it comes to video, live hardly ever means live. If you’re at home watching a football game and your team scores a goal, the fans at the stadium saw that happen before you did. Sometimes, 30 seconds before.

Latency is what we call that delay – the time between when the camera records video, and when the video is displayed. The lower the latency, the less delay.
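One common way to measure that delay in practice is to stamp each frame with the time it was captured and compare that against the clock when the frame is displayed. A minimal sketch in Python (the function name is my own illustration, not a real player API):

```python
def glass_to_glass_latency(capture_ts: float, display_ts: float) -> float:
    """Latency is simply the gap, in seconds, between when a frame
    was recorded by the camera and when it was shown to the viewer."""
    return display_ts - capture_ts

# A frame captured at t=100.0 s that reaches the screen at t=130.0 s
# has 30 seconds of latency -- the football-goal scenario above.
print(glass_to_glass_latency(100.0, 130.0))  # → 30.0
```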

Why does latency exist?

In an ideal world, there would be no such thing as latency. However, video has to be captured, encoded, packaged and transmitted before it can be viewed online. A few factors in particular play a role in driving up latency:

Distance. If you are hosting an event in London and two people are watching the live stream, the viewer in Manchester will have a lower latency than the viewer tuning in from Tokyo. That is a simple fact of physics: like everything else, the transmission of information is bound by the speed of light.

Traffic. The more people trying to watch the stream, the more requests are in flight, and the slower the stream becomes. It works much like a motorway: the more cars on the road, the slower things move.
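The distance factor can be put into rough numbers. Light in optical fibre travels at about two-thirds of its speed in a vacuum, which sets a hard lower bound on one-way delay no matter how good the network is. A back-of-envelope sketch (the distances are approximate great-circle figures used for illustration, not measured network paths):

```python
SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in a vacuum
FIBRE_FACTOR = 0.67             # light in optical fibre travels ~2/3 as fast

def min_propagation_delay_ms(distance_km: float) -> float:
    """Lower bound on one-way delay imposed by physics alone."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * FIBRE_FACTOR) * 1000

# Approximate straight-line distances from London:
print(round(min_propagation_delay_ms(260), 1))    # Manchester: 1.3 ms
print(round(min_propagation_delay_ms(9_560), 1))  # Tokyo: 47.6 ms
```

In practice, routing detours, router hops and buffering add far more delay on top of this floor, but the asymmetry between the two viewers never goes away.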

When is low latency important?

Nobody loves latency, and for certain sectors in particular it can cause real problems.
• In live sports, high latency means that fans on social media can spoil a goal before it has even appeared on the stream. Low latency keeps the broadcast in step with the action.
• News organisations often interview remote speakers by video, where noticeable delays between question and answer can be awkward.
• Meanwhile in betting, every second counts. Odds change on the fly, and punters need to place bets as close as possible to the window closing.

All in all, latency leads to lost revenue and a poorer user experience. In these industries especially, streaming technology has long pushed for the lowest possible latencies.

The latest on latency

Latency has recently been a hot topic due to the discontinuation of Flash. Previously, the lowest-latency players were built on RTMP, a Flash-based protocol that provided the fastest delivery. But Apple, Google and the other major platform vendors have since dropped Flash support in favour of HTML5 and HLS. The new standard is more scalable and secure, but typically carries a latency of around 30 seconds as standard.
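The ~30-second figure follows from how segmented HLS works: the player waits for several whole segments to buffer before it starts playing, so latency scales with segment length rather than bandwidth. A back-of-envelope sketch (the numbers are typical HLS defaults, not StreamAMG specifics):

```python
def hls_latency_estimate(segment_duration_s: float,
                         segments_buffered: int,
                         encode_and_cdn_overhead_s: float) -> float:
    """Rough glass-to-glass latency for segmented HLS delivery.

    The player buffers several complete segments before starting
    playback, so shorter segments mean lower latency (at the cost
    of more requests and less efficient encoding)."""
    return segment_duration_s * segments_buffered + encode_and_cdn_overhead_s

# Classic defaults: ~6 s segments, 3 segments buffered, plus several
# seconds for encoding and CDN propagation -> roughly 30 s of delay.
print(hls_latency_estimate(6, 3, 10))  # → 28
```

Shrinking the segments or the buffer brings the estimate down, which is why low-latency approaches attack exactly those terms.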

StreamAMG has therefore developed a solution that combines the adaptability of HLS with the speed of RTMP, with our low latency players coming in at just 4 to 6 seconds of delay!

We achieve this through a two-pronged approach:
1. Our unique streaming logic and architecture, which achieves low latency in a different way to the usual methods (a trade secret)
2. Our extensive partnership with Akamai. We use the global Akamai Content Delivery Network (CDN) to regularly broadcast to audiences of over 150,000 concurrent users worldwide. This CDN has been vital to the services we provide to top-tier clients such as the Supreme Court, GAAGO and the European Council.

Our solution is well suited to delivering rapid, high-quality live streams to audiences around the world, regardless of location or traffic.

We at StreamAMG are enormously proud of our low latency solutions and delight in being at the forefront of these developments within the industry. If you would like to know more, why not get in contact at or speak with an expert on 0800 061 2361.