This week, Cloudflare introduced Concurrent Streaming Acceleration, a new technique for reducing the end-to-end latency of live video on the web when using Stream Delivery.
Live video is becoming an increasingly large part of content streaming, and Cloudflare’s question is how close to real time that live video actually is. Several steps sit between the event taking place and the viewer seeing it on screen; Cloudflare calls that gap “end-to-end latency,” and this update is a step toward shrinking it.
Check out Cloudflare’s blog post, which breaks the causes of latency down into three main points:
- Chunked encoding splits up video segments into smaller pieces
- This can reduce end-to-end latency by allowing chunks to be fetched and decoded by players, even while segments are being produced at the origin server
- Some CDNs neutralize the benefits of chunked encoding by buffering entire files inside the CDN before they can be delivered to clients
Cloudflare sums up its solution like this: “Put simply, we now have the ability to deliver un-cached files to multiple clients simultaneously while we pull the file once from the origin server.”
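To make that quote concrete, here is a minimal, illustrative sketch of the idea: one origin pull, with each chunk fanned out to every waiting client as soon as it arrives, instead of buffering the whole file first. The class and method names are hypothetical and are not Cloudflare’s actual implementation or API.

```python
# Illustrative sketch of concurrent streaming: a single origin fetch
# feeds many clients chunk-by-chunk. Names here are invented for the
# example, not taken from Cloudflare's software.
import threading

class ConcurrentStream:
    """Pulls an object from the origin once; fans chunks out to many clients."""

    def __init__(self):
        self.chunks = []           # chunks received from the origin so far
        self.done = False          # has the origin transfer finished?
        self.cond = threading.Condition()
        self.origin_pulls = 0      # how many times we actually hit the origin

    def pull_from_origin(self, origin_chunks):
        """The single origin fetch: publish each chunk as it 'arrives'."""
        self.origin_pulls += 1
        for chunk in origin_chunks:
            with self.cond:
                self.chunks.append(chunk)
                self.cond.notify_all()   # wake every client waiting on a new chunk
        with self.cond:
            self.done = True
            self.cond.notify_all()

    def client_read(self):
        """A client receives chunks as they exist, not when the file completes."""
        i, out = 0, []
        while True:
            with self.cond:
                while i >= len(self.chunks) and not self.done:
                    self.cond.wait()
                if i >= len(self.chunks) and self.done:
                    return out
                chunk = self.chunks[i]
            out.append(chunk)
            i += 1

if __name__ == "__main__":
    stream = ConcurrentStream()
    results = {}
    clients = [
        threading.Thread(target=lambda n=n: results.update({n: stream.client_read()}))
        for n in range(3)
    ]
    for c in clients:
        c.start()
    # One origin pull serves all three clients at once.
    stream.pull_from_origin([b"chunk-1", b"chunk-2", b"chunk-3"])
    for c in clients:
        c.join()
    print(stream.origin_pulls)   # 1
```

The key design point mirrors the quote: clients waiting on the same un-cached file all attach to the same in-flight origin request, so the origin is hit once no matter how many viewers arrive mid-transfer.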
The update rolled out a couple of months ago, with no settings changes required from customers. Results so far have been promising, with one example showing a 1.5-second improvement in latency. As the company put it, “live video just got more live!”
This announcement is part of Speed Week, a showcase of new products and customer stories about how Cloudflare has improved the internet experience.
Cloudflare is one of the world’s largest cloud network platforms. Its goal is to give website users a faster, more secure, and more reliable internet experience.
Did you know we have a YouTube Channel? Every week we have a live Cord Cutting Q&A, and weekly Cord Cutting recap shows exclusively on our YouTube Channel!