
Will Caching Still Impact Latency in a Post-HLS World?

Watch the complete panel from Streaming Media East Connect, Latency Still Sucks (and What You Can Do About It), on the Streaming Media YouTube channel.

Learn more about low-latency streaming at Streaming Media West 2020.

Read the complete transcript of this clip:

Casey Charvet: A lot of the latency discussion in the last couple of years has been shaped by the dominant technology that we use to distribute streaming content: HLS. The very nature of that protocol means that the video is chunked and you're going to have to buffer a certain number of chunks. That adds a huge amount of latency to the whole process. With all of these UDP protocols that are coming out, we might be reshaping the way we deliver at scale, and we might be heading towards a post-HLS or post-chunked media world. If we get there, I think a lot of these conversations about caching are going to be reshaped. Hardware definitely does play a huge role in that--we can get ridiculous throughput through a box now. It's pretty easy to order up 40-gigabit line cards, SSDs, and all that. So these servers that we use for caching are immensely powerful. To turn that back around--if we move to a post-chunked media world, is that going to matter?
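The latency floor that segmentation imposes can be sketched with simple arithmetic: a player typically buffers a few full segments before starting playback, so the minimum live delay scales with segment duration times buffer depth. The numbers below are illustrative assumptions (a three-segment buffer is a common HLS player default), not figures from the panel.

```python
def hls_latency_floor(segment_duration_s: float, buffered_segments: int = 3) -> float:
    """Minimum live latency contributed by segmentation alone:
    the player waits for `buffered_segments` complete segments
    before playback can begin."""
    return segment_duration_s * buffered_segments

# Classic 6-second segments: an 18-second floor before any
# encode, network, or CDN delay is even counted.
print(hls_latency_floor(6.0))  # 18.0

# Shorter 2-second segments shrink the floor, at the cost of
# more frequent playlist and segment requests.
print(hls_latency_floor(2.0))  # 6.0
```

This is why moving away from chunked delivery (for example, to UDP-based streaming protocols) removes a structural source of latency rather than merely optimizing around it.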

Marc Cymontowski: Yes. If we can handle the scalability for those large-scale events in a real-life scenario, that's a different challenge than rebalancing caches. I think a lot of acceleration at the edge could happen there, like optimized hardware for ingest and egress, so you can actually optimize certain protocols right at the edge.

Jason Thibeault: Well said. It's not one size fits all. One would think that your caching strategy has to reflect the kind of traffic and content you're pushing through it. To Casey's point, if we move away from segmented video delivery, or to something else entirely, then obviously that would affect how content deliverers optimize or tune those servers to improve caching--or whether any caching is even needed. I don't know. Crazy, crazy ideas.

Related Articles

How Open Caching Solves Content Delivery Bottlenecks

How does open caching solve content delivery bottlenecks? According to Gautier Demond of Qwilt, the biggest problem-solving advantage of open caching is removing the traditional bottlenecks of CDN networks while reinventing the overall analytics-sharing relationship with ISPs.

How Much Do Low-Latency Codecs Reduce Latency?

GigCasters' Casey Charvet and CenturyLink's Rob Roskin discuss the efficacy of new low-latency revisions to existing protocols to decrease streaming latency in this clip from Streaming Media East 2020.

Encoding Best Practices to Reduce Latency

How can streaming professionals fine-tune their processes to prioritize low latency at the encoding stage? Streaming Video Alliance's Jason Thibeault, GigCasters' Casey Charvet, and Haivision's Marc Cymontowski discuss strategies for reducing latency in this clip from Streaming Media East Connect.
