Understanding the Technologies of Low-Latency Streaming

What are some of the key enabling technologies of low-latency streaming? videoRx CTO Robert Reinhardt goes in-depth on this topic in a presentation from Streaming Media West 2022, covering areas such as server ingest and client delivery protocols like WebRTC, NDI, RTMP, and HLS.

“The technologies that we have, when it comes to low latency, include ingest protocols, or muxers--they're not necessarily synonymous, but WebRTC is more of a transport layer,” Reinhardt says. “It's not necessarily a specific way to mux audio and video, although, like HLS, it's a wrapper that muxes audio and video together. But when it comes to server ingest, we've got low-latency protocols. NDI, of course, had a huge boost during COVID because a lot of production workflows went to full NDI--people using NDI within their own private LANs, on the cloud, on location, or coming out of Teams or Zoom. NDI is a very popular--license-free, for the most part--ultra-low-latency way to get video around that can be uncompressed or compressed, depending on which flavor of NDI you're using.

“RTSP/UDP, that's used more with the kind of security cameras and traffic cams out there. Again, we have all kinds of audiences--I'm not presuming you all come from media and entertainment. So when it comes to municipal streaming, I do work for various municipalities in British Columbia, and across the United States I've worked with the city of Colorado Springs and some cities in California, specifically on their traffic cameras. They want low latency too. Most of those are IP cameras from Axis or other vendors like Axis, and those are all RTSP pulls from those cameras into an infrastructure that hopefully won't add much more latency on top of it. And of course there's RTMP--Flash has been around for a long time and is gone, but RTMP is its legacy, one that even Facebook and YouTube today still use for ingest.
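
As a rough illustration of the RTSP pulls Reinhardt describes, the sketch below reads frames from an IP camera in Python using OpenCV. This is an editorial example, not part of his talk; the camera URL, credentials, and path are placeholders that vary by vendor and model.

```python
import cv2  # pip install opencv-python

# Placeholder URL--real Axis cameras expose RTSP endpoints, but the exact
# path, credentials, and transport settings depend on the camera and network.
RTSP_URL = "rtsp://user:pass@192.0.2.10/axis-media/media.amp"

cap = cv2.VideoCapture(RTSP_URL)
try:
    while cap.isOpened():
        ok, frame = cap.read()  # blocks until the next decoded frame arrives
        if not ok:
            break
        # Hand the frame to the rest of the pipeline here: analyze it,
        # re-encode it, or forward it to a media server for distribution.
finally:
    cap.release()
```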

“So if you're doing a live stream on those platforms, you probably already know the latency is pretty high because we can't do RTMP out anymore. We can do RTMP in. Gradually, that will be phased out, but because there's such an infrastructure investment on top of RTMP, I don't think it's gonna go away next year or the year after that. RTMP is probably gonna be a legacy protocol that sticks around, and it can actually be pretty low latency--even ultra-low latency--and I'll go back in just a second to that slide from Wowza that I skipped past, which basically refers to tuning all of these different protocols and muxers that might be out there.

“For server ingest, I put the popular ones on the left-hand side. It's missing SRT, which should be there. SRT, of course, comes from Haivision. It's open source, and SRT is quickly becoming a popular replacement for RTMP, especially if you're using a codec other than H.264. If you want to start using modern codecs like HEVC or AV1, you're not gonna be able to use RTMP very easily to do that. Yury Udovichenko of Softvelum has adapted an RTMP variant that will work with other codecs, but that's very specific to the infrastructure that his company's been working on.
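
To make the SRT point concrete, here is a minimal sketch of pushing an HEVC stream over SRT--something plain RTMP ingest can't carry--by shelling out to ffmpeg from Python. It assumes an ffmpeg build with libx265 and SRT support; the input file, ingest host, and port are placeholders, not anything named in the talk.

```python
import subprocess

# Push an HEVC-encoded stream over SRT instead of RTMP.
# Assumes ffmpeg was built with libx265 and libsrt; host/port are placeholders.
cmd = [
    "ffmpeg", "-re", "-i", "input.mp4",   # read the source in real time
    "-c:v", "libx265",                    # HEVC, which standard RTMP ingest can't carry
    "-c:a", "aac",
    "-f", "mpegts",                       # SRT payloads are commonly MPEG-TS
    "srt://ingest.example.com:9000?mode=caller",
]
subprocess.run(cmd, check=True)
```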

“Client delivery, of course, is how we're consuming these streams, not just how a server might be talking to a point of origin or a remote location. For client delivery, we've got standard HTTP delivery with HLS and DASH and their CMAF variants. WebRTC, of course, is there for client delivery as well.

“We still have web socket services. nanocosmos, whose tagline was ‘around the world in about a second,' uses a web socket playback mechanism. So web sockets weren't necessarily an end-to-end delivery for them. It was a client delivery that was easier to scale than WebRTC. You'll still see some web socket implementations out there. Web sockets have been around in browsers for a long, long time, and it's just a generic socket. You could send whatever you want over a web socket: data, audio, video. It's not necessarily an easy protocol to work with because, again, it was not designed for sending audio and video around the web like WebRTC was.
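
A web socket really is just a generic pipe, as Reinhardt notes. The sketch below, using Python's websockets package, shows one way a client might receive binary media chunks over a socket like that; the server URL and the feed_decoder() hook are hypothetical stand-ins, not any vendor's actual playback mechanism.

```python
import asyncio
import websockets  # pip install websockets

async def pull_stream(uri: str) -> None:
    # Open the socket and consume whatever binary frames the server pushes.
    async with websockets.connect(uri) as ws:
        async for message in ws:
            if isinstance(message, bytes):
                feed_decoder(message)  # hypothetical hook into a player/decoder

def feed_decoder(chunk: bytes) -> None:
    print(f"received {len(chunk)} bytes of media")  # stand-in for real decoding

asyncio.run(pull_stream("wss://example.com/live/stream"))  # placeholder URL
```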

“Apple HLS latency is around 30 seconds. Typically they put it down in the 18+ seconds of latency column. I would say your average HLS latency is 30 seconds, mainly because people are using ten-second chunk sizes and a three-chunk playlist. So you multiply 3 chunks times 10--not including any kind of delays between those chunks being delivered across CDNs--and you're looking at 30 seconds. It's not too hard to reduce that latency down to six seconds. You'll see that under HLS, Wowza put it just after five seconds, and you can get there with a two-second chunk size and keyframe interval, times three. Again, if you start minimizing your chunk size, just multiply it by the number of chunks that are listed in the manifest, and that's your average latency. You also need to add some time, just because of transports between edges and your origin potentially. But I would say if you have a two-second segment size and it's on a playlist that's repeated three times, then you're probably looking at anywhere from 6-10 seconds of latency in a tuned playlist like that.
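
Reinhardt's back-of-the-envelope math is easy to capture in a few lines. The helper below is just that arithmetic--segment duration times the number of segments a player buffers, plus whatever CDN/origin overhead you want to assume--not a measurement from any real player.

```python
def approximate_hls_latency(segment_seconds: float,
                            segments_buffered: int,
                            overhead_seconds: float = 0.0) -> float:
    """Rule-of-thumb HLS latency: segment duration x buffered segments + overhead."""
    return segment_seconds * segments_buffered + overhead_seconds

print(approximate_hls_latency(10, 3))      # ~30 s: default ten-second chunks
print(approximate_hls_latency(2, 3, 1.5))  # ~7.5 s: tuned two-second chunks plus transport
```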

“That's not hard to do. Anyone who's got a media server can tune their packaging to that and not have to jump through too many hoops. As we get closer, though, to this sub-one-second latency, we start to get into different technologies like WebRTC, which you can see is first and foremost in this near-real-time range. We're getting close to 250 milliseconds. Generally speaking, I think most people are looking to achieve under 500 milliseconds, if not under 300 milliseconds, of latency when they're using WebRTC. Of course, there's a cost associated with that. WebRTC doesn't scale as easily as any of these HTTP methods of delivery, so you have to budget accordingly, whether you're building out your own WebRTC infrastructure or using someone else's.
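
Tuning a packager to the two-second-segment, three-segment-playlist profile described above might look like the following ffmpeg invocation wrapped in Python. This is an illustrative choice--the talk doesn't name a specific packager--and the input URL, codec settings, and output path are placeholders.

```python
import subprocess

# Package a live input into tuned HLS: two-second segments, a three-segment
# playlist, and a keyframe every two seconds at 30 fps.
cmd = [
    "ffmpeg", "-i", "rtmp://localhost/live/stream",
    "-c:v", "libx264", "-g", "60", "-keyint_min", "60",  # GOP = 2 s at 30 fps
    "-c:a", "aac",
    "-f", "hls",
    "-hls_time", "2",          # two-second segments
    "-hls_list_size", "3",     # three segments in the playlist
    "-hls_flags", "delete_segments",
    "stream.m3u8",
]
subprocess.run(cmd, check=True)
```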

“We've now had Low-Latency HLS (LL-HLS) for several years. I remember when Roger Pantos came to Streaming Media West 2019. He was talking about Low-Latency HLS for the very first time at the conference. And so we've had some time over COVID to see how that's gonna evolve. The original Low-Latency HLS spec had some HTTP/2-specific push requirements in it, which they've since removed so that CDNs don't have to support them. Instead, they're using this preload hint that you can put into manifests that are specifically Low-Latency HLS. Of course, you could tune any of the others, like RTMP.
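
The preload hint Reinhardt mentions shows up in the media playlist as an EXT-X-PRELOAD-HINT tag. The snippet below embeds a simplified, hypothetical LL-HLS playlist excerpt and pulls out the resource the origin is advertising next; real playlists carry many more tags than this.

```python
# A simplified, hypothetical LL-HLS media playlist excerpt.
PLAYLIST = """\
#EXTM3U
#EXT-X-TARGETDURATION:4
#EXT-X-PART-INF:PART-TARGET=1.0
#EXTINF:4.0,
segment100.m4s
#EXT-X-PART:DURATION=1.0,URI="segment101.part1.m4s"
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="segment101.part2.m4s"
"""

# Find the URI the origin hints it will produce next, so a player (or CDN)
# can open that request early instead of waiting for the part to be listed.
for line in PLAYLIST.splitlines():
    if line.startswith("#EXT-X-PRELOAD-HINT"):
        hinted_uri = line.split('URI="', 1)[1].rstrip('"')
        print("Preload hint points at:", hinted_uri)
```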

“Back when HQ Trivia was a really popular trivia game, I had a couple of clients that were trying to get on that same bandwagon, and we were using RTMP libraries in native apps to play RTMP in a smartphone app. So if you're building your own customized playback technology, then you have a lot more options still if it's not gonna be strictly within the domain of the browser. So RTMP could be an option for playback if you were building a custom environment for it these days.

“I wouldn't put too much weight on RTMP playback just because we've got options that are a lot more mature now, like WebRTC. Just a few years ago, WebRTC didn't have the kind of cross-platform and cross-browser acceptance or standards that we have now. Now you don't have to worry as much about H.264 vs. VP8. Those are still codecs that are in play that might need some transcoding back and forth, depending on your workflow. But it's come a long way, and again, COVID accelerated that.”

Learn more about low-latency streaming at Streaming Media East 2023.

Related Articles

Best Practices for Evaluating Streaming Tech Vendors

What are some of the best practices for evaluating streaming tech vendors? Nadine Krefetz of Reality Software discusses the various factors at play in making these decisions with LaShawn McGhee of Revry, Rob Collins of Starz, and Anil Malhotra of Bango in this clip from Streaming Media Connect 2023.

CDN77's Juraj Kacaba Talks Low-Latency Streaming and the Edge

CDN77's Juraj Kacaba sits down with Tim Siglin to discuss low-latency streaming and the edge in this interview from Streaming Media East 2023.

Vindral CEO Daniel Alinder Talks Latency, Sync, 8K, and Vindral

Tim Siglin of Help Me Stream Research Foundation sits down with Daniel Alinder of Vindral to discuss latency, sync, 8K, and Vindral in this Streaming Media East 2023 interview.

How Consumers Influence Streaming Tech Development

How much does consumer feedback impact streaming app and platform innovation and evolution? Is there an influencer niche for everyday streaming consumers? Fandango's Rema Morgan-Aluko, LG's Matthew Durgin, and Vizio's Greg Barnard discuss how seriously they take user feedback and how they apply it in this clip from Streaming Media Connect 2023.

How to Monitor and Troubleshoot Your Live Streaming Workflow

End-to-end workflows for live streaming at scale are complex affairs, with troubleshooting challenges at each stage when delivery breaks down. A panel of experts from Paramount, Amagi, TAG VS, and Nomad Technologies discusses key best practices for troubleshooting large-scale streams when the pressure is on.

Latency vs. Quality for Live Streaming at Scale

How much streaming reliability and quality are worth trading for ultra-low latency, and when does one come at a premium over the other? Amagi's Brian Ring, Dolby's David Hassoun, Nomad Technologies' Adam Miller, Paramount's Corey Smith, and Norsk CMO Eric Schumacher-Rasmussen discuss in this panel from Streaming Media West 2022.

What's the Best Chunk Size for Low-Latency Live Streaming?

The best chunk size for low-latency streaming is dependent on a number of factors based on different use cases, and there is often a need for some compromise and tradeoffs in quality or speed. Nadine Krefetz, Consultant, Reality Software, and Contributing Editor, Streaming Media, asks three industry experts what their chunk size preferences are for their requirements.

Low-Latency Streaming for Interactive Video

Interactive streaming is the future for high-quality ultra-low latency applications that will unlock unique and unprecedented experiences for users

How Latency Hinders Hybrid and Cloud Production

CNN+ Live Operations Manager Ben Ratner discusses how even "ultra-low latency" complicates hybrid (cloud and on-prem) workflows in this clip from Streaming Media Connect 2022.

SVC and Low-Latency Videoconferencing

Andy Howard and Tim Siglin discuss the importance of low latency in video conferencing and how popular video conferencing tools like Zoom, Teams, and WebEx use scalable video encoding to guarantee access and lower latency in this clip from their panel at Streaming Media Connect 2022.

WebRTC and Low-Latency Streaming

Millicast Chief Revenue Officer Ryan Jespersen discusses how WebRTC reduces streaming latency in this clip from Streaming Media Connect 2022.
