Video Streaming Protocols Compared

A video streaming protocol establishes an agreed-upon standard for transmitting and receiving video streams or files. In particular, it defines how the files are broken up into smaller IP packets, transmitted to an end user, and recombined into the video stream displayed on various devices, such as a web browser, an iPhone or Android app, or a smart TV. It may also be displayed on a highly secure, proprietary system, such as CCTV, security, and various IoT systems.

While there are many video streaming protocols currently in use, the good news is that as CPU speeds, local memory, and Internet bandwidth have all seen explosive growth over the last few decades, the need for highly proprietary software and hardware has dramatically dropped. State-of-the-art video streaming protocols are rapidly replacing the old standards, as these newer protocols tend to be open source (non-proprietary), codec-agnostic, and highly compatible with the broad market of consumer and enterprise-grade end-user devices.

Let’s discuss the major video streaming protocols, including older and well-established protocols such as RTMP and RTSP, HTTP-based protocols such as HLS, MPEG-DASH and CMAF, and the latest state-of-the-art protocols, WebRTC and SRT. We will also bid a fond farewell to MSS and HDS.

RTMP and Variants

RTMP, or Real-Time Messaging Protocol, has been one of the dominant protocols in video streaming for a good 20 years now. It was originally developed by Macromedia (later acquired by Adobe) to run on its proprietary Flash Player. Flash technology was popular for years, but Adobe ended support for it at the end of 2020, and it is now obsolete.

However, RTMP continues to be used for ingesting raw video into a video server or CDN, and it is here to stay for now, largely because a broad base of legacy software and hardware still depends on it for compatibility.

There are several flavors of RTMP that add various enhancements such as:

  • RTMPT – Tunneled through HTTP
  • RTMPE – Encrypted
  • RTMPTE – Tunneled and encrypted
  • RTMPS – Encrypted via SSL
  • RTMFP – Uses UDP instead of TCP for the transmission protocol

Since RTMP cannot be used to display video in HTML5 players, its widespread use today is primarily as a “first-mile” solution: an RTMP-enabled encoder takes the source video and transfers it to a CDN, which then distributes it to end users via a more modern protocol.
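To give a sense of how low-level RTMP is, here is a minimal Python sketch, based on the handshake defined in Adobe's RTMP specification, that builds the C0 and C1 messages a client sends to open an RTMP session (the server answers with S0/S1/S2 before any audio or video is exchanged). This is an illustration of the wire format only, not a working client:

```python
import os
import struct

RTMP_VERSION = 3        # 0x03 for plain (unencrypted) RTMP
HANDSHAKE_SIZE = 1536   # fixed size of the C1/S1 handshake message

def build_c0_c1(timestamp: int = 0) -> bytes:
    """Build the C0+C1 bytes that open an RTMP handshake.

    C0 is a single version byte; C1 is 1536 bytes consisting of a
    4-byte timestamp, 4 zero bytes, and 1528 bytes of random data.
    """
    c0 = bytes([RTMP_VERSION])
    c1 = struct.pack(">II", timestamp, 0) + os.urandom(HANDSHAKE_SIZE - 8)
    return c0 + c1

packet = build_c0_c1()  # 1 + 1536 = 1537 bytes sent to the server
```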


RTSP

RTSP, or Real-Time Streaming Protocol, is another old-school technology that is no longer used for viewing videos, due to the lack of compatible players in the consumer market. Like RTMP, RTSP is a low-latency protocol, but it requires its own companion transmission protocols, RTP (Real-Time Transport Protocol) and RTCP (Real-Time Control Protocol). RTSP remains widely used for IP cameras, surveillance, CCTV, drone cameras, and IoT devices because it is pull-based (as opposed to RTMP’s push approach), offers low latency, and has a strong legacy footprint that keeps many of these devices compatible with RTSP streaming.
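RTSP itself is a text-based control protocol, much like HTTP: the client issues requests such as DESCRIBE, SETUP, and PLAY, and the media then flows separately over RTP. As a rough sketch (the camera URL here is made up), composing a DESCRIBE request in Python looks like this:

```python
def rtsp_describe(url: str, cseq: int) -> str:
    """Compose an RTSP DESCRIBE request (RTSP is text-based, like HTTP).

    DESCRIBE asks the camera or server for an SDP description of the
    stream; the client then issues SETUP and PLAY to start receiving
    RTP packets.
    """
    return (
        f"DESCRIBE {url} RTSP/1.0\r\n"
        f"CSeq: {cseq}\r\n"          # request sequence number
        "Accept: application/sdp\r\n"
        "\r\n"
    )

# Hypothetical camera address, for illustration only:
request = rtsp_describe("rtsp://camera.local/stream1", 1)
```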


HLS

HLS, or HTTP Live Streaming, was developed by Apple in 2009 and now dominates the marketplace on the consumption side, making it one of the most widely used streaming protocols today. Since 2019, an annual video industry survey has consistently found it to be the most popular streaming format for consuming live video.

What makes HLS so appealing? First of all, it uses adaptive bitrate streaming, or ABR. ABR monitors the “last-mile” connection to the end-user device and adapts the bitrate to the available bandwidth. If your available bandwidth suddenly drops, HLS reduces the amount of data sent from the video server, usually by switching to a lower-resolution rendition until the bandwidth can again support full resolution, or simply by dropping frames to reduce the framerate. Add in some buffering, and many network issues are handled gracefully on the consumer side, although you may see a second or two of lower-resolution video or an occasional stutter in playback. On the first mile, where video is ingested, network problems still need to be addressed with specialized solutions such as broadband bonding technologies optimized for RTMP streaming ingestion.
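The ABR mechanics are simple at heart: the player fetches a master playlist listing the available renditions and their bitrates, then picks the best one its measured bandwidth can sustain. A minimal Python sketch of that selection logic, using a made-up three-rendition playlist:

```python
import re

# A made-up HLS master playlist with three renditions.
MASTER_PLAYLIST = """\
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080
high.m3u8
"""

def parse_variants(playlist):
    """Extract (bandwidth, uri) pairs from an HLS master playlist."""
    variants = []
    lines = playlist.splitlines()
    for i, line in enumerate(lines):
        m = re.search(r"#EXT-X-STREAM-INF:.*BANDWIDTH=(\d+)", line)
        if m:
            # The rendition URI is on the line after its STREAM-INF tag.
            variants.append((int(m.group(1)), lines[i + 1]))
    return variants

def pick_rendition(variants, measured_bps):
    """ABR in one step: the highest-bitrate rendition the link can sustain."""
    affordable = [v for v in variants if v[0] <= measured_bps]
    return max(affordable)[1] if affordable else min(variants)[1]
```

With 3 Mbps of measured bandwidth, `pick_rendition` chooses the 2.5 Mbps 720p rendition; drop to 500 kbps and it falls back to the lowest one.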

The other reason HLS has become the industry standard is that after its introduction, Apple (eventually) ensured that HLS was supported by a wide range of operating systems and devices, including the Google Chrome browser, Android, macOS, Windows and Linux devices, smart TVs, and HTML5 video players.

While HLS is a highly attractive solution for video streaming, it does have higher latency than some other standards, the tradeoff required to deliver the highest-quality video at all times. To address this, there is now a variant called Low-Latency HLS (LL-HLS), which can reduce latency to under 3 seconds without significantly affecting video quality. This is a great step forward, but LL-HLS is still an emerging protocol without widespread support across the industry, and it remains an Apple-controlled specification.


MPEG-DASH

MPEG-DASH, or Moving Picture Experts Group Dynamic Adaptive Streaming over HTTP, is an open, non-proprietary alternative to HLS. While still gaining traction, it offers all the features and benefits of HLS, with the exception of native support on Apple devices – no surprise there. MPEG-DASH supports ABR and delivers high-quality video to many supported devices, including all Android devices, Chrome and Firefox browsers, and many smart TVs.
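Like HLS, MPEG-DASH describes the available renditions in a manifest the player parses to drive its ABR decisions – in DASH's case an XML document called an MPD. A small Python sketch that reads the representations out of a stripped-down, made-up MPD:

```python
import xml.etree.ElementTree as ET

# A stripped-down example MPD; real manifests carry many more
# attributes (codecs, segment templates, timing), but the structure
# of Period -> AdaptationSet -> Representation is the same.
MPD = """\
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <Representation id="360p" bandwidth="800000"/>
      <Representation id="720p" bandwidth="2500000"/>
      <Representation id="1080p" bandwidth="6000000"/>
    </AdaptationSet>
  </Period>
</MPD>
"""

NS = {"dash": "urn:mpeg:dash:schema:mpd:2011"}

def list_representations(mpd_xml):
    """Return (id, bandwidth) for every Representation in the manifest."""
    root = ET.fromstring(mpd_xml)
    return [
        (rep.get("id"), int(rep.get("bandwidth")))
        for rep in root.findall(".//dash:Representation", NS)
    ]
```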


CMAF

CMAF, or Common Media Application Format, is a newer standard that uses shorter data segments to achieve lower latency (3–5 seconds) for HTTP-based streaming. CMAF can be used with both MPEG-DASH and HLS for simpler, lower-latency streaming. In particular, CMAF standardizes and simplifies the container format for the streaming media chunks, allowing a single set of fragmented MP4 files to be used independently of the presentation protocol (HLS or DASH).
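A CMAF segment is, at bottom, a standardized sequence of ISO BMFF (MP4) boxes, each prefixed with a 4-byte big-endian size and a 4-byte type. A Python sketch of building and walking such boxes illustrates the container structure that CMAF pins down (this is a simplified view; real boxes nest and carry typed payloads):

```python
import struct

def make_box(box_type: str, payload: bytes) -> bytes:
    """Build a minimal ISO BMFF box: 4-byte size + 4-byte type + payload."""
    return struct.pack(">I4s", 8 + len(payload), box_type.encode("ascii")) + payload

def iter_boxes(data: bytes):
    """Yield (type, size) for each top-level box in an fMP4 buffer.

    A CMAF media chunk is essentially a moof (metadata) box followed
    by an mdat (media data) box, repeated per fragment.
    """
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, offset)
        yield box_type.decode("ascii"), size
        offset += size

# A toy "fragment": a moof box with dummy metadata plus an mdat box.
segment = make_box("moof", b"\x00" * 16) + make_box("mdat", b"frame-data")
```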


WebRTC

WebRTC, or Web Real-Time Communications, was originally designed for peer-to-peer (browser-to-browser) communications, particularly chat and VoIP. Its use has since been extended to more general video and voice streaming in near real time, and it has the lowest latency of the protocols discussed here, clocking in at under one second.

WebRTC is free to use, open source, and is supported by Apple, Google, Microsoft, Mozilla and others. WebRTC currently powers many of the most popular video conferencing solutions, including GoToMeeting, Google Meet, WhatsApp and Messenger.

Since WebRTC is built on peer-to-peer technology, it is well suited to the video conferencing solutions above, but it was not designed with large-scale distribution in mind. So if you need to reach a large audience with a real-time stream, WebRTC may not be appropriate.
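A back-of-the-envelope calculation shows why a pure peer-to-peer mesh stops scaling: every participant connects directly to every other participant, and must encode and upload its own stream to each of them. (Production conferencing services typically avoid this by routing streams through media servers instead.)

```python
def mesh_connections(participants: int) -> int:
    """Peer connections in a full mesh: one per pair, n*(n-1)/2."""
    return participants * (participants - 1) // 2

def upload_streams_per_peer(participants: int) -> int:
    """Each peer uploads its stream separately to every other peer."""
    return participants - 1
```

Four participants need 6 connections; fifty would need 1,225, with each peer uploading 49 copies of its stream – far beyond a typical home uplink.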


SRT

SRT, or Secure Reliable Transport, is another modern, free, open-source streaming protocol, created by Haivision. SRT is intended to provide “high-quality, low latency secure video over unreliable public networks, including the internet.”

SRT has several very attractive features that make it a great addition to the list of top video streaming protocols, including:

  • Secure streams using 128/256 bit AES encryption
  • Outstanding quality protecting against packet loss, jitter, and bandwidth fluctuations
  • Low latency even during difficult network conditions
  • Open source available for free directly from GitHub

SRT uses the UDP transport protocol rather than TCP, which underlies most of the other protocols. SRT implements application-layer controls that allow for some TCP-like behavior on top of the faster, simpler UDP transport, such as retransmission of lost or corrupted packets, packet sequence numbering, and sender/receiver acknowledgements. These protective approaches, however, leave a lot to be desired for video ingest over unreliable connections such as 4G/5G networks, as the algorithms cannot cope with the types of packet loss common in wireless networks. For the ingest side, we still recommend devices that provide more advanced algorithms to protect against the network problems inherent in unreliable and unpredictable networks.
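The receiver side of that sequence-numbering idea can be sketched in a few lines of Python: number the incoming packets, find the gaps, and request only the missing ones (SRT calls this a NAK, or negative acknowledgement) instead of stalling the whole stream the way TCP's in-order delivery would. This is a simplified illustration of the concept, not SRT's actual loss-recovery logic:

```python
def find_gaps(received_seqs):
    """Return the sequence numbers missing from a received packet stream.

    The sender would retransmit exactly these packets in response to a
    NAK, while playback continues with the packets already in hand.
    """
    if not received_seqs:
        return []
    expected = set(range(min(received_seqs), max(received_seqs) + 1))
    return sorted(expected - set(received_seqs))

# Packets 3 and 6 were lost in transit, so the NAK asks for just those:
nak_list = find_gaps([1, 2, 4, 5, 7, 8])
```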

Haivision’s SRT Alliance, created to support the adoption and development of this open-source standard, has helped build a community of more than 300 technology vendors as they continue to drive SRT to become the “de facto low latency video streaming standard in the broadcast and streaming industries.”

The SRT Alliance has many of the biggest international names in tech as members, including Microsoft, Telestream, Alibaba, Google Cloud, Tata Communications, and others.


HDS and MSS

HDS, or HTTP Dynamic Streaming, and MSS, or Microsoft Smooth Streaming, are two streaming protocols on their way out. They were developed by Adobe and Microsoft, respectively, both about 15 years ago. HDS used adaptive bitrates and worked well in the Adobe Flash Player, which is now obsolete, so this protocol’s use will only continue to decline.

Similarly, Microsoft developed MSS, with ABR support, for use with its Silverlight media player, but MSS’s overall performance is not on par with the latest HTTP-based streaming protocols, and Microsoft has discontinued support for it.


We’ve given a broad overview of the most relevant video streaming protocols in use today. While some legacy protocols remain widely used (primarily RTMP and, to a lesser extent, RTSP), they – along with the newer SRT – serve most often in the “first mile,” where the raw video gets compressed, turned into IP packets, and sent along to the next hop, typically a video hosting site or CDN.

For “last-mile” delivery, when the compressed file is decoded and sent to a video player for playback (either in real time or on demand), the leading streaming protocols are HLS (and LL-HLS), MPEG-DASH (and CMAF), and WebRTC.

And if the first and/or last mile does not have access to wired Internet or WiFi, there are devices, such as the Streamer, that can bond multiple 4G/LTE/5G cellular modems into a single high-bandwidth, low-latency Internet connection. This solution is codec and streaming protocol agnostic and can provide rock-solid connectivity for live streaming under the most challenging conditions.

Rob Stone, Mushroom Networks, Inc. 

Mushroom Networks is the provider of Broadband Bonding appliances that put your networks on auto-pilot. Application flows are intelligently routed around network problems such as latency, jitter and packet loss. Network problems are solved even before you can notice.



