HTTP Live Streaming

HTTP Live Streaming (also known as HLS) is an HTTP-based adaptive bitrate streaming communications protocol developed by Apple Inc. and released in 2009. Support for the protocol is widespread in media players, web browsers, mobile devices, and streaming media servers. As of 2022, an annual video industry survey has consistently found it to be the most popular streaming format.[2]

HLS resembles MPEG-DASH in that it works by breaking the overall stream into a sequence of small HTTP-based file downloads, each downloading one short chunk of an overall potentially unbounded transport stream. A list of available streams, encoded at different bit rates, is sent to the client using an extended M3U playlist.[3]
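For illustration, an extended M3U master playlist advertising the same content at several bit rates might look like the following sketch (the URIs, bandwidth figures, and codec strings are invented):

```text
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360,CODECS="avc1.4d401e,mp4a.40.2"
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720,CODECS="avc1.4d401f,mp4a.40.2"
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
high/index.m3u8
```

Each entry points to a media playlist that in turn lists the short segment files for that bit rate.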

Based on standard HTTP transactions, HTTP Live Streaming can traverse any firewall or proxy server that lets through standard HTTP traffic, unlike UDP-based protocols such as RTP. This also allows content to be offered from conventional HTTP servers and delivered over widely available HTTP-based content delivery networks.[4][5][6] The standard also includes a standard encryption mechanism[7] and secure-key distribution using HTTPS, which together provide a simple DRM system. Later versions of the protocol also provide for trick-mode fast-forward and rewind and for integration of subtitles.
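The encryption mechanism is signaled in the playlist itself via the EXT-X-KEY tag. A minimal sketch, assuming AES-128 segment encryption (the key URI and IV are invented placeholders):

```text
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-KEY:METHOD=AES-128,URI="https://example.com/keys/key1.bin",IV=0x0123456789abcdef0123456789abcdef
#EXTINF:6.0,
enc-seg0.ts
#EXTINF:6.0,
enc-seg1.ts
```

The client fetches the key (typically over HTTPS) from the given URI and uses it to decrypt the segments that follow, until a later EXT-X-KEY tag changes the key.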

Apple has documented HTTP Live Streaming as an Internet Draft (Individual Submission), the first stage in the process of publishing it as a Request for Comments (RFC). As of December 2015, the authors of that document had requested that the RFC Independent Stream Editor (ISE) publish it as an informational (non-standard) RFC outside of the IETF consensus process.[8] In August 2017, RFC 8216 was published, describing version 7 of the protocol.[9]

Architecture

HTTP Live Streaming uses a conventional web server that implements HLS support to distribute audiovisual content, and requires specific software, such as OBS, to encode the content into a suitable format (codec) for real-time transmission over a network. The service architecture comprises:

Server
Encodes and encapsulates the input video stream in a format suitable for delivery, then prepares it for distribution by segmenting it into separate files. During ingest, the video is encoded and segmented to generate media fragments and an index file.
  • Encoder: encodes video in the H.264 format and audio in AAC, MP3, AC-3 or EC-3.[10] The result is encapsulated in an MPEG-2 Transport Stream or MPEG-4 Part 14 container for transport.
  • Segmenter: divides the stream into fragments of equal length. It also creates an index file, saved as .m3u8, that contains references to the fragmented files.
Distributor
A standard web server that accepts requests from clients and delivers all the resources (the .m3u8 playlist file and the .ts segment files) needed for streaming.
Client
Requests and downloads all the files and resources, assembling them so that they can be presented to the user as a continuous video stream. The client first downloads the index file via its URL and then the media files it references. The playback software assembles the sequence to allow continuous display to the user.
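The client workflow above can be sketched in a few lines: fetch the index (.m3u8), extract the segment URIs, then fetch each segment in order. The playlist contents and URLs here are invented for illustration, and a real client would of course download over HTTP rather than parse a hard-coded string:

```python
# Minimal sketch of the HLS client workflow: parse a media playlist
# and resolve the segment URIs a player would download in order.
from urllib.parse import urljoin

def parse_media_playlist(text: str, base_url: str) -> list[str]:
    """Return absolute segment URIs from a media playlist."""
    segments = []
    for line in text.splitlines():
        line = line.strip()
        # Tag and comment lines start with '#'; everything else is a URI.
        if line and not line.startswith("#"):
            segments.append(urljoin(base_url, line))
    return segments

example = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXTINF:6.0,
seg0.ts
#EXTINF:6.0,
seg1.ts
#EXT-X-ENDLIST
"""

uris = parse_media_playlist(example, "https://example.com/stream/")
# A real client would now download each URI and feed it to the decoder.
print(uris)
```

The EXT-X-ENDLIST tag marks a finished (video-on-demand) playlist; for live content the client instead re-fetches the playlist periodically to discover new segments.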

Features

HTTP Live Streaming provides mechanisms for players to adapt to unreliable network conditions without causing user-visible playback stalling. For example, on an unreliable wireless network, HLS allows the player to use a lower quality video, thus reducing bandwidth usage. HLS videos can be made highly available by providing multiple servers for the same video, allowing the player to swap seamlessly if one of the servers fails.

Adaptability

To enable a player to adapt to the bandwidth of the network, the original video is encoded in several distinct quality levels. The server serves an index, called a master playlist, of these encodings, called variant streams. The player can then choose between the variant streams during playback, changing back and forth seamlessly as network conditions change.
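The selection logic can be sketched as follows: parse the BANDWIDTH attribute of each EXT-X-STREAM-INF entry in the master playlist and pick the highest-bandwidth variant that fits the measured throughput. The playlist contents are invented, and real players use considerably more sophisticated heuristics (buffer occupancy, throughput smoothing, and so on):

```python
# Hypothetical sketch of HLS variant selection from a master playlist.
import re

def select_variant(master: str, measured_bps: int) -> str:
    """Return the URI of the best variant not exceeding measured_bps."""
    variants = []  # (bandwidth, uri) pairs
    lines = master.splitlines()
    for i, line in enumerate(lines):
        m = re.search(r"#EXT-X-STREAM-INF:.*BANDWIDTH=(\d+)", line)
        if m and i + 1 < len(lines):
            # The URI of a variant stream is on the line after its tag.
            variants.append((int(m.group(1)), lines[i + 1].strip()))
    # Prefer the highest bandwidth that still fits; fall back to the lowest.
    fitting = [v for v in variants if v[0] <= measured_bps]
    chosen = max(fitting) if fitting else min(variants)
    return chosen[1]

master = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080
high/index.m3u8
"""

print(select_variant(master, 3_000_000))  # picks the 2.5 Mbit/s variant
```

Because every variant is cut into segments at the same boundaries, the player can switch variants at any segment boundary without a visible glitch.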

Using fragmented MP4

At WWDC 2016 Apple announced[11] the inclusion of byte-range addressing for fragmented MP4 files, or fMP4, allowing content to be played via HLS without the need to multiplex it into an MPEG-2 Transport Stream. The industry considered this a step towards compatibility between HLS and MPEG-DASH.[12][13]
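With byte-range addressing, a media playlist can reference ranges of a single fMP4 file rather than separate segment files. A sketch of such a playlist (file name, byte offsets, and lengths are invented):

```text
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:6
#EXT-X-MAP:URI="main.mp4",BYTERANGE="720@0"
#EXTINF:6.0,
#EXT-X-BYTERANGE:1508000@720
main.mp4
#EXTINF:6.0,
#EXT-X-BYTERANGE:1494000@1508720
main.mp4
```

The EXT-X-MAP tag points at the initialization section (the MP4 "moov" metadata), and each EXT-X-BYTERANGE tag gives the length and offset of one fragment within the same file, which the client fetches using HTTP range requests.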

Low Latency HLS

Two unrelated HLS extensions with a "Low Latency" name and corresponding acronym exist:

  • Apple Low Latency HLS (ALHLS)[14]
  • The earlier, community-developed Low Latency HLS (LHLS)[15]

The remainder of this section describes Apple's ALHLS. It reduces the glass-to-glass delay when streaming via HLS by reducing the time to start live-stream playback and maintaining that low latency during a live-streaming event. It works by adding partial media segment files into the mix, much like MPEG-CMAF's fMP4. Unlike CMAF, ALHLS also supports partial MPEG-2 TS transport files. A partial media segment is a standard segment (e.g. 6 seconds) split into equal parts of less than a second (e.g. 200 milliseconds). The series of partial segments stands in for the most recent standard segment; earlier segments remain the standard size.[16] HTTP/2 is required to push the segments along with the playlist, reducing the overhead of establishing repeated HTTP/TCP connections.
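A low-latency media playlist advertising partial segments might look like the following sketch (segment names and timings are invented):

```text
#EXTM3U
#EXT-X-VERSION:9
#EXT-X-TARGETDURATION:6
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=0.6
#EXT-X-PART-INF:PART-TARGET=0.2
#EXTINF:6.0,
seg100.mp4
#EXT-X-PART:DURATION=0.2,URI="seg101.part0.mp4",INDEPENDENT=YES
#EXT-X-PART:DURATION=0.2,URI="seg101.part1.mp4"
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="seg101.part2.mp4"
```

Here the EXT-X-PART tags expose 200-millisecond pieces of the segment still being produced, and the EXT-X-PRELOAD-HINT tag lets the client request the next piece before it is complete.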

Other features include:

  • Blocking playlist reloads: the server holds a client's playlist request until a new segment or partial segment is available, instead of having the client poll.
  • Playlist delta updates, which transfer only the changed portion of a long playlist.
  • Rendition reports, which let a client switch bit rates without an extra playlist round trip.
  • Preload hints, which announce upcoming partial segments so that clients can request them early.

Apple also added new tools: tsrecompressor produces and encodes a continuous low latency stream of audio and video. The mediastreamsegmenter tool is now available in a low-latency version. It is an HLS segmenter that takes in a UDP/MPEG-TS stream from tsrecompressor and generates a media playlist, including the new tags above.

Support for low-latency HLS is available in the tvOS 13 beta and in iOS and iPadOS 14.[17] On April 30, 2020, Apple added the low-latency specifications to the second edition of the main HLS specification.[18]

Dynamic ad insertion

Dynamic ad insertion is supported in HLS using splice information based on the SCTE-35 specification. The SCTE-35 splice message is inserted into the media playlist file using the EXT-X-DATERANGE tag. Each SCTE-35 splice_info_section() is represented by an EXT-X-DATERANGE tag with a SCTE35-CMD attribute. A SCTE-35 splice out/in pair signaled by splice_insert() commands is represented by one or more EXT-X-DATERANGE tags carrying the same ID attribute. The SCTE-35 splice-out command should have the SCTE35-OUT attribute and the splice-in command the SCTE35-IN attribute.

Between the two EXT-X-DATERANGE tags that carry the SCTE35-OUT and SCTE35-IN attributes there may be a sequence of media segment URIs. These segments normally represent ad programs, which can be replaced by local or customized ads. The ad replacement does not require changing the media files themselves; only the URIs in the playlist need to be changed to point to different ad programs. The replacement can be done on the origin server or on the client's media-playing device.
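Putting the pieces above together, a playlist excerpt signaling an ad break might look like the following sketch (the dates and segment names are invented, and the SCTE35-OUT/SCTE35-IN binary payloads are truncated for illustration):

```text
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-DATERANGE:ID="splice-1",START-DATE="2024-03-05T11:15:00Z",PLANNED-DURATION=60.0,SCTE35-OUT=0xFC...
#EXTINF:6.0,
ad-seg0.ts
#EXTINF:6.0,
ad-seg1.ts
#EXT-X-DATERANGE:ID="splice-1",START-DATE="2024-03-05T11:16:00Z",SCTE35-IN=0xFC...
#EXTINF:6.0,
content-seg10.ts
```

A server-side or client-side ad inserter can swap the ad-seg*.ts URIs between the two tags for different ad programs without touching the surrounding content segments.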

Server implementations

Notable server implementations supporting HTTP Live Streaming include:

Usage

Supported players and servers

HTTP Live Streaming is natively supported in the following operating systems:

Windows 10 used to have native support for HTTP Live Streaming in EdgeHTML, the proprietary browser engine used in Microsoft Edge (now referred to as Edge Legacy) before the transition to the Chromium-based Blink engine. Edge Legacy was included in Windows 10 up until version 2004 and was replaced by Chromium-based Edge in version 20H2. With Windows 11, Microsoft released an updated Media Player that supports HLS natively.

Clients

Servers

Live Encoders

VOD encoders

See also

References

  1. ^ Pantos, R.; May, W. (2017). "Playlists". HTTP Live Streaming. IETF. p. 9. sec. 4. doi:10.17487/RFC8216. ISSN 2070-1721. RFC 8216. Retrieved Jan 15, 2020.
  2. ^ Lederer, Stefan. "2022 Video Developer Report" (PDF). Bitmovin. Retrieved 25 October 2023.
  3. ^ Jordan, Larry (10 June 2013). "The Basics of HTTP Live Streaming". Larry's Blog. Larry Jordan & Associates. Retrieved 18 June 2013.
  4. ^ "MPEG-DASH vs. Apple HLS vs. Smooth Streaming vs. Adobe HDS". Bitmovin. March 29, 2015.
  5. ^ Chen, Songqing; Shen, Bo; Tan, Wai-tian; Wee, Susie; Zhang, Xiaodong (2006-07-09). "A Case for Internet Streaming via Web Servers". 2006 IEEE International Conference on Multimedia and Expo. pp. 2145–2148. doi:10.1109/ICME.2006.262660. eISSN 1945-788X. ISBN 9781424403677. ISSN 1945-7871. S2CID 9202042.
  6. ^ Songqing Chen; Bo Shen; Wee, S.; Xiaodong Zhang (2007-07-23). "SProxy: A Caching Infrastructure to Support Internet Streaming". IEEE Transactions on Multimedia. 9 (5): 1062–1072. CiteSeerX 10.1.1.74.4838. doi:10.1109/TMM.2007.898943. ISSN 1520-9210. S2CID 870854.
  7. ^ Pantos, R. (30 September 2011). "HTTP Live Streaming". Internet Engineering Task Force. Retrieved 18 June 2013.
  8. ^ "History for draft-pantos-http-live-streaming". Retrieved 2017-04-17. Stream changed to ISE from None
  9. ^ Pantos, Roger; May, William (August 2017). HTTP Live Streaming. doi:10.17487/RFC8216. RFC 8216. Retrieved 2017-09-05.
  10. ^ Roger, Pantos; William, May. "HTTP Live Streaming". tools.ietf.org. Retrieved 2017-01-23.
  11. ^ What's New in HTTP Live Streaming. Apple Developer.
  12. ^ Siglin, Tim (16 June 2016). "HLS Now Supports Fragmented MP4, Making it Compatible With DASH". StreamingMedia.com.
  13. ^ Grandl, Reinhard (15 June 2016). "WWDC16: HLS supports Fragmented MP4 – and gets MPEG-DASH compatible!". Bitmovin.com.
  14. ^ Low-Latency HLS. Apple Developer.
  15. ^ "The community gave us low-latency live streaming. Then Apple took it away". 2019-06-14. Retrieved 2019-06-17.
  16. ^ "Apple Developer Documentation". developer.apple.com. Retrieved 2022-08-10.
  17. ^ Speelmans, Pieter-Jan (2020-12-09). "Low-Latency Everywhere: How to implement LL-HLS across platforms". Theo. Retrieved 2021-03-11.
  18. ^ Pantos, Roger (2020-04-30). "HTTP Live Streaming 2nd Edition". IETF. Retrieved 2020-04-30.
  19. ^ "Video CDN | Video Streaming | Stream Delivery | Fastly". www.fastly.com. Retrieved 2020-10-01.
  20. ^ "Encoding Guide". Limelight Orchestrate Video Support. Limelight Networks. Archived from the original on 2013-08-01. Retrieved 14 November 2013.
  21. ^ "Module ngx_http_hls_module". nginx.org.
  22. ^ "hls-server". npm. 12 February 2018.
  23. ^ "Storm Streaming". Storm Streaming. Retrieved 2021-07-30. Output devices: HLS, MPEG-DASH, WebSocket, RTMP
  24. ^ "Unreal Media Server". umediaserver.net. Retrieved 2021-07-30. Unreal Media Server supports ingesting live streams from wide range of live software and hardware encoders that send streams over WebRTC, RTMP, RTSP, MPEG2-TS, HLS,
  25. ^ "Android 3.0 Platform Highlights". Android Developers. Archived from the original on 2011-01-28.
  26. ^ "webOS 3.0.5 Updates". Archived from the original on 2012-01-22.
  27. ^ "Simplified Adaptive Video Streaming: Announcing support for HLS and DASH in Windows 10". Internet Explorer Team Blog. 29 January 2015.
  28. ^ a b Siglin, Tim (1 November 2010). "First Look: Microsoft IIS Media Services 4". StreamingMedia.com. Retrieved 30 July 2011.
  29. ^ Chan, David (November 26, 2010). "iPad App Review: SlingPlayer". Blogcritics. Archived from the original on April 15, 2014. Retrieved April 14, 2014.
  30. ^ Scott, Andrew (27 Feb 2015). "Audio Factory: an overview". Internet Blog. BBC. the only on-demand assets will be AAC HLS. ... We are still talking to manufacturers and many are confident that they will be able to provide their users with access to all 57 of our HLS AAC streams at 320 kb/s within a few weeks or months.
  31. ^ Shen, Yueshi (2017). "Live video transmuxing/transcoding: FFmpeg vs TwitchTranscoder, Part 1".
  32. ^ "Supported media formats". Android Developers.
  33. ^ "HTTP Live Streaming (HLS) | Can I use... Support tables for HTML5, CSS3, etc". caniuse.com.
  34. ^ "Firefox for Android 50.0, See All New Features, Updates and Fixes". Mozilla.
  35. ^ Giles, Ralph; Smole, Martin (28 November 2017). "DASH playback of AV1 video in Firefox". Mozilla Hacks – the Web developer blog.
  36. ^ "Firefox for Android Beta 59.0beta, See All New Features, Updates and Fixes". Mozilla.
  37. ^ Slivka, Eric (15 November 2010). "Hints of 'iTunes Live Stream' Service Found in iTunes 10.1". MacRumors.
  38. ^ "#2943 (Support for HTTP Live Streaming as a client)". VLC bug tracker. 9 July 2009.
  39. ^ "Playing HLS streaming video with VLC player - The VideoLAN Forums". forum.videolan.org.
  40. ^ "Windows 8 - HTTP Live Streaming". www.3ivx.com.
  41. ^ "3ivx - Xbox Live Developer Partner Program - Component Provider". www.3ivx.com.
  42. ^ NV, THEO Technologies. "HTML5 Video Player – THEOplayer". www.theoplayer.com.
  43. ^ Player, Radiant Media. "Version History - Radiant Media Player". www.radiantmediaplayer.com.
  44. ^ "dailymotion – Medium". Medium.
  45. ^ "hls.js demo page". Archived from the original on November 20, 2015.
  46. ^ "Orange-OpenSource/hasplayer.js". GitHub.
  47. ^ "Spark". Spark.
  48. ^ "google/shaka-player". GitHub.
  49. ^ "Shaka Player Demo". shaka-player-demo.appspot.com.
  50. ^ "Fluid Player - HTML5 video player". www.fluidplayer.com.
  51. ^ "Fluid Player Documentation". docs.fluidplayer.com.
  52. ^ "fluid-player/fluid-player". GitHub.
  53. ^ "QMPlay2 - Qt Media Player 2". October 22, 2023 – via GitHub.
  54. ^ "marakew/AvProxy". GitHub.
  55. ^ Ozer, Jan (2015). "Review: Bitcodin, a Cloud Video Encoding Service From Bitmovin". www.StreamingMediaGlobal.com.
  56. ^ "Delivering HLS Video - Brightcove Learning". support.brightcove.com.
  57. ^ "MediaGoom. Essential Web Streaming".