Low Latency Content Streaming
NW EXP was designed from the outset to accommodate streaming from external providers as well as through the NW Content Service.
Streaming through the Content Service offers complete flexibility for the end-to-end content flow in a variety of scenarios: conventionally via SDI on-site in an OB truck (optionally using an NW AVCoder), remotely in a production setting, or directly in the cloud.
In general, all common industry formats for audio and video are supported, from SRT, RTMP, and WHIP/WHEP for point-to-point connections to HLS, DASH, and MP4 for end-user delivery, along with all common video and audio codecs.
Content ingest and transcoding for end-user distribution can occur in a single step or be split across different facilities. For instance, a stream might travel from SDI to SRT into the cloud, where it is transcoded and packaged for distribution through a Content Delivery Network (CDN).
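As a rough illustration of such a split pipeline, the sketch below spawns ffmpeg from a small Node/TypeScript wrapper to receive an SRT contribution feed and package it as CMAF/HLS segments for a CDN origin. This is not the NativeWaves pipeline; it assumes an ffmpeg build with SRT support, and the port, codecs, and output path are placeholders.

```typescript
// Minimal ingest/transcode sketch (not the NativeWaves pipeline): listen for an
// SRT contribution feed and package it as CMAF/HLS segments for a CDN origin.
// Assumes ffmpeg is on PATH and built with SRT support; port and paths are placeholders.
import { spawn } from "node:child_process";

const ffmpeg = spawn("ffmpeg", [
  "-i", "srt://0.0.0.0:9000?mode=listener",   // contribution feed from the venue (e.g. SDI -> SRT encoder)
  "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
  "-c:a", "aac",
  "-f", "hls",
  "-hls_time", "2",                            // short segments keep end-to-end latency low
  "-hls_segment_type", "fmp4",                 // CMAF-style fragmented MP4 segments
  "-hls_flags", "independent_segments+delete_segments",
  "/var/www/origin/live/stream.m3u8",          // picked up by the origin / CDN
]);

ffmpeg.stderr.on("data", (chunk) => process.stderr.write(chunk));
ffmpeg.on("exit", (code) => console.log(`ffmpeg exited with code ${code}`));
```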
The Content Service integrates with the Origin Service, which makes it possible to plug in a custom CDN and route the entire content flow through it.
Multiple methods are available to synchronize diverse audio, video, and data feeds: automated procedures based on timecodes embedded in the video/audio streams or derived from the content itself, as well as manual adjustments in the Console. The fundamental objective is to preserve the real-world timestamp at which a frame was captured and use this data to synchronize all video, audio, and data feeds precisely.
Because every setup is different, this task can be intricate; NativeWaves therefore streamlines it by automatically extracting time information from content streams in a wide range of formats. Where time information is absent, the system falls back to assumed values.
The ultimate aim is to automate this process as far as possible, so producers are advised to embed timecodes in their video feeds whenever feasible.
Having timing information that correlates the PTS timestamp of the video feed with real-world event time (e.g., frame 85 captured at event time ~12:00:01.500) is ideal. This synchronization can be achieved by embedding metadata within the video feed or by using timing information such as Program Date Time in an HLS manifest.
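As a rough illustration of this mapping, the sketch below anchors a playback position to the first EXT-X-PROGRAM-DATE-TIME tag of an HLS media playlist and converts it to a wall-clock event time. The function names are hypothetical, and real players expose this information directly; the snippet only shows the underlying arithmetic.

```typescript
// Illustrative sketch (names are hypothetical): derive a wall-clock timestamp for a
// playback position by anchoring it to the first EXT-X-PROGRAM-DATE-TIME tag in an
// HLS media playlist.

/** Returns the wall-clock Date of the first segment carrying a program date time, or null. */
function firstProgramDateTime(playlist: string): Date | null {
  for (const line of playlist.split("\n")) {
    if (line.startsWith("#EXT-X-PROGRAM-DATE-TIME:")) {
      return new Date(line.slice("#EXT-X-PROGRAM-DATE-TIME:".length).trim());
    }
  }
  return null;
}

/** Maps a position (seconds since the anchored segment started) to real-world event time. */
function positionToEventTime(anchor: Date, positionSeconds: number): Date {
  return new Date(anchor.getTime() + positionSeconds * 1000);
}

// Example: if the anchor is 12:00:00.000 and a frame sits 1.5 s into the stream,
// its event time is ~12:00:01.500 — the value used to align the other feeds.
const anchor = firstProgramDateTime(
  "#EXTM3U\n#EXT-X-PROGRAM-DATE-TIME:2024-06-01T12:00:00.000Z\n#EXTINF:2.0,\nseg0.m4s\n",
);
if (anchor) {
  console.log(positionToEventTime(anchor, 1.5).toISOString()); // 2024-06-01T12:00:01.500Z
}
```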
At NativeWaves, we believe that low latency is crucial for streaming live events, whether at the venue, on the go, or at home. Conventional streaming methods that introduce delays of over 30 seconds have proven inadequate, particularly compared to the much faster delivery of traditional TV signals. We also want to prevent scenarios where users receive event notifications on their mobile devices before they see the actual moment in the live stream, spoiling it because of a significant delay.
Several low-latency streaming technologies have emerged in the last few years that can reduce latency to a minimum. As part of the NativeWaves Cloud, we offer a Content Service: a low-latency HTTP-based streaming service that builds on the developments of the last five years in this field (CMAF, LL-HLS, HESP, …) and achieves sub-three-second lens-to-screen latency while scaling over conventional CDNs to broadcast audiences of millions of viewers. The system can also be tuned to achieve sub-second latency at the venue, including DRM protection.
NativeWaves EXP leverages existing implementations of video/audio players such as ExoPlayer (Android), AVPlayer (iOS), and Hls.js (Web). Rather than developing its own playback core, NativeWaves EXP closely manages playback and seeking within the content by utilizing these well-established player frameworks.
Since NativeWaves EXP relies on the underlying player implementations, its low-latency streaming capabilities depend on the features and performance offered by each platform's player. Currently, NativeWaves EXP achieves sub-three-second lens-to-screen latency with LL-HLS across all platforms.
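For the web case, a minimal Hls.js setup for an LL-HLS stream might look like the sketch below. The stream URL and tuning values are placeholders, and NativeWaves EXP performs this kind of player configuration internally rather than exposing it to integrators in this form.

```typescript
// Minimal browser-side sketch of attaching an LL-HLS stream with Hls.js.
// The URL and tuning values are placeholders; NativeWaves EXP manages this
// kind of player setup internally rather than exposing it like this.
import Hls from "hls.js";

const video = document.querySelector<HTMLVideoElement>("#player")!;
const src = "https://cdn.example.com/live/stream.m3u8"; // hypothetical LL-HLS endpoint

if (Hls.isSupported()) {
  const hls = new Hls({
    lowLatencyMode: true,   // enable LL-HLS part loading and blocking playlist reloads
    backBufferLength: 30,   // keep only a short back buffer for live playback
  });
  hls.loadSource(src);
  hls.attachMedia(video);
  hls.on(Hls.Events.MANIFEST_PARSED, () => {
    video.play().catch(() => { /* autoplay may require a user gesture */ });
  });
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
  // Safari/iOS: native HLS playback backed by AVPlayer
  video.src = src;
}
```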
Furthermore, we are continuously exploring opportunities to expand streaming protocol support, both by working with commercially available players and by exploring custom player integrations.
The ability to integrate with existing CDN vendors is also key from a business standpoint: broadcasters and streaming platform providers often have contracts in place that optimize the cost of data transfer per gigabyte, making it cost-effective to serve large audiences.
Specifically for in-venue streaming, WebRTC-based systems have emerged that are generally harder to scale cost-effectively but can reduce latency even further (< 100 milliseconds end-to-end), which makes them an interesting choice for various use cases.
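For reference, a browser-side WHEP client for such an in-venue feed can be sketched as follows; the endpoint URL is a placeholder and trickle ICE is omitted for brevity.

```typescript
// Hypothetical WHEP playback sketch for an in-venue WebRTC feed (endpoint URL is a
// placeholder). The client offers receive-only audio/video, POSTs its SDP offer to
// the WHEP endpoint, and applies the SDP answer.
async function playWhep(video: HTMLVideoElement, endpoint: string): Promise<void> {
  const pc = new RTCPeerConnection();
  pc.addTransceiver("video", { direction: "recvonly" });
  pc.addTransceiver("audio", { direction: "recvonly" });

  // Collect incoming tracks into a single stream attached to the video element.
  const stream = new MediaStream();
  pc.ontrack = (event) => stream.addTrack(event.track);
  video.srcObject = stream;

  await pc.setLocalDescription(await pc.createOffer());
  // Wait for ICE gathering so the offer contains all local candidates (no trickle ICE).
  await new Promise<void>((resolve) => {
    if (pc.iceGatheringState === "complete") return resolve();
    pc.addEventListener("icegatheringstatechange", () => {
      if (pc.iceGatheringState === "complete") resolve();
    });
  });

  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: pc.localDescription!.sdp,
  });
  await pc.setRemoteDescription({ type: "answer", sdp: await response.text() });
}

// Usage: playWhep(document.querySelector("video")!, "https://venue.example.com/whep/feed-1");
```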
Our guiding principle is to prioritize the best possible watching experience, which is why we adopt a partnership-driven strategy. We actively engage in close collaboration with leading streaming tech providers to collectively push boundaries and allow for effortless integration within established broadcast and streaming systems.
NW EXP also facilitates combining WebRTC and HTTP-based streaming for individual audio/video feeds, offering the advantages of both technologies: ultra-low latency and scalability combined with instant replay functionality.
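One hypothetical way an application could pair the two paths for a single feed is sketched below: a WebRTC MediaStream (for example obtained via WHEP as above) serves the live edge, while a seekable LL-HLS source of the same feed backs instant replays. The class and its API are illustrative only and do not represent the NW EXP SDK.

```typescript
// Hypothetical sketch (not the NW EXP API) of pairing both delivery paths for one feed:
// a WebRTC stream for the ultra-low-latency live edge and an LL-HLS source for replays.
import Hls from "hls.js";

class DualPathFeed {
  private hls = new Hls({ lowLatencyMode: true });

  constructor(
    private video: HTMLVideoElement,
    private liveStream: MediaStream,  // e.g. obtained via a WHEP client
    private hlsUrl: string,           // placeholder LL-HLS URL for the same feed
  ) {}

  /** Watch the live edge over WebRTC. */
  showLive(): void {
    this.hls.detachMedia();
    this.video.srcObject = this.liveStream;
    void this.video.play();
  }

  /** Jump back a few seconds using the seekable HTTP stream, e.g. for an instant replay. */
  showReplay(secondsBack: number): void {
    this.video.srcObject = null;
    this.hls.loadSource(this.hlsUrl);
    this.hls.attachMedia(this.video);
    this.video.addEventListener("loadedmetadata", () => {
      const seekable = this.video.seekable;
      if (seekable.length > 0) {
        this.video.currentTime = Math.max(
          seekable.start(0),
          seekable.end(seekable.length - 1) - secondsBack,
        );
      }
      void this.video.play();
    }, { once: true });
  }
}
```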