When people refer to quality video today, they often cite HD 1080i. However, there is some confusion about what this means.
If you capture, broadcast, and display raw video at 1080i, with no compression at all, you will be sending 1.493 Gbits/second. Allow about two percent for network overhead and you need 1.523 Gbits/second of bandwidth. Include the audio and you need still more.
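That figure is easy to check back-of-the-envelope. The sketch below assumes a 1920x1080 frame, three color channels at 8 bits each, and 30 full frames per second (1080i delivers 60 interlaced fields per second); the article does not state these parameters, so they are inferred from the 1.493 Gbits/second result.

```python
# Rough check of the raw 1080i bandwidth figures quoted above.
# Assumed parameters (inferred, not stated in the article):
WIDTH, HEIGHT = 1920, 1080
BITS_PER_PIXEL = 3 * 8       # uncompressed 8-bit-per-channel color
FRAMES_PER_SECOND = 30       # 1080i: 60 fields/sec = 30 full frames/sec

raw_bps = WIDTH * HEIGHT * BITS_PER_PIXEL * FRAMES_PER_SECOND
with_overhead_bps = raw_bps * 1.02   # add ~2% network overhead

print(f"raw video:     {raw_bps / 1e9:.3f} Gbits/sec")        # 1.493
print(f"with overhead: {with_overhead_bps / 1e9:.3f} Gbits/sec")  # 1.523
```

Multiplying out, 1920 x 1080 x 24 bits x 30 frames comes to 1,492,992,000 bits per second, matching the 1.493 Gbits/second figure.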
In order to provide live streaming from a mobile production platform, as TV news crews do every day, you must employ encoding that compresses this video to an economically feasible level. Today such compression is commonly applied to a 1080i video stream for delivery in less than 10 Mbps of bandwidth. This is achieved in a variety of ways.
Lossless compression exploits redundancy in the picture – for example, long runs of identical pixels in a contiguous area, such as a large uniform background. But to reach a practicable level of compression, at some point it is typically necessary to apply lossy techniques as well, for example, approximating pixels from the values of nearby pixels. You can then still display the final result in 1080i. But how much lossy compression can you apply before picture degradation becomes apparent to the viewer, or before it becomes unacceptable? The optimal goal for a given encoder, cost aside, would be to compress only to the point where the human eye would begin to notice degradation if any further compression were applied – call that point the sweet spot.
However, the sweet spot for a given encoder at 1080i will depend on many things – the specific encoder employed, its algorithms, the amount of motion in the scene, the amount of color, the eye and the knowledge of the viewer, the display screen, ambient light, and so on. If you varied the encoding rate at which you displayed 1080i for a given group of people, the sweet spots those individuals perceived would very likely span a range.
KenCast uses encoders in its live streaming coverage products that appear to yield a sweet spot somewhere between three and eight Mbps. That is, of course, subjective.
KenCast ran a one-hour demonstration in October with a major carrier. The live streaming broadcast was done from a portable production platform that captured, broadcast, and displayed at 1080i. The broadcast was done by bonding four wireless IP networks (three Verizon LTEs, one Verizon 3G). These were publicly shared networks on which available bandwidth fluctuated continually. The bonding yielded an aggregate bandwidth that was consistently sustainable in the range of three to four Mbps, with occasional spikes (the highest slightly over six Mbps) and dips (the deepest recorded at 1.5 Mbps).
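The bonding idea can be sketched simply: aggregate throughput is the sum of whatever each member link can carry at that moment, and outgoing packets are spread across the links in proportion to their capacity. The per-link figures below are invented for illustration; they are not measurements from the demonstration, and this is not KenCast's actual bonding algorithm.

```python
# A simplified illustration of bonding several wireless IP links.
# Per-link throughputs are hypothetical snapshot values (Mbps).
links_mbps = {
    "LTE-1": 1.1,
    "LTE-2": 0.9,
    "LTE-3": 1.2,
    "3G":    0.4,
}

# Aggregate bandwidth is simply the sum of the member links.
aggregate = sum(links_mbps.values())
print(f"bonded aggregate: {aggregate:.1f} Mbps")

def split_packets(num_packets, links):
    """Assign packets to links in proportion to each link's current
    throughput -- a crude weighted scheduler for illustration."""
    total = sum(links.values())
    return {name: round(num_packets * bw / total)
            for name, bw in links.items()}

shares = split_packets(100, links_mbps)
print(shares)   # faster links carry proportionally more packets
```

On real shared networks the per-link figures change continually, so a production bonding system re-measures and re-balances on the fly.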
We used Adaptive Bit Rate (ABR) encoding, which dynamically changed the encoding rate to match the available aggregate bandwidth of the moment. With ABR encoding it was possible to continuously capture, broadcast, and display the video at 1080i. The quality of the picture varied, of course, with the bandwidth of the moment and the resulting encoding rate. Did it reach and stay at or above the viewers' sweet spot? That was up to each viewer to decide.
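The core of the ABR idea can be sketched as a rate-selection rule: pick the highest encoding rate that fits within the bandwidth currently measured on the bonded link, leaving some headroom for fluctuation. The ladder values and headroom factor below are illustrative assumptions, not KenCast's actual encoder settings.

```python
# A hedged sketch of adaptive-bit-rate selection. The encoder can
# switch among a fixed ladder of rates (Mbps); these values are
# hypothetical, chosen to span the 1.5-6+ Mbps range seen in the demo.
RATE_LADDER_MBPS = [1.5, 2.5, 3.5, 4.5, 6.0, 8.0]

def select_encoding_rate(available_mbps, headroom=0.85):
    """Return the highest ladder rate that fits within the measured
    bandwidth, keeping ~15% headroom for network fluctuation.
    Falls back to the lowest rate when even that does not fit."""
    budget = available_mbps * headroom
    usable = [rate for rate in RATE_LADDER_MBPS if rate <= budget]
    return usable[-1] if usable else RATE_LADDER_MBPS[0]

# Bandwidth samples similar to those reported in the demonstration:
for bw in [3.2, 4.0, 6.1, 1.5]:
    print(f"{bw} Mbps available -> encode at {select_encoding_rate(bw)} Mbps")
```

As the bonded bandwidth rose past six Mbps the encoder could step up the ladder, and during the 1.5 Mbps dip it would drop to its floor rate, which is why the picture quality varied while the 1080i stream itself never stopped.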
However, the good news is that the wireless IP carriers are now providing enough bandwidth with their 3G and 4G networks that it is possible for even small independent producers to enter the realm of true HD for their live video streaming of events from portable production platforms -- with nothing more than a phone call.
Edited by Rich Steeves