Trends in R&D on Optical and IP Transmission Technology for Program Production

Takuya Kurakake

Advances in communication technology such as the internet are increasing the use of optical fiber and Internet Protocol (IP) lines for the transmission of materials used in live program production. Here, we describe R&D trends in such use, provide an overview of optical and IP transmission technology used in live program production, and discuss issues concerning 8K support and the use of IP. As R&D addressing these issues, we also describe trends in the long-distance optical transmission of high-bit-rate 8K program material and in IP production systems designed to increase the efficiency of live program production using IP transmission technology.

1. Introduction

For the transmission of live production materials from a venue to a broadcasting station and for the distribution of materials among studios within a station, there is increasing use of optical fibers in addition to the conventional coaxial cables and wireless links. The reason is that transmission over broadband, low-loss optical fiber*1 can dramatically increase capacity and extend distance compared with conventional methods. The adoption in recent years of local area network (LAN) technology for low-cost optical IP transmission and routing has enabled the bidirectional transmission of large volumes of data in various signal formats regardless of the physical location of the connected devices, and there is a movement to take advantage of those features to implement more efficient program production systems. Here, we review the evolution of optical and IP transmission technology for live program production, including the related technology. We then describe issues related to the long-distance transmission of 8K material and to the use of IP in program production, and explain trends in R&D aimed at resolving those issues and making live program production more efficient.

2. Development of related technology

Before presenting an overview of the use of optical and IP transmission technology in live program production, we outline related technology, namely (1) how the electrical interfaces (mainly coaxial cable) used to connect video devices have developed with the evolution of broadcast media and (2) how optical fiber technology has developed in the telecommunications sector*2, which has a far greater market scale than broadcasting.

2.1 Broadcast media progress and changes in device interfaces

Japanese TV broadcasting began in 1953, with black-and-white programming followed by color broadcasting, Hi-Vision (high-definition television; HDTV)*3, and then the launch of the 4K/8K satellite broadcasting service in December 2018. The evolution of broadcast media has been accompanied by the development of new electrical interfaces for connecting the video devices used in program production. The changes in the main broadcasting media and device interface since the introduction of Hi-Vision are shown in Fig. 1.

In the 1990s, individual coaxial cables were generally used to carry high-definition analog component signals (Y, Pb, Pr)*4 between devices. From the mid-1990s, however, there was steady progress in the standardization of digital transmission systems. In 1998, a serial transmission method known as the High-Definition Serial Digital Interface (HD-SDI) for uncompressed high-definition digital signals (about 1.5 Gbps) compatible with 1080/60I*5 was standardized by the Society of Motion Picture and Television Engineers (SMPTE)*6 as SMPTE 292M2)*7. That brought the digital interface into the mainstream.

Since 2000, R&D on 4K/8K Ultra-High-Definition Television (UHDTV) has been vigorous, and HD-SDI was initially adopted for the device interface. In 2006, multiple links of 3G-SDI, which was standardized as SMPTE ST 4243) for 1080/60P*8, came to be used for 4K/8K. Currently, 12G-SDI, standardized as SMPTE ST 2082-14) in 2015, is mainstream. These standard interfaces are intended for connecting devices over transmission distances of about 100 m, so long-distance video transmission requires optical fiber transmission technology.

Because the device interface described above (SDI) is not compatible with the 120 Hz frame frequency standard of 8K, a new interface was developed and standardized5).

Figure 1: Transitions in main broadcasting media and the device interface

2.2 Transitions in optical fiber communication technology

(1) Backbone network6) 7)

The interest in fiber optic communications increased significantly in 1970, when the potential for practical use was demonstrated by the announcement of an optical fiber with a transmission loss of 20 dB/km by Corning and the development of a semiconductor laser that operates continuously at room temperature by Bell Laboratories. In the 10 years that followed, there was a transition from the level of basic research to the possibility of practical application through achievements such as reduction of the transmission loss from 20 to 0.2 dB/km. In Japan, the Nippon Telegraph and Telephone Public Corporation (currently, NTT) began the commercial use of optical fiber transmission in 1981.

Subsequent progress in optical fiber communication technology for telecom carrier backbone networks is shown in Fig. 2. Telecom signals consist mainly of voice, which is low in bandwidth and easily digitized, and digital data from computers and other equipment, so digital transmission has been used since the early days of optical fiber transmission. Even when the terminal equipment at both ends of the transmission line is expensive, as typified by submarine cables, the investment pays off if the capacity per fiber can be increased or the repeater spacing can be extended. For that reason, vigorous R&D continued and progress has been rapid (Fig. 2).

The initial increase in transmission capacity per fiber came from improvements in time division multiplexing (TDM) based on advances in optical devices. In the 1990s, fiber amplifiers came into practical use, and capacity was subsequently increased with wavelength division multiplexing (WDM) technology. In the 2010s, a further increase in capacity was achieved with multilevel modulation technology*9 using digital signal processing and other such means. Various optical devices such as semiconductor lasers and optical fiber connectors*10 that were developed at that time are also being used in LAN systems.

Concerning the types of signals handled by telecom backbone networks, the demand for voice communication exceeded the demand for data communication until around 2000, but since then data communication demand has increased and voice traffic has decreased. As a result, packet transmission has become the more efficient approach for voice as well, and a transition to a closed IP network*11 referred to as the Next-Generation Network (NGN), which began in 2008, continues to move forward. The same transition is occurring in the access network*12 as well as the backbone network. NTT, for example, plans to complete the transition to a full-IP NGN by 2025.

Figure 2: Evolution in optical fiber communication technology for carrier backbone networks (Ref. 7)

(2) LANs

Local area networks (LANs) are used to connect devices within small areas such as homes, offices, laboratories, and factories. In the beginning, they were implemented with various types of technology, but Ethernet8) became the mainstream after the standardization of Fast Ethernet in 1995, which provided a speed of 100 Mbps with a simple, cost-effective mechanism*13. Optical fiber transmission for 100 Mbps Ethernet was also standardized and was commonly used in backbone networks for large buildings*14. Since the advent of gigabit-class Ethernet, optical fiber cable has led unshielded twisted pair (UTP) Ethernet cable in terms of standardization and widespread use. Ethernet speeds have continued to increase, and the standard now reaches 400 Gbps. The data rates of the SDI used for video signal transmission and of Ethernet are compared in Fig. 3. Because SDI relies on coaxial cable transmission, achieving much higher speeds is considered difficult, so LAN technology is also being investigated as an interface for 4K and 8K video signals. Ethernet switches are increasing in capacity (number of ports) and decreasing in price*15, so interest in using them for audio and video signal switching (routing) is also increasing.

Figure 3: Comparison of SDI and Ethernet data rates19)

3. Use of optical and IP transmission in live program production

The history of the use of optical and IP transmission in live program production is outlined in Table 1. Optical transmission was first used around 1980 and progressed to long-distance transmission and multichannel distribution. In the 2000s, R&D on 8K became active, 10 Gbps Ethernet was standardized for LANs, and the cost of gigabit Ethernet continued to decrease. Investigation of the use of LAN technology in program production then began, and investigation of the use of IP technology followed somewhat later. Section 3.1 presents the history of the use of optical transmission technology in the production of live programming, and section 3.2 explains the use of LAN and IP technology.

Table 1: Use of optical and IP transmission in live program production

3.1 Optical technology

Optical transmission technology was introduced to program production in response to the demand for long-distance transmission, which is problematic with coaxial cable, and the need to increase capacity by carrying multiple signals over a single cable. The use of optical fiber transmission for a live golf event in Japan in 1978 is an early example9). At that time, digital transmission of video was difficult because video signals have a much larger bandwidth than audio signals, so until the mid-1980s video continued to be transmitted as standard-TV composite signals*16 or HDTV component signals (Y, Pb, and Pr). These signals were transmitted by modulating the emission intensity of a light-emitting diode (LED) or semiconductor laser diode (LD) and were carried over individual multimode fibers for distances of up to several kilometers*17.

Our research on the optical fiber transmission of program material began in earnest in the 1980s, as the move toward practical high-definition broadcasting gained traction, and FM multiplexed optical transmission technology was developed with the objective of long-distance multichannel transmission, which had not previously been achieved10). In that technique, separate carrier waves are frequency-modulated (FM) by the Y, Pb, and Pr component signals of the Hi-Vision material, and the frequency-multiplexed signal is then used to modulate the light intensity. The technique was applied in a live transmission device for experimental Hi-Vision broadcasting from the 1988 Seoul Olympics. In the same year, in-station transmission equipment for standard TV based on the technique was introduced in the network operation center of the NHK Broadcasting Center.

In response to progress in the digitization of Hi-Vision program production in the early 1990s, technology was developed for transmitting uncompressed digital Hi-Vision signals at high speed (about 1.5 Gbps at that time) over long distances using optical fiber cables. For the efficient use of frequencies and clock recovery, a paired selected ternary (PST) code was adopted*18 and a PST codec chip was developed to implement a practical digital transmitter for Hi-Vision11). That device was used for broadcasting the 1992 Barcelona Olympics and on many other occasions.

In the mid-1990s, practical technology for transmitting an optical signal modulated with serial data from an uncompressed digital Hi-Vision signal (about 1.5 Gbps) became available. We began research on high-density wavelength division multiplexing for optical transmission that is capable of distributing various types of high-data-rate signals such as Hi-Vision and standard television, and in 1995, feasible technology for multiplexing 32 optical signals at wavelength intervals of 1 nm was achieved. In cooperation with the NHK Engineering Administration Department and Broadcast Engineering Department, we developed a distribution system for Hi-Vision signals between the network operation center of the NHK Broadcasting Center and the studio and introduced the system in 199812).

In the 2000s, we began full-fledged R&D on 8K Super Hi-Vision (referred to simply as 8K below). At the same time, we also started R&D on optical fiber transmission technology for uncompressed 8K signals and IP transmission technology for compressed 8K signals for use in program production13) 14) 15) 16). The work on 8K long-distance transmission is described in chapters 4 and 5.

3.2 LAN and IP technology

Beginning in the mid-2000s, the incorporation of high-speed, general-purpose Ethernet and IP systems into live production systems was investigated as a means of reducing transmission and routing costs with widely used LAN technology. One result of that work is the SMPTE ST 2022 standard17), which was issued in 2012 and is still in wide use for data transport in broadcasting stations.

Subsequently, interest in features such as the bidirectionality and expandability afforded by IP technology increased, and work began on using the technology to build more flexible production systems*19. Also, when high-resolution 4K and 8K video is transmitted as SDI signals, multiple SDI links must be connected and carried in parallel, which makes the system complex. The use of IP technology was also studied as a solution to that problem, and various manufacturers developed and commercialized their own IP transmission systems. In particular, systems such as those installed in outside broadcasting vans are largely self-contained, and their weight can be reduced by using optical fiber rather than coaxial cable. For those reasons, there have been multiple cases of such IP transmission systems being introduced in Japan.

However, standardizing the IP transmission method to ensure the interoperability of devices would enable many manufacturers to enter the market and thus reduce the cost of introduction. For that reason, there was a strong desire within international standards organizations for early formulation of such a standard, and an IP transmission method for program production was standardized as SMPTE ST 2110 in 201718). The main differences between ST 2022 described earlier and ST 2110 are listed in Table 2, and typical usage is shown in Fig. 419).

With ST 2022, the SDI signal used at the broadcasting site serves as the interface, and only the section to be transmitted is carried over the IP network (Fig. 4(a)). The component signals (video, audio, ancillary data, and so forth) multiplexed in the SDI format are converted to IP packets sequentially as a single stream, so the resulting signal is suited to point-to-point transmission.

ST 2110, on the other hand, was intended to achieve greater flexibility in program production systems with IP technology. The IP signal serves as the interface, so each essential signal (video, audio, ancillary data, etc.) can be transmitted as a separate stream, and point-to-multipoint transmission is possible. As shown in Fig. 4(b), for example, the video signal from a venue or studio can be transmitted to the studio control room of broadcasting station A and, after switching, sent back to the venue and to the audio mixing room of broadcasting station B, while the audio can be sent to the same mixing room of station B.
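The separation into per-essence streams can be illustrated with a minimal sketch. The following Python fragment builds bare RFC 3550 RTP headers and sends video, audio, and ancillary data as independent streams to separate multicast groups; the addresses, ports, and payload types are illustrative assumptions and do not reproduce the actual ST 2110 payload packing formats.

    import socket
    import struct

    def rtp_packet(payload_type, seq, timestamp, ssrc, payload):
        # Minimal RTP header (RFC 3550): version 2, no padding/extension/CSRC.
        header = struct.pack('!BBHII',
                             0x80,
                             payload_type & 0x7F,
                             seq & 0xFFFF,
                             timestamp & 0xFFFFFFFF,
                             ssrc & 0xFFFFFFFF)
        return header + payload

    # Each essence is carried as its own RTP stream on its own multicast group,
    # so a receiver can subscribe to the video without taking the audio, and so on.
    # Group addresses, ports, and payload types below are illustrative only.
    streams = {
        'video':     ('239.10.10.1', 5004, 96),
        'audio':     ('239.10.10.2', 5004, 97),
        'ancillary': ('239.10.10.3', 5004, 100),
    }

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 16)

    for name, (group, port, pt) in streams.items():
        pkt = rtp_packet(pt, seq=0, timestamp=0, ssrc=hash(name) & 0xFFFFFFFF,
                         payload=b'example payload')
        sock.sendto(pkt, (group, port))

Because each stream has its own destination group, the point-to-multipoint routing of Fig. 4(b) reduces to having each receiver join only the multicast groups it needs.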

Although the standardization of IP transport is proceeding in this way, IP production systems differ from SDI systems in that, rather than simply connecting cables, operators must set the IP addresses of the program production devices and configure controls such as switch routing.

A prerequisite for remote production, which is expected to be implemented with IP technology, is that remote devices can be controlled via an IP network. In a production system that uses SDI, the device control functions use vendor-specific methods. The initial production systems that applied IP technology also used proprietary methods, but progress in IP transport standardization created the opportunity for an open standard for control methods as well. Discussions of control methods at the Advanced Media Workflow Association (AMWA)*20 moved forward, and a control method standard for device discovery and addition to the network was issued in 2016. The AMWA control method20) 21) was standardized with the cooperation of program production equipment vendors and broadcasting stations and was adopted as the recommended control method by the Alliance for IP Media Solutions (AIMS)*21. Also, the European Broadcasting Union (EBU) announced that it will require vendors to support the AMWA control system22).

SMPTE, AMWA, EBU, and the Video Services Forum (VSF)*22 created the Joint Task Force on Networked Media (JT-NM)23) to discuss a roadmap and standard implementation methods for the development of IP and cloud production facilities.

As described above, the standardization of the components of an IP control system has been progressing, and examples of use around the world have appeared. Work on using these technologies to achieve greater efficiency in program production systems is described in chapters 4 and 6.

Table 2: Differences between ST 2022 and ST 2110
         | Essentials (video, audio, supplementary data, etc.) | Synchronization | Routing
ST 2022  | SDI is converted to IP with the essentials kept in SDI format *1 (transmitted as a single IP stream) | Frequency synchronization with RTP *2 timestamps, etc. | Unicasting and multicasting both possible
ST 2110  | Each essential signal is converted to IP separately | Terminals are synchronized to a common time source by PTP *3 | Multicasting is basic
Figure 4: Use scenarios for ST 2022 and ST 211019)

4. Issues related to 8K video transmission and increasing the efficiency of program production

4.1 Use of optical transmission technology

The transmission rates for 8K video are very high: about 144 Gbps for uncompressed full-featured 8K (frame frequency of 120 Hz, chroma sampling*23 of 4:4:4, and bit depth of 12 bits) and about 40 Gbps for uncompressed full-resolution 8K (frame frequency of 60 Hz, chroma sampling of 4:2:2, and bit depth of 10 bits), the specification currently used for program production. The resulting technical difficulty is a major issue in the development of 8K video transmission equipment and in long-distance transmission.
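The figures above follow directly from the video parameters. As a minimal sketch of the arithmetic (active pixels only, ignoring blanking, audio, and ancillary overhead), the raw bit rate is the product of pixel count, frame frequency, samples per pixel, and bit depth:

    def bit_rate_gbps(width, height, fps, samples_per_pixel, bit_depth):
        # Raw bit rate of the active picture in Gbit/s.
        return width * height * fps * samples_per_pixel * bit_depth / 1e9

    # Full-featured 8K: 120 Hz, 4:4:4 (3 samples per pixel), 12-bit depth.
    print(bit_rate_gbps(7680, 4320, 120, 3, 12))   # ~143.3 -> "about 144 Gbps"

    # Full-resolution 8K: 60 Hz, 4:2:2 (2 samples per pixel on average), 10-bit depth.
    print(bit_rate_gbps(7680, 4320, 60, 2, 10))    # ~39.8 -> "about 40 Gbps"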

The lines that are available for use in live 8K program production, etc., include dark fibers*24, IP lines provided by telecom carriers, and Ethernet lines. When dark fibers are used, a system that is capable of economically extending the relay distance is needed. For the use of IP lines provided by a telecom carrier or Ethernet lines, the line cost must be reduced with low-delay, high-quality image compression, and error measures must be applied to mitigate the degradation of link quality such as packet loss and fluctuations in packet arrival time (jitter).

4.2 Issues related to using IP technology to improve efficiency in live program production

As described in chapter 3, standardization of the basic parts of the IP production system has been moving forward, and its application to routine program production (tasks that repeatedly use the same network and production equipment configurations) has begun, mainly overseas*25. The main issues for the future are listed below.

  • Construction of a large-scale system
    How and to what extent system redundancy, stability, and ease of updating should be secured and how systems that can handle 2K, 4K, and 8K can be constructed, etc.
  • Application to nonroutine program production tasks
    How to achieve efficiency in the construction and updating of the network system for program production.
  • Software for the future
    How to further enhance functions with software and construct program production systems using general-purpose IT equipment such as servers and the cloud.
  • Ensuring security
    What level of security is needed for a closed network that is dedicated to program production, what risks should be assumed, and what measures should be adopted to ensure security?

From the viewpoint of the increased scale and efficiency described above, one issue is the development of network monitoring techniques for IP production systems. An IP network makes it easy to multiplex multiple signals, such as video, audio, and control, but it also introduces the possibility of packet loss from network congestion, which does not occur in SDI systems. It is therefore important to monitor the IP flows (series of related IP packets) on the IP lines in real time. Also, signals in an IP network pass through a number of network switches, so the transmission route of each IP flow must be known to locate and deal with points of failure. IP network monitoring technology is thus required, but conventional network monitoring technology cannot perform real-time monitoring with the accuracy that a program production system requires.

Security can basically be ensured by applying IT security technology as needed, but the safe sharing of devices also requires a mechanism for preventing contention in device control.

5. R&D trends in long-distance transmission of 8K materials

This chapter focuses on our R&D on the long-distance transmission of signals that carry 8K materials. To address the issues described in section 4.1, we investigated how to make maximum use of the results of technical development in the telecom field for the long-distance transmission of signals at 40 Gbps or higher. For the five years from 2007 to 2011, we were commissioned by the New Energy and Industrial Technology Development Organization (NEDO) to develop "Next-Generation High-Efficiency Network Device Technology." Assuming future 8K distribution within the broadcasting station, we developed technology for multiplexing four signals using optical time division multiplexing (OTDM) of optical channel transport unit-3 (OTU3) frames26), which have a transmission rate of 40 Gbps27).

Subsequently, the arrival of practical 100G Ethernet brought expectations of lower prices as the devices came into widespread use, and we investigated the transmission of uncompressed full-featured 8K signals over 100G Ethernet. Specifically, we studied video clock recovery on the receiving side28) and measures against packet loss29) to achieve stable, low-delay transmission, and we produced a prototype to demonstrate the results.

One problem with using dark fibers is the time-consuming and costly installation of erbium-doped optical fiber amplifiers (EDFAs)*26 at intervals of a few tens of kilometers along the transmission line. To address that problem, we conducted experiments on transmitting full-featured 8K video over a distance of 300 km without repeaters by using distributed Raman amplification, in which the transmission fiber itself serves as the amplification medium*27, together with Return-to-Zero Differential Quaternary Phase Shift Keying (RZ-DQPSK) modulation*28, which is robust against chromatic dispersion. The results showed that the transmission distance can be extended by 100 km relative to the case in which distributed Raman amplification is not used30).

When investigating repeaters to be used with dark fibers, compatibility with the equipment interface must be considered. The U-SDI interface5) supports formats up to uncompressed full-featured 8K video. U-SDI is a baseband multilink optical transmission interface*29 that can carry uncompressed 4K and 8K video with various parameters over 10.692 Gbps non-return-to-zero (NRZ) signals (10G link signals). We developed a long-distance transmission method for uncompressed full-featured 8K video30) that conforms to that standard and extends the transmission distance by applying an error correction code31). We also built an experimental device implementing the method and conducted joint experiments with the National Institute of Advanced Industrial Science and Technology (AIST) using a dispersion-compensating fiber (DCF)*30 and an optical fiber amplifier, succeeding in the long-distance transmission of uncompressed 8K video*31 between NHK STRL and AIST Tsukuba (173 km)32).

Concerning technology for using telecom carrier lines, we have developed technology*32 for transmitting full-featured 8K video over the 10 Gbps lines provided by various carriers using a mezzanine compression technique*33 such as JPEG-XS33)*34. We have also developed a method for mapping compressed video signals to Real-time Transport Protocol (RTP) payloads to mitigate video degradation caused by packet loss on the line34), as well as error correction and redundancy functions for improving reliability and robustness against continuous packet loss35)*35. We are also investigating mezzanine compression for transmission that enables the unified treatment of 2K, 4K, and 8K signals to further increase the efficiency of live program production36).
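The idea behind this kind of packet-level protection can be sketched with a simple XOR parity scheme: one parity packet protects a block of equal-length packets and allows a single lost packet in the block to be rebuilt. This is only an illustration of the principle; the actual error correction and redundancy methods of Refs. 34) and 35) are more elaborate.

    def xor_parity(packets):
        # Byte-wise XOR of a block of equal-length packets.
        parity = bytearray(len(packets[0]))
        for pkt in packets:
            for i, b in enumerate(pkt):
                parity[i] ^= b
        return bytes(parity)

    def recover(received, parity):
        # Rebuild a single missing packet (marked None) from the parity packet.
        missing = [i for i, p in enumerate(received) if p is None]
        if len(missing) != 1:
            return received                 # nothing lost, or too many losses to repair
        rebuilt = bytearray(parity)
        for p in received:
            if p is not None:
                for i, b in enumerate(p):
                    rebuilt[i] ^= b
        received[missing[0]] = bytes(rebuilt)
        return received

    # Example: one parity packet protects a block of four media packets.
    block = [bytes([n] * 8) for n in range(4)]
    parity = xor_parity(block)
    damaged = [block[0], None, block[2], block[3]]   # second packet lost in transit
    assert recover(damaged, parity)[1] == block[1]

Larger blocks lower the bandwidth overhead but weaken robustness against bursts of consecutive packet losses, which is why practical schemes typically combine interleaving or multidimensional parity arrangements.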

6. R&D trends in IP production systems for more efficient live program production

The IP flow monitoring described in section 4.2 could be carried out at the network switches, but doing so might affect switch performance. We therefore developed a device that can monitor the IP flows on multiple lines in real time (Fig. 5)37). The prototype monitoring device comprises a network switch equipped with a field-programmable gate array (FPGA) and an aggregation application. The device is inserted into the line to be monitored and copies only the incoming packets that require analysis, outputting them to the aggregation application, which measures packet loss, jitter, and so on. The original packets input to the FPGA-equipped network switch are output unchanged from the output interface paired with the input interface.
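As an illustration of what such an aggregation application measures, the following sketch estimates packet loss from gaps in the RTP sequence numbers and interarrival jitter using the RFC 3550 estimator. It is a simplified model of the processing, not the implementation of Ref. 37); the 90 kHz clock rate is an assumption typical of video streams.

    class FlowStats:
        # Per-flow statistics computed from copied RTP packets.

        def __init__(self, clock_rate=90_000):
            self.clock_rate = clock_rate   # RTP clock rate (90 kHz is typical for video)
            self.expected_seq = None
            self.lost = 0
            self.jitter = 0.0              # RFC 3550 interarrival jitter, in RTP clock units
            self.prev_transit = None

        def update(self, seq, rtp_timestamp, arrival_time_s):
            # Packet loss: count gaps in the 16-bit RTP sequence number.
            if self.expected_seq is not None:
                self.lost += (seq - self.expected_seq) & 0xFFFF
            self.expected_seq = (seq + 1) & 0xFFFF

            # Jitter: smoothed difference of relative transit times (RFC 3550).
            transit = arrival_time_s * self.clock_rate - rtp_timestamp
            if self.prev_transit is not None:
                d = abs(transit - self.prev_transit)
                self.jitter += (d - self.jitter) / 16.0
            self.prev_transit = transit

For every copied packet, the aggregation side would call update() with the sequence number and timestamp parsed from the RTP header and the local arrival time, then report the accumulated loss count and jitter for each flow.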

We are also investigating the use of an IP network for device sharing to improve efficiency and extend the existing production functions of studios. Sharing production equipment over an IP network differs from having the devices at hand in that the devices can be accessed by anyone, which creates the possibility of production being disrupted by unintended operation when a device that is in use receives a control command from a different program production. To avert that problem, we developed a mechanism for disabling control from another program production while a device is in use by extending the AMWA IS-04*36 application programming interface (API)*37 so that the controlled device can decide whether or not to execute a received control command38) 39).
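A minimal sketch of this arbitration idea is shown below. The class name, the production identifier tag, and the accept/reject policy are hypothetical and only illustrate refusing control from another program production while a device is reserved; they are not the actual IS-04 extension of Refs. 38) and 39).

    class SharedDevice:
        # Hypothetical shared production device with simple control arbitration.

        def __init__(self, device_id):
            self.device_id = device_id
            self.owner = None                       # production currently using the device

        def reserve(self, production_id):
            if self.owner in (None, production_id):
                self.owner = production_id
                return True
            return False

        def release(self, production_id):
            if self.owner == production_id:
                self.owner = None

        def handle_command(self, production_id, command):
            # Execute the command only if the device is free or owned by the requester.
            if self.owner not in (None, production_id):
                return {'status': 'rejected', 'reason': 'device in use by another production'}
            return {'status': 'executed', 'command': command}

    camera = SharedDevice('camera-01')
    camera.reserve('program-A')
    print(camera.handle_command('program-B', 'pan_left'))   # rejected
    print(camera.handle_command('program-A', 'pan_left'))   # executed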

Future program production systems are likely to be implemented with general-purpose IT equipment and software, and we are also investigating mixed 2K, 4K, and 8K systems and the future shift toward software implementation of functions. With a view to a future system in which software implementation makes it possible to increase or decrease cloud resources according to the scale of program production, such as the number of cameras and the resolution (2K, 8K, etc.), we developed a system that divides 4K or 8K signals into multiple 2K signals in units of pixels40). We also demonstrated the feasibility of performing 8K video switching and other processing by parallel execution of multiple instances of 2K processing software on a general-purpose server.
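One plausible way to divide a frame "in units of pixels" is a pixel-interleave partition, sketched below with NumPy: an 8K frame (4320 x 7680) becomes sixteen 2K sub-frames (1080 x 1920), each of which can be processed by an independent instance of 2K software and later reassembled. The division used in Ref. 40) may differ in its details.

    import numpy as np

    def split_into_2k(frame, factor=4):
        # Divide a frame into factor*factor sub-images by pixel interleaving.
        return [frame[r::factor, c::factor] for r in range(factor) for c in range(factor)]

    def merge_from_2k(subframes, factor=4):
        # Reassemble the original frame from its interleaved sub-images.
        h, w = subframes[0].shape[:2]
        frame = np.empty((h * factor, w * factor) + subframes[0].shape[2:],
                         dtype=subframes[0].dtype)
        for i, sub in enumerate(subframes):
            r, c = divmod(i, factor)
            frame[r::factor, c::factor] = sub
        return frame

    # 8K frame with three 10-bit components stored in 16-bit integers.
    frame_8k = np.random.randint(0, 1024, size=(4320, 7680, 3), dtype=np.uint16)
    subs = split_into_2k(frame_8k)
    assert subs[0].shape == (1080, 1920, 3)
    assert np.array_equal(merge_from_2k(subs), frame_8k)

In a pixel-interleave partition, each sub-frame is a spatially decimated version of the whole picture, so any one sub-frame can also serve as a 2K proxy for monitoring while full-resolution processing runs in parallel.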

Figure 5: Concept of real-time IP flow monitoring system37)

7. Conclusion and future development

We have described the evolution of optical and IP transmission technology for use in program production and related issues as well as trends in our own research and development. Optical and IP transmission technology is taking on an increasingly active role in program production because of the broadband and low-loss features of optical fibers, the low cost of LAN equipment, and the many-to-many connectivity of IP networks. We can expect that the use of optical and IP transmission technology in program production will continue to increase with advances in communication technology.

With the adoption of IP technology, the movement to build systems using general-purpose IT equipment and software and to maintain constant system evolution through software has been gaining momentum in various fields, and broadcasting is no exception. Advanced software is susceptible to bugs that may affect system operation, so various backup methods must be investigated.

We believe that our role is to conduct continuous research and development to solve both existing problems and those that emerge in the future and to work towards the standardization of the results as necessary to enable the widespread adoption of optical and IP transmission technology.