The use of frequency-division multiplexing (FDM) goes back almost a century. In telegraphy, for example, several low-rate signals were carried over a relatively wide-bandwidth channel by assigning a different carrier frequency to each signal. Guard bands were inserted between the carriers to make sure that the signals would not overlap:

However, such an approach is inefficient by today's standards: the guard bands consume spectrum without carrying any data. The first actual orthogonal frequency-division multiplexing (OFDM) scheme dates back to 1966, when Robert W. Chang published his pioneering work on the synthesis of band-limited orthogonal signals for multi-channel data transmission. The main idea behind OFDM is to divide a frequency-selective channel into a number of parallel, frequency-flat sub-channels. By making the sub-channels narrowband, each one experiences almost flat fading, which simplifies the receiver design. Using Fourier transforms, Chang improved on classic FDM by allowing the sub-channels to overlap while remaining orthogonal:
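To see why overlapping sub-channels can still be separated, consider the inner product of two complex-exponential subcarriers spaced at integer multiples of the symbol rate. The short Python sketch below (the parameter values and names are illustrative, not taken from Chang's paper) verifies numerically that distinct subcarriers are orthogonal over one symbol interval even though their spectra overlap:

import numpy as np

# Illustrative sketch: complex exponentials spaced at integer multiples
# of 1/N (normalized frequency) overlap spectrally, yet their inner
# product over one symbol interval is zero.
N = 64                                 # samples per OFDM symbol (assumed value)
n = np.arange(N)

def subcarrier(k):
    # Complex exponential for subcarrier index k over one symbol.
    return np.exp(2j * np.pi * k * n / N)

# Normalized inner product: 1 when k == m, ~0 otherwise.
for k, m in [(3, 3), (3, 4), (5, 9)]:
    ip = np.vdot(subcarrier(k), subcarrier(m)) / N
    print(f"<s_{k}, s_{m}> = {ip:.3f}")

Because the inner product of distinct subcarriers is zero, no guard band is needed between them, which is exactly the spectral-efficiency gain over classic FDM.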

In 1967, Saltzberg analyzed and demonstrated the performance of efficient parallel data transmission systems. He concluded that the design of an efficient parallel system should concentrate on reducing crosstalk between adjacent channels rather than on perfecting the individual channels themselves. At that time, sustaining orthogonality with an analog system was a major obstacle. As the number of subcarriers increased, the modulation, synchronization, and coherent demodulation required complicated OFDM circuitry and costly hardware. This led to impractical analog implementations of the Fourier transform using banks of oscillators, whose frequency drift created inter-channel interference (ICI) and destroyed orthogonality.

Throughout the development of OFDM technology, there have been a number of remarkable contributions. The first milestone came in 1971, when Weinstein and Ebert used the discrete Fourier transform (DFT) to perform baseband modulation at the transmitter and demodulation at the receiver. This evolution is what makes today's low-cost OFDM systems possible. Inter-symbol interference (ISI) and ICI were mitigated by using a guard time between the symbols and raised-cosine windowing in the time domain; Weinstein and Ebert also added a guard interval for the case of multipath channels. Even though the proposed system did not achieve perfect orthogonality among the subcarriers over a time-dispersive channel, it was nevertheless an important contribution.
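The following minimal Python sketch (assuming QPSK symbols, an ideal channel, and no windowing or guard interval; all names and values are illustrative) shows the core of DFT-based OFDM: a single inverse FFT replaces the bank of analog subcarrier modulators at the transmitter, and a single FFT recovers the symbols at the receiver:

import numpy as np

# Illustrative sketch of DFT-based OFDM over an ideal channel.
rng = np.random.default_rng(0)
N = 64                                  # number of subcarriers (assumed value)

# Map random bits to QPSK symbols, one symbol per subcarrier.
bits = rng.integers(0, 2, size=(N, 2))
symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

tx = np.fft.ifft(symbols)               # transmitter: frequency -> time
rx = np.fft.fft(tx)                     # receiver: time -> frequency

print(np.allclose(rx, symbols))         # True: symbols recovered exactly

Since the DFT can be computed with the fast Fourier transform, the entire modulator and demodulator reduce to inexpensive digital signal processing, with no oscillator bank to drift out of alignment.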

We had to wait until 1980 for the orthogonality problem to be solved by Peled and Ruiz. In their OFDM scheme, a cyclic extension (now commonly referred to as the "cyclic prefix") replaces the conventional null guard interval of the OFDM symbol. The cyclic extension converts the channel's linear convolution into a circular convolution, ensuring orthogonality over a time-dispersive channel and completely eliminating ISI between successive symbols (and ICI between subcarriers), as long as the cyclic extension remains longer than the impulse response of the channel:

As shown in the figure above, the cyclic extension duration T_CP is long enough to absorb the tail of the delayed copy of the signal. Orthogonality is preserved by removing the cyclic extension before applying the fast Fourier transform at the receiver.
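A short numerical sketch makes the cyclic-prefix mechanism concrete. In the Python snippet below (the channel taps, CP length, and variable names are illustrative assumptions, not from Peled and Ruiz's paper), discarding the CP at the receiver turns the channel's linear convolution into a circular convolution of the useful N samples, so each subcarrier needs only a single complex-gain (one-tap) equalizer:

import numpy as np

# Illustrative cyclic-prefix demonstration.
rng = np.random.default_rng(1)
N, Lcp = 64, 8                          # subcarriers, CP length (assumed values)
h = np.array([1.0, 0.5, 0.25])          # 3-tap multipath channel, shorter than Lcp

symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N) / np.sqrt(2)
tx = np.fft.ifft(symbols)
tx_cp = np.concatenate([tx[-Lcp:], tx])  # prepend the cyclic prefix

rx = np.convolve(tx_cp, h)               # linear convolution with the channel
rx = rx[Lcp:Lcp + N]                     # remove the CP (it absorbs the channel tail)

H = np.fft.fft(h, N)                     # channel frequency response
eq = np.fft.fft(rx) / H                  # one-tap equalization per subcarrier

print(np.allclose(eq, symbols))          # True: ISI removed, orthogonality preserved

Note that without the cyclic prefix (or with a prefix shorter than the channel's impulse response), the final check would fail: energy from the previous symbol would leak into the FFT window and the subcarriers would no longer be orthogonal.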

It's important to understand, after reading this short history of where CP-OFDM comes from, that development continues to this day. A lot of research around OFDM is still being done, including peak-to-average power ratio (PAPR) reduction, channel estimation and equalization, and synchronization (in both time and frequency). You can find some posts on these topics at our blog, www.nutaq.com/blogs.
