Fourier analysis represents a perfect square wave as an infinite sum of harmonically related sine waves. The required sine-wave components comprise all the odd harmonics of the fundamental square-wave frequency, with their amplitudes set according to the pattern 1, 1/3, 1/5, 1/7, and so forth. Even harmonics are not required. Fourier theory calls out an amplitude pattern that guarantees a sum which, in the limit, approaches a square-wave shape with amplitude π/4, or about 0.7853982.
Figure 1 illustrates the first few terms of the harmonic series. The first waveform is a simple sine wave at the fundamental frequency. The next waveform shows the fundamental plus its third harmonic, the next takes harmonics to the fifth, and so on. The more harmonic terms you add, the more "squarish" the waveform becomes, but it still looks wiggly.
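The harmonic sums of Figure 1 are easy to reproduce numerically. The short Python sketch below (my own illustration, not part of the original figure) sums the odd harmonics with the 1, 1/3, 1/5, ... amplitude pattern; at the center of the flat top the sum converges toward π/4.

```python
import math

def square_wave_partial_sum(t, n_terms):
    """Sum the first n_terms odd harmonics, k = 1, 3, 5, ...,
    each with amplitude 1/k, per the Fourier square-wave series."""
    return sum(math.sin((2 * j + 1) * t) / (2 * j + 1)
               for j in range(n_terms))

# At t = pi/2 (mid flat-top) the series becomes the Leibniz sum
# 1 - 1/3 + 1/5 - ..., which converges to pi/4 ~ 0.7853982.
mid = square_wave_partial_sum(math.pi / 2, 1000)
```

With 1000 odd harmonics, `mid` sits within about 0.0005 of π/4, consistent with the limiting amplitude quoted above.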
Notice in Figure 1 that the peak ripple amplitude is the same for successive approximations using 3, 5, or 7 harmonics. As you move toward more and more harmonics, the edges square up considerably and the ripples become faster and less significant in the flat parts of the waveform, but the peak ripple error right next to each rising or falling edge does not change. Every transition creates a little "burst" of ripples before and after the transition. The burst always has the same peak amplitude, regardless of how many more harmonics you add.
In the limit, as you approach an infinite number of harmonics, the burst frequency increases without bound and the burst duration shrinks to zero, so the waveform develops flat tops and bottoms between transitions. The ripples remain confined to an ever-narrower region on either side of each transition. Eventually, the harmonic approximation converges, in a least-mean-square-error sense, to the final square-wave shape.
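You can check the fixed peak-ripple claim numerically. This sketch (my own verification, reusing the 1/k odd-harmonic series) finds the maximum of the partial sum near a rising edge for several harmonic counts; the peak hovers near 0.926 no matter how many harmonics you take, roughly 18% above the π/4 flat-top level, which is the classic Gibbs behavior.

```python
import math

def partial_sum(t, n_terms):
    # Odd harmonics k = 1, 3, ..., 2*n_terms - 1, each with amplitude 1/k
    return sum(math.sin((2 * j + 1) * t) / (2 * j + 1)
               for j in range(n_terms))

def peak_near_edge(n_terms, samples=8000):
    # Scan (0, pi/2] finely enough to catch the first overshoot lobe,
    # which sits near t = pi / (2 * n_terms)
    step = (math.pi / 2) / samples
    return max(partial_sum(i * step, n_terms)
               for i in range(1, samples + 1))

peaks = [peak_near_edge(n) for n in (10, 50, 200)]
# All three peaks land near 0.926, independent of the harmonic count
```

The spread among the three peaks is well under one percent, even though the number of harmonics varies by a factor of 20.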
Many engineers wonder how many harmonic terms they must take to adequately represent a good square wave. In other words: what bandwidth does a digital transmission system need? Let's explore that question by first examining the amplitudes that the harmonic-sum model uses. Fourier theory applies only to an infinite sum, which never occurs in practice. If you have a finite sum, with only a finite number of sine-wave components, why not try amplitudes different from the ideal infinite-sum values? Instead of chopping off the infinite sum after the seventh harmonic, with subsequent amplitudes dropping abruptly to zero, the right side of Figure 2 smoothly tapers the harmonic amplitudes so that, by the time you get to the last one, its amplitude is small. That taper mitigates the discontinuity at the end of the harmonic sequence, producing a better waveform.
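As one concrete example of such a taper, the sketch below rolls off the harmonic amplitudes with Lanczos sigma factors; this is my own choice of schedule, not necessarily the one in Figure 2. With the same number of harmonics, the tapered sum overshoots the flat top far less than the abruptly truncated sum.

```python
import math

N = 10  # number of odd harmonics in both sums

def truncated(t):
    # Ideal 1/k amplitudes, chopped abruptly after the (2N-1)th harmonic
    return sum(math.sin((2 * j + 1) * t) / (2 * j + 1) for j in range(N))

def tapered(t):
    # Same harmonics, amplitudes smoothly rolled off by Lanczos sigma
    # factors, so the last harmonic's amplitude is nearly zero
    total = 0.0
    for j in range(N):
        k = 2 * j + 1
        x = math.pi * k / (2 * N)
        total += (math.sin(x) / x) * math.sin(k * t) / k
    return total

step = (math.pi / 2) / 4000
trunc_peak = max(truncated(i * step) for i in range(1, 4001))
taper_peak = max(tapered(i * step) for i in range(1, 4001))
# taper_peak exceeds pi/4 far less than trunc_peak does
```

The price of the smaller ripple is a slower edge: the taper trades rise time for flatness, which is exactly the bandwidth-versus-waveform-quality trade the article is describing.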
Both the number of harmonics and the precise schedule of their proportions affect the quality of the result. Stated differently, when considering the adequacy of bandwidth, not only the -3-dB frequency of your system but also the precise shape of its entire transfer function may affect your result. Harry Nyquist fleshed out that notion in his theorem of data transmission, concluding that, given infinite complexity in the receiver and perfect control over the exact system-transfer function, the bandwidth must equal or exceed half the system data rate to achieve reliable binary communication.
Suppose that a system transmits data with a baud frequency of B. The fastest alternating pattern you can send is the sequence 1, 0, 1, 0, ... . That pattern is based on a repeating cell two bits long, 1 and 0, so the pattern has a fundamental frequency equal to only half of B. According to Nyquist, if you pass that much of the signal, you can in theory recover all the data.
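In numbers, using a hypothetical 1-Gbaud link for illustration:

```python
B = 1e9                          # baud rate: hypothetical 1-Gbaud link
cell_period = 2 / B              # the repeating "1, 0" cell spans two bit intervals
f_fundamental = 1 / cell_period  # = B / 2 = 500 MHz
nyquist_bandwidth = B / 2        # Nyquist's minimum bandwidth for this rate
```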
However, you don't have access to a receiver of infinite complexity or perfect control over the system-transfer function. You have a digital system that needs a wide-open eye with plenty of margin for setup and hold, so you must design your system to accommodate the rise and fall time, not just the baud rate.
I maintain that the bandwidth required to preserve a rise and fall time of T as it propagates through a system equals approximately 0.5/T. If your rise and fall times equal 10% of the data-baud interval, so that T equals 0.1/B, then the required system bandwidth of 0.5/T works out, after suitable algebraic manipulation, to approximately five times B. That simple calculation may explain why people occasionally say the fifth harmonic is all you need. I hope it is clear from this article that the situation is considerably more complex than that. The desired amount of ripple and the desired edge rise/fall time both play a crucial role in determining how many harmonics, and therefore how much bandwidth, you need.
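Here is that algebra spelled out, again with a hypothetical 1-Gbaud rate:

```python
B = 1e9               # baud rate (hypothetical example)
T = 0.1 / B           # rise/fall time: 10% of the baud interval
bandwidth = 0.5 / T   # required bandwidth ~ 0.5 / T
ratio = bandwidth / B # ~ 5: about five times the baud rate
```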
I will end this article with a quote from one old digital-system designer who, when asked how many harmonics of a data sequence are necessary for data transmission, replied, constitutionally speaking, "I take the Fifth."