This DesignCon 2003 discussion panel, hosted by Henri Merkelo of AtSpeed Technologies, addressed advanced transceiver architectures for high-speed backplane applications.
On behalf of Accelerant Networks, I warmed up the audience with a 10-minute summary of the case for multi-level signaling (specifically PAM-4) in the backplane environment. This presentation differs from my earlier DesignCon 2000 conceptual presentation, Multi-Level Signaling, because here I get to show actual measured eye patterns using the Accelerant PAM-4 transceiver operating at 6 Gb/s on an ordinary serial backplane that was originally designed for only 2.5 Gb/s.
What follows are my slides and speaking notes.
As part of today's panel discussion I'd like to throw out a few quick thoughts about high-speed serial links.
In coming up with this presentation I was inspired by a fable: the story of a contest of speed between a quick but feckless rabbit (the hare) and a more slowly-moving, but very determined, tortoise.
What that has to do with the title of this presentation you will see a little later.
To begin, let's look at the fundamental effects that limit the distances we can achieve at very high speeds.
First, there is the dielectric loss. This relates to the fact that all insulating materials, when placed in a microwave oven, heat up. The amount of heating may be very great or very little, but the heating effect is always present. The degree of heating in the presence of microwaves is specified by the loss tangent of the material.
For example, you may have purchased some "microwave safe" dishes. These have a relatively low dielectric loss tangent, and so don't heat up very much in the microwave oven. Such a material would make a fair ceramic substrate for the conveyance of high-speed digital signals.
Other materials, like chicken, heat up tremendously when exposed to microwaves. In essence, the chicken absorbs quite a bit of the microwave energy to which it is exposed. That makes chicken a terrible insulating media for use in a high-speed digital product [laughter]. For one thing, it's difficult to get anything to stick to it, because of the grease, and for another the dielectric losses will quickly absorb your fast-moving transients.
For a material with a flat loss tangent (meaning that it does not vary with frequency), you lose a certain fraction of your signal power every time it wiggles up and down. Total losses therefore vary in proportion to how often you wiggle.
Mathematically, we say that dielectric loss (in decibels) grows in direct proportion to frequency.
Materials in popular use for high-speed digital applications are specified with a worst-case (i.e., flat) loss tangent.
The actual loss tangent may not be flat, but from a worst-case design perspective you must assume that it is flat and equal to the worst-case specification at all frequencies.
The skin effect losses are a form of resistive trace loss. These losses are a manifestation of the basic principle that any conductor, upon the passage of current, heats up. Since part of your signal power is converted to heat, there is that much less power available at the end of a long trace to activate the receiver.
The exact amount of resistive heating depends on the distribution of current density within the cross-section of the trace.
What's important when considering the skin effect is to remember that the interior of a trace is self-shielded by the top and bottom surfaces of the conductor.
I show in this next figure a photomicrograph of the cross-section of a typical pcb trace. On the right I've marked the effective thickness of shielding (skin depth) required to provide a significant degree of self-shielding effect.
No magnetic fields, and thus no currents, penetrate to the interior of the conductor. Therefore, all the high-frequency signal current rides in a shallow band just underneath the surface of the conductor.
The self-shielding effect becomes progressively more pronounced at higher frequencies, further restricting the useful current-carrying cross-sectional area of the trace.
The net result is that the effective AC resistance of a trace, and thus your signal loss in dB, rises in proportion to the square root of frequency.
Both the dielectric and skin-effect loss mechanisms tend to shrink the fastest-moving portions of a digital signal.
If you want to know precisely how much shrinkage to expect, get a copy of my new book, High-Speed Signal Propagation, which contains all the equations you need to produce these time-domain displays of skin-effect and dielectric-effect loss mechanisms.
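The scaling of the two loss mechanisms can be sketched in a few lines of code. This is not from the book; it is a minimal illustrative model, with hypothetical per-trace coefficients, showing only that skin-effect loss grows as the square root of frequency while dielectric loss grows linearly, so dielectric loss dominates at high frequencies:

```python
import math

def channel_loss_db(f_hz, a_skin=1.6e-5, a_diel=2.2e-9):
    """Total trace loss in dB at frequency f_hz.

    a_skin scales the sqrt(f) skin-effect term; a_diel scales the
    linear-in-f dielectric term. Both coefficients are hypothetical
    illustrative values, not measurements of any particular trace.
    """
    return a_skin * math.sqrt(f_hz) + a_diel * f_hz

# Loss grows monotonically with frequency, with the linear
# dielectric term taking over from the sqrt skin-effect term:
low  = channel_loss_db(100e6)    # at 100 MHz
high = channel_loss_db(3.125e9)  # at the Nyquist rate of a 6.25 Gb/s binary link
```

Because the model is additive in dB, doubling the trace length simply doubles both coefficients.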
Equalization can help.
Here I show an example of a fixed transmit-based equalization scheme. This is called "pre-emphasis".
It is based on the observation that the signal after each changing edge seems to have some initial difficulty crossing the receiver threshold. To counteract this tendency, the transmitter amplifies each changing edge, and then quickly reverts to a more nominal sustaining level for the long, flat parts of the transmitted signal.
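The behavior described above can be sketched as a two-tap FIR filter: full amplitude on every transition, dropping back to a reduced sustaining level when the bit repeats. The tap weight here is a made-up illustrative value, not the coefficient of any particular transmitter:

```python
def preemphasize(bits, boost=0.25):
    """Two-tap FIR pre-emphasis: amplify each changing edge, then
    revert to a lower sustaining level for repeated bits.

    bits: sequence of 0/1 values; returns transmit levels.
    boost is a hypothetical pre-emphasis weight.
    """
    symbols = [2 * b - 1 for b in bits]  # map {0,1} -> {-1,+1}
    out = []
    prev = 0.0
    for s in symbols:
        out.append((1 + boost) * s - boost * prev)  # y[n] = (1+b)x[n] - b*x[n-1]
        prev = s
    return out

# A run of 1s followed by a falling edge: the first symbol and the
# edge are boosted; the repeated bits settle to a lower level.
levels = preemphasize([1, 1, 1, 0])  # -> [1.25, 1.0, 1.0, -1.5]
```

The boosted edges pre-compensate for the high-frequency loss of the channel, so the signal arriving at the receiver crosses the threshold more decisively.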
A simple fixed equalizer can often add 50% to the attainable distance in a digital channel. Given zero additive noise, you can theoretically equalize a channel to achieve arbitrarily high bandwidths, but two factors limit this achievement:
- Circuits are never completely noise-free.
- The faster you go, the more elaborate you must make your equalizer.
In a backplane environment, most of the noise comes from crosstalk. The next measurement comes from one of the 350 data channels held in a physical library of backplanes by Accelerant Networks. We've been cataloging backplane performance there for some time. The pictures always look like this.
- The signal amplitude (in dB) falls off in proportion to frequency. That means the signal amplitude in volts falls off exponentially with frequency.
- The crosstalk amplitude generally grows in proportion to frequency.
- Between the two effects, the usable channel bandwidth is severely pinched.
In this example, binary operation at 3.125 Gb/s (which yields a Nyquist rate, or maximum alternation rate, of 1.5625 GHz), can reliably take place with a signal to crosstalk ratio (SCR) of about 20 dB.
Doubling the speed of operation to 6.25 Gb/s (a Nyquist rate of 3.125 GHz) drops the SCR considerably.
In this case the SCR drops to a value less than 6 dB, an unacceptable point of operation.
This figure illustrates what I call a bandwidth cliff, a point where the operating characteristics fall off so rapidly with frequency that ordinary binary operation above that speed is not possible.
Note that equalizing this channel would not improve its operating characteristics. It is the SCR, not the absolute received signal amplitude, that limits the performance.
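The pinch between falling signal and rising crosstalk can be sketched numerically. This is a hypothetical model, with coefficients chosen only to reproduce the ballpark 20 dB and sub-6 dB figures quoted above, not data from any measured backplane:

```python
def scr_db(f_hz, sig_slope=6.0e-9, xtalk_floor=-34.5, xtalk_slope=3.28e-9):
    """Signal-to-crosstalk ratio (dB) versus frequency for a
    hypothetical backplane channel. All three coefficients are
    illustrative assumptions.
    """
    signal_db = -sig_slope * f_hz                 # loss grows linearly in dB
    xtalk_db = xtalk_floor + xtalk_slope * f_hz   # crosstalk grows with frequency
    return signal_db - xtalk_db

scr_slow = scr_db(1.5625e9)  # Nyquist of binary 3.125 Gb/s: about 20 dB
scr_fast = scr_db(3.125e9)   # Nyquist of binary 6.25 Gb/s: under 6 dB
```

Because both slopes act together, the SCR collapses much faster than either the loss curve or the crosstalk curve alone would suggest, which is exactly the cliff shape in the figure.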
Mankind has a long history of experience with bandwidth-limited channels. As you can see here, in 350 B.C. the bandwidth of this channel was limited by the speed of the slave's arm, and the accuracy of the water-clock.
Here is the story of the signaling system of Aeneas, as told in "The Early History of Data Networks", by Gerard J. Holzmann and Bjorn Pehrson, IEEE Computer Society Press, 1994, ISBN 0-8186-6782-6. I quote from p. 24,
Polybius, in The Histories, Book X, describes:
The first person who we reliably know developed a telegraph was Aeneas (not the Aeneas from Vergil's Aeneid.) Aeneas was a well-known author in ancient Greece, who lived around 350 B.C. and wrote works on military strategy. Only part of his main work, "The Art of War", still exists, and unfortunately it does not contain the description of his telegraph.
We do have a clear description of Aeneas's design by the historian Polybius (ca. 200-118 B.C.). In "The Histories" Polybius first described, in crystal-clear prose, the limitations of plain beacon fires.
"I think that as regards the system of signaling by fire, which is now of the greatest possible service in war but was formerly underdeveloped, it will be of use not to pass it over but to give it a proper discussion. It is evident to all that in every matter, and especially in warfare, the power of acting at the right time contributes very much to the success of enterprises, and fire signals are the most efficient of all the devices which aid us to do this. For they show what has recently occurred and what is still in the course of being done, and by means of them anyone who cares to do so even if he is at a distance of three, four, or even more days' journey can be informed. So that it is always surprising how help can be brought by means of fire messages when the situation requires it. Now in former times, as fire signals were simple beacons, they were for the most part of little use to those who used them. For the service should have been performed by signals previously determined upon, and as facts are indefinite, most of them defied communication by fire signals. To take the case I just mentioned, it was possible for those who had agreed on this to convey information that a fleet had arrived at Oreus, Peparethus, or Chalcis, but when it came to some of the citizens having changed sides or having been guilty of treachery or a massacre having taken place in the town, or anything of the kind, things that often happen, but cannot all be foreseen--and it is chiefly unexpected occurrences which require instant consideration and help--all such matters defied communication by fire signal. For it was quite impossible to have a pre-concerted code for things which there was no means of foretelling."
The Greek historian Polybius then goes on to describe how Aeneas in 350 BC used PWM to communicate complex messages with a binary signaling scheme. This is how Polybius described it, over 2,100 years ago:
"Aeneas, the author of the work on strategy, wishing to find a remedy for the difficulty, advanced matters a little, but his device still fell far short of our requirements, as can be seen from his description of it. He says that those who are about to communicate urgent news to each other by fire signal should procure two earthenware vessels of exactly the same width and depth, the depth being some three cubits and the width one. Then they should have corks made a little narrower than the mouths of the vessels [so that the cork slides through the neck and drops easily into the vessel] and through the middle of each cork should pass a rod graduated in equal sections of three finger-breadths, each clearly marked off from the next. In each section should be written the most evident and ordinary events that occur in war, e.g., on the first, "Cavalry arrived in the country," on the second "Heavy infantry," on the third "Light-armed infantry," next "Infantry and cavalry," next "Ships," next "Corn," and so on until we have entered in all the sections the chief contingencies of which, at the present time, there is a reasonable probability in wartime. Next, he tells us to bore holes in both vessels of exactly the same size, so that they allow exactly the same escape. Then we are to fill the vessels with water and put on the corks with the rods in them and allow the water to flow through the two apertures. When this is done it is evident that, the conditions being precisely similar, in proportion as the water escapes the two corks will sink and the rods will disappear into the vessels. When by experiment it is seen that the rapidity of escape is in both cases the same, the vessels are to be conveyed to the places in which both parties are to look after the signals and deposited there. Now whenever any of the contingencies written on the rods occurs he tells us to raise a torch and to wait until the corresponding party raises another.
When both the torches are clearly visible the signaler is to lower his torch and at once allow the water to escape through the aperture. Whenever, as the corks sink, the contingency you wish to communicate reaches the mouth of the vessel he tells the signaler to raise his torch and the receivers of the signal are to stop the aperture at once and to note which of the messages written on the rods is at the mouth of the vessel. This will be the message delivered, if the apparatus works at the same pace in both cases. "
Not many would have trouble reproducing the device from this description. It appears that precisely this system was used for the communications of the Roman troops at Carthage, on the Tunisian coast of North Africa, and Sicily in the second century A.D., long after Polybius.
We have developed a number of ways of dealing with limited bandwidth.
For example, circa 150 B.C. the system shown previously had been improved, as shown below, in two ways. First, you will notice the use of multiple torches, which multiplies their effectiveness. Next, if you look closely, it appears that the slaves are now wearing fire-protective coverings on their heads, covering their hair. That's a good thing when you must work with a lot of torches in a confined space.
The next example shows how we deal with limited bandwidth in modern times.
This picture shows measured waveforms using an Accelerant PAM-4 multilevel transceiver running at a baud rate of 2.5 Gbaud (~5 Gb/s).
At startup (without equalization) the received data eye is closed. After automatic convergence the transmitter pre-distorts (equalizes) the transmitted PAM-4 signal so that the received signal is easily recovered.
The transmitted baud interval is 400 ps, yielding a delivered data rate of roughly 5 Gb/s.
On this particular backplane the difference in attenuation between 2.2 and 4.4 Gbaud is precisely 9 dB. Therefore, the PAM-4 eye openings at 2.2 Gbaud (=4.4 Gb/s) are the same as the binary eye openings at 4.4 Gbaud (=4.4 Gb/s).
In the absence of crosstalk or other noise these schemes would perform the same. When you add crosstalk the difference quickly becomes apparent—the PAM-4 system, because it works at a lower bandwidth, is far less susceptible to crosstalk.
Note that PAM-4 enjoys a huge advantage in jitter, since the PAM-4 eye is twice the width of the binary 4.4-Gbaud eye. The PAM-4 system is less susceptible to random jitter, and since it operates at a lower bandwidth it picks up less noise in the first place.
On this backplane, the overall SNR using PAM-4 in a realistic multi-transceiver environment would be far better than with binary at the same delivered data rate.
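The arithmetic behind the comparison above can be sketched briefly. PAM-4 carries two bits per symbol, so at the same delivered bit rate its symbol rate, and hence its Nyquist frequency, is half that of a binary link. The Gray-coded level map shown is an assumed mapping for illustration, not Accelerant's actual encoding:

```python
# Assumed Gray-coded bit-pair -> level map (illustrative only)
PAM4_LEVELS = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_encode(bits):
    """Encode an even-length 0/1 bit sequence as PAM-4 symbol levels."""
    assert len(bits) % 2 == 0
    return [PAM4_LEVELS[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def nyquist_hz(bit_rate, bits_per_symbol):
    """Nyquist frequency = half the symbol rate."""
    return bit_rate / bits_per_symbol / 2

bit_rate = 5e9                         # 5 Gb/s delivered
binary_nyq = nyquist_hz(bit_rate, 1)   # 2.5 GHz for binary signaling
pam4_nyq = nyquist_hz(bit_rate, 2)     # 1.25 GHz: half the bandwidth
```

Operating at half the Nyquist frequency moves the PAM-4 system back from the bandwidth cliff, which is where its crosstalk and jitter advantages come from.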
Multi-level signaling is evolving towards ever-smaller, and ever-faster, digital channels.
This progression has been going on in the electrical world for over a hundred years. It's unstoppable, and now it's finally come time for us to adopt multi-level signaling on the backplane.
Notes regarding use of multi-level signaling in telegraph systems in 1872 are taken from "Edison: A Life of Invention", by Paul Israel, John Wiley & Sons, 1998 ISBN 0-471-52942-7. This excerpt describes part of the development of Edison's duplex system which independently modulated the amplitude and polarity of the telegraph signal to double the transmitted bit rate on what was essentially a bandwidth-limited communication channel.
[p. 79] [in 1872] "Edison took a significantly different approach from the one most commonly employed by those working on duplex telegraphy. Most other inventors sought to balance the [receiving] relay electrically at the transmitting station to prevent it from responding to [outgoing] signals. Edison instead used a neutral relay at one end to respond to variations in current strength and the familiar polarized relay at the other end to respond to changes in the polarity of the current."
I'll wrap up this talk with a couple of examples, showing measurements taken from a backplane that benefited from a PAM-4 implementation.