XAUI signal detect
Dawson:
In an editor's note on page 278, you indicate that signal detect was added to XAUI as part of the resolution of comment 930. Was this left to editorial license, or was a specific remedy voted on? We think this is a bad idea, for reasons given below, and will recommend that it be reversed. We'll submit a comment, but I wanted to find out where it originated.
The note indicates it was a response to comment #930, which presumes the existence of a simple retimer in the PMA that needs to relay a loss-of-light input over XGXS (XAUI) without use of an LF code, presumably by squelching its output. Since the XGXS is AC coupled, squelching will leave the differential inputs at the DTE end biased at their switching point. The addition of a signal detect function is, I believe, an attempt to recognize this condition and ensure that the lack of a valid signal is detected at the DTE.
We think that this new function isn't needed, and that it will have a significantly negative impact on XAUI performance and reliability. It's not needed because even a simple retimer could easily implement a mode that repeatedly outputs the LF sequence interspersed with idle whenever the signal detect input from the optics is deasserted, as sketched below. It may be acceptable not to randomize idles in this fault condition. This method of communicating LF is extraordinarily simple; there's no reason to define another one. Moreover, the basic function of the retimer is to reset the jitter budget. Since this is best done with an implementation that fully decouples the media clock from the XGXS clock, the proffered case of a simple retimer without this capability may be rare. A full retimer, which would include a clock tolerance FIFO capable of IDLE insertion/removal, could obviously generate LF sequences. Easing implementation of the regenerator-style retimer does not justify burdening every XGXS implementation with a significant performance and reliability penalty.
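To make the alternative concrete, here is a minimal sketch (Python) of the fault mode we have in mind. The column encoding, the signal_detect interface, and the 1:3 LF-to-idle duty cycle are all illustrative choices, not anything the draft defines:

    from itertools import cycle

    # Illustrative 4-lane XGMII-style columns. "SEQ" stands for the
    # Sequence control character; the trailing 0x01 marks Local Fault.
    LF_COLUMN   = ("SEQ", 0x00, 0x00, 0x01)
    IDLE_COLUMN = ("I", "I", "I", "I")

    def fault_pattern():
        # One LF ordered set followed by a few idle columns, repeated
        # forever. Any similar duty cycle would do; the point is that
        # the XAUI lanes keep toggling instead of going quiet.
        return cycle([LF_COLUMN, IDLE_COLUMN, IDLE_COLUMN, IDLE_COLUMN])

    def retimer_tx(signal_detect, recovered_columns):
        # Pass recovered data through while the optics see light;
        # otherwise substitute the LF/idle pattern. No receiver-side
        # analog signal detect is needed to notice the fault.
        fault = fault_pattern()
        for column in recovered_columns:
            yield column if signal_detect() else next(fault)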
The more important objection is that implementing an analog signal detect will reduce performance and reliability of all XAUI implementations to support a rare case. Here's why:
Typical forward crosstalk for 50 Ohm signals implemented in stripline construction with 9 mil spacing is about 5%. This value saturates in only 2 cm of side-by-side run for the rise times typical of XAUI signals. 5% crosstalk with an 800 mV single-ended drive results in 40 mV of single-ended noise coupled onto the line, from a single interferer and a coupled length of 2 cm. For even modest run lengths, and including other noise effects, a minimum of 100 mV of effective differential noise would be expected. This is by no means worst case.
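The arithmetic, for reference (all figures as quoted above; the aggregation to 100 mV is an engineering estimate, not a computed bound):

    # Coupled noise from a single 2 cm interferer at saturated crosstalk.
    coupling   = 0.05     # ~5% forward crosstalk, stripline, 9 mil space
    v_drive_se = 0.800    # single-ended drive amplitude, volts

    v_noise_se = coupling * v_drive_se
    print(f"{v_noise_se * 1e3:.0f} mV single-ended from one interferer")  # 40 mV

    # Several interferers over realistic run lengths, plus other noise
    # sources, plausibly sum to >= 100 mV effective differential noise.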
In theory, signal detect functionality could be implemented either as an analog envelope detector, or by differentially biasing the inputs and then detecting a continuous zero at the input. But an envelope detector that can reliably detect a signal smaller than the 200 mV XAUI sensitivity yet larger than the 100 mV expected noise, across process, voltage, and temperature, is a challenging design that would significantly complicate the already difficult XAUI receiver. This receiver is required by the deterministic jitter and ISI requirements to provide gain to a pulse of less than 200 ps duration and 200 mV differential amplitude. Such a high-gain, wide-bandwidth amplifier will almost certainly oscillate if its undriven, AC-coupled inputs are biased at zero differential voltage. So, if squelched outputs on XAUI lanes are an acceptable way to indicate failure, then an offset bias must be used to prevent oscillation. However, 100 mV of differential offset would directly subtract from the sensitivity of the receiver, resulting in a severe reduction in reach. In addition, it would displace received edges in time, adding the equivalent of 0.1 to 0.2 UI of deterministic jitter. This seems like an unacceptable penalty.
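For what it's worth, the 0.1 to 0.2 UI figure falls out of a simple slew-rate argument. The edge rise times below are my assumption (the draft specifies pulse duration under 200 ps, not a rise time), so treat this as a sanity check rather than a derivation:

    UI      = 320e-12   # 3.125 GBd XAUI baud interval, seconds
    v_os    = 0.100     # assumed differential offset bias, volts
    v_swing = 0.200     # minimum XAUI receive amplitude, volts

    for t_rise in (60e-12, 130e-12):     # assumed received edge times
        slew = v_swing / t_rise          # approximate edge slew rate, V/s
        dt   = v_os / slew               # crossing shift caused by the offset
        print(f"t_rise {t_rise * 1e12:.0f} ps: edge shift {dt * 1e12:.0f} ps"
              f" = {dt / UI:.2f} UI of added deterministic jitter")

    # -> roughly 0.09 to 0.20 UI, consistent with the figure above.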
Best Regards,
Joel