XAUI jitter tolerance
Since XAUI jitter will likely be addressed in a separate meeting at Austin,
I would like to raise the issue of modifying the jitter tolerance frequency
"break point" from the standard baudrate/1667 (used in MJS) to something
significantly higher.
For XAUI, baudrate/1667 would give us a tolerance break point at 1.875
MHz. My feeling is that there is nothing magical about baudrate/1667,
and that it doesn't accurately reflect typical receiver operation in today's
monolithic PLLs. (Perhaps in early telecom days SAW filter applications
required it, but today's receiver designs, at least for XAUI, will not be
using such costly techniques.) Moving the jitter tolerance break point out
to ~5 MHz or so would allow us to track more of the jitter components and
perhaps even make the Tx design easier (smaller capacitors, etc.).
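For reference, the arithmetic behind the two break points can be sketched as below. This is just a back-of-the-envelope calculation, assuming the XAUI lane rate of 3.125 GBd; the function name and divisor framing are mine, not from any spec draft.

```python
def jitter_tolerance_breakpoint(baud_rate_hz, divisor=1667):
    """Corner frequency of the jitter tolerance mask, using the
    baudrate/divisor convention from MJS."""
    return baud_rate_hz / divisor

XAUI_BAUD = 3.125e9  # XAUI lane rate: 3.125 GBd

# MJS-style break point: baudrate/1667, roughly 1.875 MHz
f_mjs = jitter_tolerance_breakpoint(XAUI_BAUD)

# The proposed ~5 MHz break point corresponds to baudrate/625
proposed_divisor = XAUI_BAUD / 5e6
```

Expressed as a divisor, the ~5 MHz proposal works out to baudrate/625, i.e. a corner frequency a bit less than 3x higher than the MJS convention.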
So, would there be any objections to moving the tolerance break point out?
I'd like to get some feedback on this before the Austin meeting if
possible.
- Richard Dugan