RE: [10GBASE-CX4] Working Paper Available on IEEE site
Howard,
Took a quick look at the proposed draft and have the following
top-level comments:
1. The min xmt amplitude of 800mV is probably not doable on a
chip with only a 1.2V supply when all worst-case conditions are
considered; it probably needs to be dropped another 50mV or so
(see the back-of-envelope sketch after this list). I plan to
have a quick presentation showing this next week. For reference,
the min XAUI level is a lot lower than 800mV using the far-end
method.
2. Given an 800mV min xmt level, the worst-case cable model, and
all other worst-case conditions, the input to the receiver will
probably be less than the 100mV spec (a rough loss calculation
is sketched after this list). We are trying to complete some
simulations to prove or disprove this before next week's
meeting, but others should also take a look at this.
3. There is an xmt template specified for the long pulse.
Shouldn't there be one for the short pulse as well to
guarantee interoperability?
4. Should the receiver level and jitter even be specified?
The reasons not to are: (1) if the xmt is specified, and if
the channel is specified, then the signal at the receiver is
already determined, so no further specs are needed on the rcv;
and (2) by specifying the receiver, doesn't this preclude
equalization in the cable itself if someone chooses to do that?
5. I thought it was agreed at the last meeting that Chris D's
cable model was going to be the worst case. But the worst-case
model in section 54.8.2 is Chris D's model with an additional
10%. This seems worse than before; what is the rationale?
6. Should the cable model include phase-vs-frequency response?
7. For signal detect, it seems that a single noise pulse >100mV
would cause a signal detect OK condition. It may be useful to
have an algorithm with improved noise rejection (one possible
approach is sketched after this list).
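
To make a few of the comments above concrete, here are some quick
Python sketches. All of the tolerance, headroom, and loss numbers
in them are placeholders I made up for illustration, not values
from the draft.

For comment 1, the basic headroom arithmetic for a CML-style
driver on a 1.2V supply, with assumed supply tolerance, amplitude
tolerance, and headroom numbers:

  # Back-of-envelope for comment 1.  Everything except the 800mV
  # and 1.2V figures is an assumption for illustration; the
  # presentation next week will use real numbers.
  vdd_min  = 1.2 * 0.95    # worst-case supply, assumed -5% tolerance
  headroom = 0.45 + 0.15   # assumed tail-source + output-device headroom, V
  amp_tol  = 0.15          # assumed +/-15% amplitude variation

  spec_min_diff_pp = 0.800                              # draft min xmt amplitude
  nom_diff_pp      = spec_min_diff_pp / (1 - amp_tol)   # nominal needed to guarantee the min
  max_se_swing     = nom_diff_pp * (1 + amp_tol) / 2    # worst-case single-ended swing

  margin_mV = (vdd_min - headroom - max_se_swing) * 1e3
  print("headroom margin: %+.0f mV" % margin_mV)
  # Near-zero or negative margin with these assumptions is why I
  # think the 800mV min needs to come down roughly 50mV.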
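For comment 2, the received amplitude is roughly the launched
amplitude minus the channel loss at the Nyquist frequency; the
20dB loss used below is an assumed placeholder, not the 54.8.2
worst-case number:

  # Rough check for comment 2: launched amplitude less worst-case
  # channel loss, compared against the 100mV figure.  Plug in the
  # actual 54.8.2 worst-case loss to get a real answer.
  tx_diff_pp_mV   = 800.0   # min xmt amplitude under discussion
  channel_loss_dB = 20.0    # assumed worst-case loss at Nyquist
                            # (cable + connectors + PCB)

  rx_diff_pp_mV = tx_diff_pp_mV * 10 ** (-channel_loss_dB / 20.0)
  print("received fundamental amplitude: %.0f mV" % rx_diff_pp_mV)
  # With the 20dB assumption this comes out at 80mV, i.e. below
  # 100mV; the simulations for next week will use the real model.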
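For comment 7, one possible way to get better noise rejection is
to require the level to persist above threshold for several
evaluation intervals before asserting signal detect, and below it
for several intervals before de-asserting; the counts here are
arbitrary and only meant to illustrate the idea:

  # Sketch for comment 7: a single >100mV noise pulse produces one
  # "above threshold" interval and is ignored; only a persistent
  # signal asserts signal detect OK.
  class SignalDetect:
      def __init__(self, assert_count=8, deassert_count=8):
          self.assert_count = assert_count      # intervals above threshold to assert
          self.deassert_count = deassert_count  # intervals below threshold to de-assert
          self.counter = 0
          self.ok = False

      def update(self, above_threshold):
          # Call once per evaluation interval with the comparator output.
          if self.ok:
              self.counter = self.counter + 1 if not above_threshold else 0
              if self.counter >= self.deassert_count:
                  self.ok, self.counter = False, 0
          else:
              self.counter = self.counter + 1 if above_threshold else 0
              if self.counter >= self.assert_count:
                  self.ok, self.counter = True, 0
          return self.ok

  # A lone noise spike (the single True below) does not assert OK;
  # only the later run of consecutive True intervals does.
  sd = SignalDetect()
  samples = [False]*4 + [True] + [False]*4 + [True]*10
  print([sd.update(s) for s in samples])
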
Good first draft!
Steve