Jeff,
Although you addressed this directly to me, I assume that you put this topic to the reflector for general discussion.
As you know, this proposal needs to be brought forward to the study group and agreed to by at least a 75% majority before it becomes an objective. I encourage you to make such a motion at our next meeting if you feel that this is a necessary change.
Since you directed the message to me, I'll also make some remarks as a citizen of 802.3 and not as SG chair: my personal opinion is that it would help if you elaborated on what this objective would mean in practice (especially for those who do not participate in the OIF).
I am personally aware of system requirements for 1E-15 and below but, as you imply in the phrasing of the proposed objective, it is prohibitive to measure such low error rates. The specific language you use implies that we will define a system that is guaranteed to work at a BER of 1E-15 but will only verify its performance by measuring the BER to 1E-12.
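For a rough sense of why direct measurement at 1E-15 is prohibitive, here is a minimal back-of-envelope sketch using the standard confidence-limit rule (the 10 Gb/s lane rate and 95% confidence level are my own assumptions for illustration, not anything from the proposal):

```python
from math import log

def bits_required(ber_target: float, confidence: float = 0.95) -> float:
    """Error-free bits needed to claim BER < ber_target with the given
    confidence, from the standard -ln(1 - C) / BER rule."""
    return -log(1.0 - confidence) / ber_target

LANE_RATE = 10e9  # assumed 10 Gb/s serial lane

for ber in (1e-12, 1e-15):
    seconds = bits_required(ber) / LANE_RATE
    print(f"BER {ber:g}: {bits_required(ber):.2e} error-free bits, "
          f"about {seconds / 3600:.1f} hours of test time")

# BER 1e-12: ~3.0e12 bits, about 0.1 hours (five minutes)
# BER 1e-15: ~3.0e15 bits, about 83 hours (three and a half days)
```

Five minutes per measurement is workable in a compliance test; three and a half days per measurement is not.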
Having given this topic only limited thought, I see a couple of ways to approach this:
1. Define the system (transmitter, channel, receiver) such that simulated performance is 1E-15, but in compliance test verify performance only to 1E-12. Presumably, parametric values specified in the standard would include margin to account for the difference between what was simulated at 1E-15 and what can actually be measured at 1E-12. For a quick example, consider random jitter: if the link is defined such that the peak-peak random jitter at 1E-15 is 0.15 UI, then the standard would presumably incorporate a specification for peak-peak random jitter at 1E-12 of 0.133 UI (see the sketch after this list).

2. Define a system for performance to 1E-15 and, as part of compliance test, rely on extrapolation of measured data (for example, bathtub curves along the vertical or horizontal axis) to derive values for 1E-15 that can be compared with the specified values.
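To make the random-jitter example in (1) concrete, here is a minimal sketch of the Gaussian arithmetic behind the 0.15 UI to 0.133 UI rescaling (the Gaussian jitter model and the 0.15 UI starting value are just the assumptions of the example above):

```python
from scipy.stats import norm

def q_factor(ber: float) -> float:
    """One-sided Gaussian Q at which the tail probability equals ber."""
    return norm.isf(ber)

# For Gaussian jitter, peak-peak RJ at a given BER is 2 * Q(BER) * sigma,
# so the pk-pk value scales with Q(BER).
rj_pp_1e15 = 0.15                                # UI, specified at BER 1e-15
sigma = rj_pp_1e15 / (2 * q_factor(1e-15))       # implied RJ sigma

rj_pp_1e12 = 2 * q_factor(1e-12) * sigma
print(f"Q(1e-15) = {q_factor(1e-15):.3f}, Q(1e-12) = {q_factor(1e-12):.3f}")
print(f"pk-pk RJ at 1e-12: {rj_pp_1e12:.3f} UI")  # ~0.133 UI
```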
One interesting observation regarding (1): if we were to have a closed-eye system (i.e., the solution relies heavily on receiver-based equalization, in that the eye at the receiver input is closed), I would expect receiver compliance to be based on "operation at a given BER when driven by a compliant driver through a compliant channel." This is the model that has been employed in 100/1000 Mb/s twisted-pair links and in 10GBASE-CX4. Using the model defined in (1), we would drive a compliant channel with a compliant driver and ensure that receiver BER performance was better than 1E-12. I would argue that this says nothing about the receiver's ability to operate at 1E-15. Some additional impairment (worse than the worst-case transmitter or channel) needs to be included to ensure that the appropriate margin is in the receiver design.
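To put a rough number on that additional impairment under the same Gaussian model (this is my own illustration of the size of the gap, not a proposed test method):

```python
from scipy.stats import norm

sigma = 0.15 / (2 * norm.isf(1e-15))   # RJ sigma implied by the 1e-15 spec, UI
extra = (norm.isf(1e-15) - norm.isf(1e-12)) * sigma

# A test that only confirms 1e-12 leaves this much Gaussian tail uncovered:
print(f"added stress needed: {extra:.4f} UI per edge "
      f"({2 * extra:.3f} UI pk-pk)")    # ~0.0086 UI, ~0.017 UI pk-pk
```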
With regards to (2), we would run the risk of having a hole in the specification (solutions that are perceived to be compliant but are not interoperable), since error floors below 1E-12 would go undetected and extrapolation would provide misleading results.
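As an illustration of that risk, here is a minimal sketch (the jitter sigma, floor level, and measurement offsets are invented for illustration): a straight-line fit on the Q scale to bathtub points measured down to about 1E-12 extrapolates cleanly for a pure Gaussian tail, but an error floor at, say, 1E-13 barely perturbs the measurable points while guaranteeing the link never actually reaches 1E-15.

```python
import numpy as np
from scipy.stats import norm

SIGMA = 0.01    # assumed RJ sigma, UI
FLOOR = 1e-13   # hypothetical error floor, invisible above ~1e-12

def true_ber(offset_ui):
    """One bathtub edge: Gaussian tail plus a flat error floor."""
    return norm.sf(offset_ui / SIGMA) + FLOOR

# "Measure" the bathtub at offsets where the BER is practically observable.
offsets = np.linspace(0.04, 0.07, 7)   # BER from ~3e-5 down to ~1e-12
measured = true_ber(offsets)

# Fit a straight line on the Q scale (Q = norm.isf(BER) vs. offset) and
# extrapolate to the offset that appears to achieve 1e-15.
slope, intercept = np.polyfit(offsets, norm.isf(measured), 1)
offset_1e15 = (norm.isf(1e-15) - intercept) / slope

print(f"extrapolated 1e-15 offset: {offset_1e15:.4f} UI")
print(f"actual BER at that offset:  {true_ber(offset_1e15):.1e}")
# The fit looks clean, but the actual BER at the extrapolated offset is
# pinned near the 1e-13 floor -- the link never reaches 1e-15.
```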
In summary, I do not debate that some applications would like to see BER performance better than 1E-12 (and in some cases, better than 1E-15). With regards to your proposed objective, I think it would be useful to get a better understanding of how a Backplane Ethernet standard could be judged to have met such an objective or not. I think members of the OIF community who also participate in the SG could share some insight here, and I welcome them to do so.
Thank you,
-Adam