Re: [802.3EEESG] 10BASE-T question



Mike,

I think that some adjustment to the 10BASE-T transmit voltage would be
entirely appropriate. 

The 10BASE-T output voltage spec (IEEE 802.3-2005 14.3.1.2.1) currently
requires that the driver produce a peak differential voltage of 2.2 to
2.8 V into a 100 Ohm resistive load - a very normal output voltage when
the standard was written in the late 80's, but pretty high nearly 20
years later. This voltage allowed 10BASE-T to coexist in bundled Cat 3
cable with analog phone ringers. The transient when an analog phone
ringer goes off-line in that situation could produce over 250 mV.

That high output voltage is not necessary over Cat 5 or better cable. 

The simple change would be to add a differential output voltage spec for
operation over Cat 5 or better cable. For that case, remove the minimum
voltage spec for peak differential voltage into a 100 Ohm resistive load.
One would still keep the maximum voltage spec of 2.8 V, or perhaps
substitute a lower maximum. Change the requirement for the Figure 14-9
output voltage template so that it applies to the signal produced at the
end of a worst-case Cat 5 cable instead of at the end of the (Cat 3)
twisted-pair model.

This should be fully backwards compatible with existing 10BASE-T
compliant PHYs over Cat 5 cable. The newly specified transmitters would
produce a signal over Cat 5 cable that falls within the range of signals
the original 10BASE-T spec produces over the Cat 3 cable channel it
specifies. That template provides a minimum eye opening of 550 mV. If I
plugged the numbers into my calculator correctly, the attenuation
difference between Cat 5 and Cat 3 cable at 10 MHz is more than 4 dB, so
this should allow the transmit voltage to drop by at least that amount.
This change should be very little work.

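For what it's worth, here is the back-of-the-envelope arithmetic in a few
lines of Python (purely illustrative; it just turns the roughly 4 dB
figure above into a linear voltage ratio and applies it to the current
2.2 V minimum; the real numbers would have to come from the worst-case
cable models):

  # Illustrative only: convert the ~4 dB Cat 3 vs Cat 5 attenuation
  # difference at 10 MHz into a relaxed minimum transmit voltage.
  def db_to_voltage_ratio(db):
      # attenuation expressed as a linear voltage ratio
      return 10 ** (db / 20.0)

  v_min_current = 2.2  # V, current 14.3.1.2.1 minimum peak differential
  headroom_db = 4.0    # assumed Cat 3 vs Cat 5 difference at 10 MHz (see above)

  v_min_relaxed = v_min_current / db_to_voltage_ratio(headroom_db)
  power_saving = 1 - (v_min_relaxed / v_min_current) ** 2

  print("relaxed minimum peak: %.2f V" % v_min_relaxed)        # about 1.39 V
  print("launch power saving:  %.0f%%" % (100 * power_saving)) # about 60%
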
A more aggressive change, one that would require real work, would be to
determine what receive voltage today's receivers can actually tolerate.
They can probably handle a smaller eye opening, especially if they are
1000BASE-T receivers operating in a slowed-down mode. But in that case,
one would either need to use the lower eye opening only when stepped down
by EEE, or add negotiation for low-voltage 10BASE-T to Auto-Negotiation,
because it would not ensure backwards compatibility with classic 10BASE-T
receivers.

I think the fully backwards-compatible change would be pretty easy to
justify.

To summarize: for operation over the channels specified by 100BASE-TX,
1000BASE-T, and 10GBASE-T, delete the spec for minimum voltage into a 100
Ohm load and change the test condition for the Figure 14-9 voltage
template to be over a worst-case 100BASE-TX channel.

Regards,
Pat


At 01:46 PM 3/28/2007, Mike Bennett wrote:
>Folks,
>
>For those of you who were able to attend the March meeting, you may 
>recall we had a discussion on 10BASE-T (in the context of having a low 
>energy state mode) and what we might change to specify this, which 
>included possibly changing the output voltage.  Concern was raised that
>the work required to specify a new output voltage for 10BASE-T would
>far outweigh the benefit.  Additionally, there was a question regarding
>the use of 100BASE-TX instead of doing anything with 10BASE-T.  Would
>someone please explain just how much work it would be to change
>10BASE-T and what the benefit would be compared to using 10BASE-T with
>the originally specified voltage or 100BASE-TX for a low energy (aka
>"0BASE-T" or "sleep") state?
>
>Thanks,
>
>Mike