Following up on John's presentation from last week, I was looking at the text in the current draft and had a question I am having difficulty answering from the 802.3 specification. As it stands, the text in 154.5.4 states that the SIGNAL_DETECT parameter maps to the SIGNAL_OK parameter (which I assume means that SIGNAL_DETECT = OK results in SIGNAL_OK = OK). Table 154-5 defines a SIGNAL_DETECT value of OK as:

[(Optical power at TP3 ≥ minimum average input power (unamplified) in Table 154-9) AND (compliant 100GBASE-R)]

So a value of OK is not determined simply by detecting a power level greater than or equal to -30 dBm; it also requires a compliant 100GBASE-R signal input (note that I am inserting the words "signal input", since they appear in the equivalent condition in Table 88-4, although not in Table 154-5).

My question then is this: is there a common understanding of what "compliant 100GBASE-R" (or "compliant 100GBASE-R signal input") means in this particular case?
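To state concretely how I am reading the condition, here is a rough sketch in Python (the names are mine, and the -30 dBm threshold is my reading of Table 154-9, not normative text):

    # Minimum average input power (unamplified) from Table 154-9, as I read it.
    MIN_AVG_INPUT_POWER_DBM = -30.0

    def signal_detect_ok(power_at_tp3_dbm: float, compliant_100gbase_r: bool) -> bool:
        # Table 154-5: both conditions must hold for SIGNAL_DETECT = OK.
        power_ok = power_at_tp3_dbm >= MIN_AVG_INPUT_POWER_DBM
        return power_ok and compliant_100gbase_r

    # Per 154.5.4, SIGNAL_DETECT maps to SIGNAL_OK, so on this reading
    # SIGNAL_OK = OK exactly when signal_detect_ok(...) returns True.

Everything hinges on how the second operand, the "compliant 100GBASE-R" test, is determined.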
If it means that you need to be able to decode an apparently valid signal, wouldn't that address the issue of a false positive due to a high noise floor caused by an amplifier? Or is the meaning something completely different?

Thanks for any help in better understanding this.

Matt