
Re: XAUI Jitter Telco Protocol 14th February 2001



Rohit,

It is interesting that you have asked this question; I had been wondering about it myself and recently obtained some information on its background.

First, however, the terminology.  You refer to the breakpoint as the point where the slope goes from being flat to being -20 dB/decade.  With this definition, the SONET jitter tolerance mask has 2 breakpoints.  However, the usual terminology is to refer to the breakpoint as the point where the slope goes from being -20 dB/decade to being flat, moving from left to right.  With this terminology, the SONET jitter tolerance mask for OC-48 has 2 breakpoints, at 1 MHz and 6 kHz (or 5 kHz in ITU-T G.825).  The upper breakpoint reflects the jitter tolerance of the wide-band clock recovery circuit; above this frequency, the system must tolerate 0.15 UIpp of sinusoidal jitter.  The lower breakpoint reflects the fact that there may be a narrower-band phase-locked loop in the regenerator (which would, among other things, control jitter accumulation); above this frequency, the system must tolerate 1.5 UIpp of sinusoidal jitter.  This means that if this narrower-band PLL also has a bandwidth of 6 kHz, its buffer must be at least approximately 1.5 UI (and if its bandwidth is narrower, its buffer must be larger).
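
For concreteness, here is a minimal sketch of the OC-48 tolerance mask described above, in Python.  The 1 MHz and 6 kHz breakpoints are from this note; the 600 Hz and 100 kHz corners are taken from your message below.  Treat the exact values as illustrative rather than normative.

def oc48_jitter_tolerance_uipp(f_hz):
    """Illustrative OC-48 sinusoidal jitter tolerance mask, in UIpp.

    Moving left to right on a log-frequency axis:
      10 Hz  - 600 Hz  : flat at 15 UIpp (the low-frequency cutoff)
      600 Hz - 6 kHz   : -20 dB/decade, i.e. proportional to 1/f
      6 kHz  - 100 kHz : flat at 1.5 UIpp
      100 kHz - 1 MHz  : -20 dB/decade
      above 1 MHz      : flat at 0.15 UIpp
    """
    if f_hz < 10.0:
        raise ValueError("below 10 Hz is wander, not jitter")
    if f_hz <= 600.0:
        return 15.0
    if f_hz <= 6e3:
        return 15.0 * 600.0 / f_hz    # 15 UIpp at 600 Hz -> 1.5 UIpp at 6 kHz
    if f_hz <= 100e3:
        return 1.5
    if f_hz <= 1e6:
        return 1.5 * 100e3 / f_hz     # 1.5 UIpp at 100 kHz -> 0.15 UIpp at 1 MHz
    return 0.15

# The two "slope to flat" breakpoints discussed above:
assert abs(oc48_jitter_tolerance_uipp(6e3) - 1.5) < 1e-9
assert abs(oc48_jitter_tolerance_uipp(1e6) - 0.15) < 1e-9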

Returning to your question, it can be rephrased to ask why there is a 15 UIpp level in the jitter tolerance mask at all.  Certainly a PLL with bandwidth equal to the frequency of the lower breakpoint would track any lower-frequency jitter.  The answer I was given (by a colleague who originally worked on these specs) is that the mask was stopped at 15 UIpp because, at the time, generating very large-amplitude jitter at low frequencies would have been difficult for test sets (or would, at the least, have made them much more complex), while providing very little benefit for the network.  The choice of 15 UI as the cutoff was made as a value adequate to accommodate the jitter sources expected in the network.  Note that the 15 UIpp level extends down to 10 Hz; this is the demarcation between jitter and wander.
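
As a sanity check on those numbers (my own arithmetic, not something from the spec): for a first-order PLL with bandwidth f_bw, the jitter-error transfer |1 - H(f)| falls off roughly as f/f_bw below the bandwidth, so the input amplitude that just fills a fixed residual budget grows as f_bw/f, i.e., at 20 dB/decade toward lower frequencies.  With a 6 kHz loop and a 1.5 UIpp residual budget, that line reaches 15 UIpp at exactly 600 Hz, which is where the mask flattens:

def tolerable_input_jitter_uipp(f_hz, f_bw_hz=6e3, residual_budget_uipp=1.5):
    """Sketch: tolerable sinusoidal input jitter for a first-order PLL.

    Below the loop bandwidth the error transfer is ~ f/f_bw, so input
    jitter of amplitude J leaves roughly J * f / f_bw untracked.
    Inverting gives the input amplitude that fills the residual budget.
    """
    return residual_budget_uipp * max(f_bw_hz / f_hz, 1.0)

print(tolerable_input_jitter_uipp(600.0))  # 15.0 UIpp, matching the mask's cutoff level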

It is correct that an actual PLL that can accommodate the wide-band jitter (i.e., the narrower-bandwidth PLL) would track jitter at lower frequencies according to a mask that, rather than being cut off at 15 UIpp, continued upward at lower frequencies with a -20 dB/decade slope.  However, it is also valid that, since we know the PLL will track jitter below its bandwidth, it is unnecessary to actually test to the very high-amplitude, low-frequency jitter values, especially when it is not practical to do so.  Note that the old G.825 mask, which reflects the ETSI mask, does not stop at 15 UIpp.  (The new G.825 reflects both options, ANSI and ETSI, and contains both masks.)
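
To make the test-set burden concrete (illustrative arithmetic, assuming the ETSI-style mask simply carries the -20 dB/decade slope on below 600 Hz):

for f in (600.0, 100.0, 20.0, 10.0):
    truncated = 15.0                  # ANSI-style mask stops at 15 UIpp
    continued = 15.0 * 600.0 / f      # slope carried on below 600 Hz
    print(f"{f:6.0f} Hz: truncated {truncated:5.1f} UIpp, continued {continued:6.1f} UIpp")

At 20 Hz the untruncated mask would call for 450 UIpp of sinusoidal jitter, the kind of amplitude a circa-2001 test set would have struggled to generate, and with little benefit given that a tracking PLL handles it anyway.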

I am also interested in knowing the answer to your second question, namely why the FC/GbE high-frequency jitter tolerance was reduced from 0.15 UI to 0.1 UI.  The discussion of this in the FC document does indicate that it was this reduction that gave rise to the breakpoint being equal to the line rate divided by 1667 rather than by 2500.
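
If I read that correctly, the arithmetic is at least self-consistent (my inference, not a quote from the FC document): along a -20 dB/decade line the product of amplitude and frequency is constant, so lowering the flat level by a factor of 0.15/0.1 = 1.5 pushes the corner frequency up by the same factor, and 2500 / 1.5 = 1667:

# Sketch of the inferred relationship between the tolerance floor and
# the breakpoint divisor (my reading, not text from the FC document).
sonet_divisor = 2500
fc_divisor = sonet_divisor / (0.15 / 0.10)
print(fc_divisor)            # 1666.7, i.e. ~1667

line_rate_hz = 1.25e9        # GbE serial line rate, as an example
print(line_rate_hz / 1667)   # ~750 kHz corner, versus...
print(line_rate_hz / 2500)   # ...500 kHz at line rate / 2500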

I hope the above clarifies your first question.

Thanks.

Regards,

Geoff Garner
Lucent Technologies
101 Crawfords Corner Rd.
Room 3C-511
Holmdel, NJ  07733
USA
+1 732 949 0374 (voice)
+1 732 949 3210 (fax)
gmgarner@xxxxxxxxxx
 

Rohit Mittal wrote:

 

I don't know if this has been discussed before on the conference calls, but does anyone know why the SONET jitter mask (for the Rx) has 2 breakpoints?  By breakpoint, I mean the points where the graph of jitter tolerance goes from being flat to sloping down at 20 dB/decade.  For instance, for OC-48, you have breakpoints at 600 Hz and 100 kHz.

If you have a PLL, you should generally have only 1 breakpoint, given by the loop bandwidth.  That, I believe, is what is followed in the FC/GbE communities.

Thanks

PS: Another thing: does anyone know the reason why the high-frequency jitter tolerance in FC/GbE was reduced from the SONET 0.15 UI to 0.1 UI?

-----Original Message-----
From: Anthony Sanders [mailto:anthony.sanders@xxxxxxxxxxxx]
Sent: Tuesday, February 20, 2001 7:55 AM
To: Serial PMD reflector (E-mail)
Subject: XAUI Jitter Telco Protocol 14th February 2001

Please find attached

1) Protocol from XAUI Jitter Telco, 14th February 2001
2) Updated Jitter Issue List

Please pay attention to two important items, on which I would like to get
feedback on the reflector ASAP.

Section 2.5.4; Compliance Receive Eye.  My proposal is that we shall not
define a compliance eye directly for receiver tolerance testing, but only
require the inclusion of a filter (equal to the polynomial of the
compliance channel) to limit the edge speed.

Section 6.5.1; Use of CJPAT for compliance testing of receiver and
transmitter.
                Does this requirement create too much restriction on the implementor and tester?
                Is this requirement necessary to guarantee interoperability between devices?
                Is the current jitter specification correct for CJPAT or K28.5 compliance testing?

Next telco, as always, Wednesday, 10:30 (PST).

Best regards,

Anthony Sanders
Principal Engineer
Infineon Technologies
Munich, Germany