-----Original Message-----
From: Geoffrey Garner [mailto:gmgarner@xxxxxxxxxx]
Sent: Friday, October 20, 2000 1:52 PM
<snip>It is true that in a real network, the jitter present above 80 MHz for STM-64 (or, in general, above f4 for each rate) is expected to be small. However, it seems it is still necessary to specify this bandwidth for test purposes to guarantee that results are reproducible. In addition, if broadband jitter is being applied to a piece of equipment as in MJS-2, the results will be impacted by how high in frequency the broadband jitter extends because, for the same power spectral density amplitude, increasing the highest frequency means more jitter is being applied.
Related to this, I had mentioned in the conference call that I recalled a 50 MHz high frequency cutoff, but needed to track down the document (I first came across the number in an offline discussion). I believe I have found the source of this, and also some more complete information. In MJS-2, Tables 3 (page 20) and 5 (page 23), and also in Tables 10, 14, and 30 of FC-PI (Rev 9), sinusoidal jitter applied in the tolerance test is swept to a maximum frequency of 5 MHz. Since for 10 GbE the line rate is about a factor of 10 higher, the corresponding high frequency cutoff would be 50 MHz. However, on looking more closely at these tables, I do see that for deterministic jitter and random jitter the high frequency cutoff is line rate (fc)/2.
For 10 Gbit/s, this would be 5 GHz, which is larger than the 80 MHz by a factor of 62.5. This could certainly affect the test results (i.e., applying jitter up to 5 GHz versus 80 MHz). Have I correctly interpreted these high frequency cutoffs? You indicated you have more information on this; I would be interested in it.
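The scaling arithmetic above can be sanity-checked with a few lines; the input numbers (the 5 MHz sinusoidal-jitter sweep ceiling, the ~10x line-rate ratio, the fc/2 cutoff, and the 80 MHz STM-64 measurement bandwidth) are taken from the discussion, and nothing here is normative:

```python
# Check the frequency-scaling arithmetic from the message above.
# All values are illustrative, taken from the MJS-2 / FC-PI figures
# quoted in the discussion.

mjs2_sj_max_hz = 5e6          # sinusoidal-jitter sweep ceiling in the MJS-2 / FC-PI tables
line_rate_ratio = 10          # 10 Gbit/s vs. ~1 Gbit/s-class Fibre Channel rates
scaled_sj_max_hz = mjs2_sj_max_hz * line_rate_ratio
print(scaled_sj_max_hz / 1e6, "MHz")   # scaled sinusoidal-jitter cutoff -> 50 MHz

fc_hz = 10e9                  # 10 Gbit/s line rate
dj_rj_cutoff_hz = fc_hz / 2   # fc/2 cutoff for deterministic/random jitter -> 5 GHz
stm64_f4_hz = 80e6            # 80 MHz upper measurement frequency for STM-64
print(dj_rj_cutoff_hz / stm64_f4_hz)   # ratio of the two cutoffs -> 62.5
```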
Thanks.
Regards,
Geoff Garner