-----Original Message-----
From: Tom Lindsay [mailto:Tom.Lindsay@xxxxxxxxx]
Sent: Thursday, February 22, 2001 3:13 PM
To: anthony.sanders@xxxxxxxxxxxx
Cc: Serial PMD reflector (E-mail)
Subject: XAUI frequency tolerance time window

This email is in response to:
1.2 Very low frequency jitter definition, frequency and amplitude - Comment to be entered
Tom to send email explaining the difference between low frequency jitter and the 100 ppm clock tolerance.
Agreed.
___________

A clock will have a frequency, averaged over a period of time. It may fluctuate within that average. A receiver should track that average, such that the CDR (VCO) runs at the same average frequency as the incoming data rate. Generally, this average frequency drives the requirements for elasticity buffering, and it must be controlled through a clock tolerance specification (+/-100 ppm).
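To make that concrete, here is a quick back-of-the-envelope sketch (Python; the 3.125 Gbaud XAUI lane rate is taken from context, the other numbers are illustrative assumptions, not spec values):

# Sketch: how fast an elasticity buffer slips when the local clock and
# the incoming data rate differ by some ppm offset. Illustrative only.

def buffer_slip_bits(ppm_offset, bit_rate_hz, seconds):
    # Bits of elasticity buffer slip accumulated over 'seconds' when the
    # receive clock is off by 'ppm_offset' from the incoming data rate.
    return ppm_offset * 1e-6 * bit_rate_hz * seconds

# Worst case: transmitter at +100 ppm and receiver reference at -100 ppm,
# i.e. a 200 ppm effective offset, on a 3.125 Gbaud XAUI lane:
print(buffer_slip_bits(200, 3.125e9, 1.0))  # 625000.0 bits of slip per second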
The fluctuations within that average can be thought of as jitter; jitter must also be controlled.
The time period (or some number of bits, etc.) for measuring clock tolerance should be specified. Generally, if the time period is increased, lower jitter frequencies are averaged out (filtered from the measurement); if the time period is decreased, lower jitter frequencies remain unfiltered and appear as frequency errors, which would then be controlled by the 100 ppm spec. Note that using the 100 ppm frequency limit for jitter control in this manner is more stringent than the present DJ specifications.
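One way to put numbers on that filtering effect (my own arithmetic, not from any draft): sinusoidal phase jitter of zero-to-peak amplitude A (in UI) can move the phase by at most 2*A UI across any measurement window, so the worst-case frequency error it can mimic over a window of length T shrinks as T grows:

def max_apparent_freq_error_ppm(jitter_ui, window_s, bit_rate_hz):
    # Upper bound on the average frequency error (ppm) that sinusoidal
    # phase jitter of zero-to-peak amplitude jitter_ui (UI) can mimic
    # over a window of window_s seconds: the phase moves at most
    # 2*jitter_ui UI, and the window spans window_s*bit_rate_hz bits.
    return 2 * jitter_ui / (window_s * bit_rate_hz) * 1e6

# Example (illustrative numbers): 10 UI of low-frequency wander at 1 Gbit/s
print(max_apparent_freq_error_ppm(10, 200e-6, 1e9))  # 100.0 ppm over a 200 usec window
print(max_apparent_freq_error_ppm(10, 1.0, 1e9))     # 0.02 ppm over a 1 second window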
Complete control of the frequency spectrum is recommended. This is to prevent elasticity buffer overruns and excessive jitter beyond what can be tracked. To provide complete control, frequencies above the corner frequency corresponding to this time duration should be controlled with jitter specifications; lower frequencies would be controlled by the 100 ppm spec.
Define T = the measurement time duration for clock tolerance measurement. Then, require that a clock frequency must be within 100 ppm of ideal when averaged over any interval >= T. Next (this is without rigor, just a gut feel; someone should confirm), I would suggest that jitter be controlled down to frequencies < 1/(2*T). This implies testing jitter output for a period > 2*T, and testing tolerance with sine frequencies down to < 1/(2*T).
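A minimal sketch of that rule of thumb (this just encodes the suggestion above; the 1/(2*T) factor is the unconfirmed gut feel):

def test_params_from_T(T_s):
    # Given the clock-tolerance averaging window T (seconds), derive the
    # suggested test conditions: measure output jitter for > 2*T, and
    # sweep sine jitter tolerance down below 1/(2*T).
    return {"jitter_output_min_s": 2 * T_s,
            "tolerance_sweep_floor_hz": 1 / (2 * T_s)}

print(test_params_from_T(1.0))  # {'jitter_output_min_s': 2.0, 'tolerance_sweep_floor_hz': 0.5}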
What should the value be for T? For practical lab measurements, the time should probably be on the order of at least 1 second. This implies measuring jitter output for > 2 seconds, which is still reasonably practical. However, for tolerance, it also implies sweeping down to < 0.5 Hz, which is not practical. Fibre Channel defined T as 200,000 bits, which equates to approximately 200 usec at 1 Gbit/s. This implies sweeping sine jitter down to approximately 2.5 kHz for that data rate.
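Re-deriving the Fibre Channel numbers under the same rule (assuming the 1 Gbit/s line rate used above for round numbers):

T_bits = 200_000                  # Fibre Channel's definition of T, in bits
bit_rate_hz = 1e9                 # assumed line rate from the text
T_s = T_bits / bit_rate_hz        # 0.0002 sec = 200 usec
print(T_s, 1 / (2 * T_s))         # sweep floor of 2500.0 Hz, i.e. approx 2.5 kHz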
I don't have a good feel on what to do from here.
Comments, reactions, suggestions?
- Is this worth pursuing, or is this unnecessary specification/control for a non-problem?
- If we set T, should its value be short so that sine sweeps can stop at higher frequencies? If so, what frequency? If T is too short, it becomes impossible to verify frequency in the lab; if T is too long, then we run into practical limitations of sine jitter sweeps.
Thanks, Tom Lindsay
Vixel
425/806-4074