Jitter is a key performance factor in high-speed digital transmission systems, such as synchronous optical networks/synchronous digital hierarchy (SONET/SDH), optical transport networks (OTN), and 10 Gigabit Ethernet (10 GE). This paper outlines the differences between telecom and datacom jitter standards and describes the various jitter applications for compliance testing of 10 G small form-factor pluggable (XFP) transceivers, which have become the dominant 10 G optical interface for telecom and datacom applications.
Accurate jitter measurements are essential for ensuring error-free high-speed data transmission. Jitter, defined as any phase modulation above 10 Hz in a digital signal, is unwanted and always present within devices, systems, and networks. To ensure interoperability between devices and to minimize signal degradation due to jitter accumulation, limits must be set for the maximum jitter at an output interface as well as the maximum jitter tolerated at an input.
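The 10 Hz boundary can be illustrated in code. The sketch below is a minimal single-pole high-pass filter over sampled time-interval-error (TIE) data; the function name, sample rate, and cutoff handling are illustrative, not taken from any standard. Phase variation below the cutoff is classified as wander, variation above it as jitter:

```python
import math

def highpass_tie(tie, fs, fc=10.0):
    """Split time-interval-error samples into jitter (above fc) and
    wander (below fc) with a single-pole high-pass filter.
    fs is the sample rate in Hz; fc defaults to the 10 Hz boundary."""
    # One-pole high-pass recursion: y[n] = a * (y[n-1] + x[n] - x[n-1])
    a = 1.0 / (1.0 + 2.0 * math.pi * fc / fs)
    jitter = [0.0] * len(tie)
    for n in range(1, len(tie)):
        jitter[n] = a * (jitter[n - 1] + tie[n] - tie[n - 1])
    # Whatever the high-pass removes is, by definition here, the wander.
    wander = [x - j for x, j in zip(tie, jitter)]
    return jitter, wander
```

Fed a phase record containing a static offset plus a 1 kHz tone, the filter passes the tone almost unchanged into the jitter component while the offset stays in the wander component.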
Standards bodies have set these limits, which fall into two categories: telecommunications and data communications. The major telecom standards organizations are the International Telecommunication Union (ITU-T) and Telcordia, while the Institute of Electrical and Electronics Engineers (IEEE) is the main datacom standardization body.
Jitter Aspects and Characteristic Values for 10 G
Telecom and datacom technologies use different timing methods. The components of synchronous systems, such as SDH/SONET, are synchronized to a common clock. In asynchronous serial systems, such as 10 GE, distributed clocks or clock signals recovered from the data provide the component timing. In such systems it is important to limit the jitter generated by each component, but jitter transferred from one component to another matters less than in synchronous systems, where jitter can grow as it passes from component to component. For SDH/SONET/OTN, well-defined band-limited jitter generation, tolerance, and transfer requirements therefore exist.
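The accumulation effect in synchronous chains can be sketched numerically. Below is a minimal model assuming a hypothetical second-order PLL-style jitter transfer function; the corner frequency and damping values are illustrative placeholders, not figures from any standard. It shows why even small gain peaking per element must be limited: the peaking in dB multiplies by the number of cascaded elements.

```python
import math

def jtf_mag(f, fc=120e3, zeta=2.0):
    """|H(f)| of a second-order PLL-style jitter transfer function.
    fc (corner frequency) and zeta (damping) are illustrative values."""
    r = f / fc
    num = math.sqrt(1.0 + (2.0 * zeta * r) ** 2)
    den = math.sqrt((1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2)
    return num / den

def cascaded_gain_db(f, n):
    """Jitter gain in dB after n identical cascaded network elements:
    the per-element gain in dB simply scales by n."""
    return n * 20.0 * math.log10(jtf_mag(f))
```

With these placeholder values a single element peaks by only a fraction of a dB near the corner frequency, yet fifty cascaded regenerators multiply that peaking fifty-fold, which is the scenario the JTF limits guard against.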
Table 1 shows how the specifications and test methodologies for jitter in 10 GE differ from those for SDH/SONET/OTN transceivers. Both the
specifications and test methodologies attempt to verify that the relative time instability of transmitted signals is not excessive.
In SDH/SONET/OTN systems with regenerators, noise is the greatest impairment and the limiting factor for system performance; jitter tolerance is
measured using sinusoidal jitter. In Ethernet systems, jitter tolerance is measured using a stressed signal that combines several impairments.
Table 1 shows characteristic values for XFP transceivers, which support the established telecom standards STM-64/OC-192 at 9.95 Gbps and
OTU2 at 10.7 Gbps. The 10 GE datacom standards (WAN and LAN) are supported at 9.95 and 10.31 Gbps, respectively. These transceivers are pluggable optics
that replace legacy optical circuits and offer several advantages: cost savings, a very compact and flexible design, interchangeability and direct
replacement with equipment from different vendors, and hot-plug capability.
SDH/SONET/OTN Jitter Measurements
Three relevant test configurations for jitter performance measurements are: jitter generation, jitter tolerance, and jitter transfer.
1. Jitter generation: A certain amount of jitter will appear at the output port of any network element (NE) even when an entirely jitter-free digital or
clock signal is applied to the input, an effect known as jitter generation. The NE itself produces this intrinsic jitter, for example through thermal noise and drift
in clock oscillators and clock-and-data recovery circuits. Output jitter is the total jitter measured at the output of a system, specified in unit intervals (UI).
One UI corresponds to an amplitude of one clock period, independent of bit rate and signal coding. Results are displayed as a peak-to-peak value or a root-mean-square (RMS) value over a defined frequency range. Peak-to-peak results are a better measure of the effect on performance, since the extremes
can cause errors, whereas RMS values indicate the average total amount of jitter.
2. Jitter tolerance (maximum tolerable jitter, MTJ): A measurement that checks the resilience of equipment to jitter applied at its input. It is required
to confirm that the NEs in a transmission system can operate error-free in the presence of worst-case jitter from preceding sections. Jitter tolerance is
one of the most important characteristics of the clock recovery and input circuitry of network equipment.
3. Jitter transfer (jitter transfer function, JTF): A measure of the amount of jitter transferred from the input to the output of the network equipment.
JTF is important for cascaded clock recovery circuits in long-distance transmission systems with regenerators and line terminals. In addition, the jitter
transfer measurement is required to confirm that cascaded NEs in the transmission system have not amplified the jitter.
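The peak-to-peak and RMS output-jitter quantities from item 1 can be sketched as follows, assuming TIE samples in seconds and a known bit rate (the function name is illustrative):

```python
import math

def output_jitter_ui(tie_seconds, bit_rate_hz):
    """Peak-to-peak and RMS output jitter in unit intervals (UI) from
    time-interval-error samples.  1 UI = one clock period = 1 / bit rate."""
    ui = 1.0 / bit_rate_hz
    tie_ui = [t / ui for t in tie_seconds]          # normalize to UI
    mean = sum(tie_ui) / len(tie_ui)
    pk_pk = max(tie_ui) - min(tie_ui)               # extremes cause errors
    rms = math.sqrt(sum((t - mean) ** 2 for t in tie_ui) / len(tie_ui))
    return pk_pk, rms
```

At STM-64/OC-192 (9.95 Gbps), for example, 1 UI corresponds to roughly 100 ps, so a 25 ps phase excursion is already a quarter of a unit interval.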
10 GE Jitter Measurements
1. Bathtub curve: The bathtub curve can also be used to separate random jitter (RJ) from deterministic jitter (DJ). The slope of the bathtub curve measures the
random jitter, whereas the positions of the slopes on the time axis are set by the deterministic jitter. Total jitter (TJ) is quantified by noting the points
at both eye edges where the BER falls to 10⁻¹², and subtracting this interval from the bit period. Unfortunately, this measurement takes a long time.
In practice, the data points are measured from a BER of 10⁻³ to 10⁻⁸ and then extrapolated to a BER of 10⁻¹².
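One common way to perform this extrapolation is to fit a Gaussian (Q-scale) tail model to each bathtub slope. The sketch below is a minimal illustration of that idea under a pure dual-Dirac assumption, not the exact procedure mandated by any particular standard; positions are in UI and function names are ours:

```python
from statistics import NormalDist

def q_scale(ber):
    """Map a BER to its Gaussian 'Q' value (tail probability inverse)."""
    return -NormalDist().inv_cdf(ber)

def fit_edge(points):
    """Least-squares line q = (x - mu) / sigma through (position, BER)
    points measured on one bathtub slope.  Returns (mu, sigma)."""
    xs = [x for x, _ in points]
    qs = [q_scale(b) for _, b in points]
    n = len(xs)
    mx, mq = sum(xs) / n, sum(qs) / n
    slope = (sum((x - mx) * (q - mq) for x, q in zip(xs, qs))
             / sum((x - mx) ** 2 for x in xs))
    return mx - mq / slope, 1.0 / abs(slope)        # mu, sigma

def total_jitter_ui(left_pts, right_pts, target_ber=1e-12):
    """Extrapolate both slopes to target_ber; TJ = 1 UI - eye opening."""
    q = q_scale(target_ber)
    mu_l, sig_l = fit_edge(left_pts)
    mu_r, sig_r = fit_edge(right_pts)
    eye_opening = (mu_r - q * sig_r) - (mu_l + q * sig_l)
    return 1.0 - eye_opening
```

Measured points between BER 10⁻³ and 10⁻⁸ become nearly straight lines on the Q scale, which is what makes the extrapolation to 10⁻¹² quick compared with measuring at 10⁻¹² directly.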
2. Stressed eye, or stressed receiver sensitivity (SRS): The SRS test verifies that a receiver can operate at a BER better than 10⁻¹² when receiving
the worst-case permitted signal, which is analogous to jitter tolerance. An SRS test consists of two parts, an eye mask and a sinusoidal jitter template,
both of which are used for step-through measurements. The eye mask is designed to simulate a variety of stresses, including RJ, duty-cycle distortion (DCD), intersymbol interference (ISI), and periodic jitter (PJ). The
different stressing components are added to close the eye (blue area in Figure 2b), leaving an assured area of error-free operation in the center (white
area in Figure 2b). The receiver is expected to operate successfully within this small area despite the impairments.
Once the stressed eye is constructed, BER performance is verified while stepping the sinusoidal jitter through the levels specified in the jitter template
(jitter frequencies and amplitudes, see Figure 2b). The receiver must accommodate an impaired incoming signal with applied jitter and crosstalk, and
must achieve a BER of less than 10⁻¹².
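The step-through over the sinusoidal jitter template can be sketched as follows. The corner frequency, amplitude floor, and the measure_ber callback are placeholders of ours, since the real template values come from the applicable standard; the shape shown (a flat floor above a corner frequency, rising at 20 dB/decade below it) is the typical form of such templates:

```python
def sj_template_ui(f_hz, f_corner=4e6, floor_ui=0.05):
    """Required sinusoidal-jitter amplitude (UI peak-to-peak) at f_hz.
    Illustrative template: flat floor above f_corner, amplitude rising
    proportionally to 1/f below it.  Values are placeholders, not from
    any standard."""
    if f_hz >= f_corner:
        return floor_ui
    return floor_ui * f_corner / f_hz

def step_template(frequencies, measure_ber, ber_limit=1e-12):
    """Step the sinusoidal jitter stress through the template points and
    record pass/fail at each.  measure_ber(f, amp) stands in for driving
    the instrument and reading back the measured BER."""
    results = {}
    for f in frequencies:
        amp = sj_template_ui(f)
        results[f] = measure_ber(f, amp) < ber_limit
    return results
```

A device passes the step-through only if every template point yields a BER below the limit, mirroring the requirement that the receiver tolerate the full template on top of the stressed eye.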
In the typical test setup for XFP shown in Figure 3, the FIBERLAND Solutions ONT-506 is used to verify the jitter integrity of the transceiver. The XFP
module contains an optical transmitter and receiver in the same unit. One end of the module is a 10 Gbps chip-to-chip electrical interface, the XFI
serial interconnect, which handles differential 10 Gbps signals; the other end is an optical connection that complies with the 10 Gbps standards.
XFP modules are tested using a compliance test board that has four high-speed electrical connections, two inputs and two outputs.
The ONT-506 injects a signal to the transmit side of the module at test point B’ and measures the receive side of the module at test
point C’. Accurate characterization of transmit and receive sides of the XFP module requires both electrical-to-optical (B’ testing) and
optical-to-electrical (C’ testing) measurements.
The optical output (TX) looped back to the optical input (RX) allows for an electrical-to-electrical jitter test. The electrical output (C’) looped
back to the electrical input (B’) allows for an optical-to-optical jitter test. The loopback method can be used for module verification but cannot
be used to verify the performance of the transmit and receive sides independently.
Both the telecom and datacom market segments use the term jitter. Standardization bodies have developed well-documented jitter specifications
and measurements, but the requirements differ for each sector. XFP transceivers, supporting multiple data rates, provide a very good example of how
the different jitter standards apply.
Three basic test configurations are used to test XFP jitter. For test equipment to meet compliance requirements, it must support all basic jitter applications for both electrical and optical interfaces.