Jitter Generation on T1 or E1 Lines
When the signal traverses a network, the jitter generated by the DUT becomes the input jitter to the next part of the network. If this jitter is amplified, it can exceed the jitter tolerance of the subsequent DUT. In this way, excessive jitter may accumulate and cause errors as the signal progresses through the network equipment.
The Jitter Generation and Measurement, and Pulse Mask Compliance (XX012) applications are now available as part of the basic applications in the T1 E1 analyzer.
GL's Jitter Generation software has been developed to generate a jittered T1/E1 output signal with user-selected frequency and amplitude. It is suitable for testing jitter tolerance and compliance with standards such as G.823. In conjunction with GL's Jitter Measurement application, Jitter Generation may also be used to test and measure jitter transfer.
GL’s Pulse Shape Measurement software is also available as a part of Jitter Generation and Measurement application. It can determine if the pulse shape fits within a “pulse mask” as specified by standards ITU G.703 and ANSI T1.102-1993.
- Generates intrinsic jitter without any errors as per G.823 standards
- Generates a user-defined jitter value against an input jitter tolerance mask to test the DUT's capability to tolerate large amounts of generated jitter
- In conjunction with Jitter Measurement, provides the peak-to-peak jitter value for a given frequency at the system output
- Evaluates jitter in real time on either a tick-by-tick or a cumulative basis
What is Jitter?
Jitter is the difference between the actual arrival time of a clock pulse and its theoretical arrival time.
- Jitter: Small amplitude and fast variations in time instants of clock pulses with frequency of variations above 10 Hz.
- Wander: Larger amplitude and slow variations in time instants of clock pulses with frequency of variations below 10 Hz.
- Drift: Very slow variations in a clock signal (below 1 Hz).
- Frequency Offset or Deviation: A permanent or steady-state difference in clock rates. It will eventually result in either frame slips (bit insertions or deletions) or in loss of synchronization of network elements.
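The definitions above can be made concrete with a small sketch. The following hypothetical Python example computes jitter as the deviation of each clock-edge arrival time from an ideal, evenly spaced clock (the time interval error); the timestamps and bit period are illustrative values, not output from the analyzer.

```python
# Hypothetical sketch: jitter as the deviation between actual and ideal
# clock-edge arrival times (time interval error, TIE).
# The edge timestamps below are made-up illustrative values.

NOMINAL_PERIOD_NS = 488.0  # E1 bit period: 1 / 2.048 MHz ~= 488 ns

def time_interval_error(edge_times_ns):
    """Return each edge's deviation (ns) from an ideal clock grid."""
    t0 = edge_times_ns[0]
    return [actual - (t0 + i * NOMINAL_PERIOD_NS)
            for i, actual in enumerate(edge_times_ns)]

# Edges arriving slightly early or late around the 488 ns grid:
edges = [0.0, 490.0, 975.0, 1466.0, 1953.0]
print(time_interval_error(edges))  # [0.0, 2.0, -1.0, 2.0, 1.0]
```

Fast variations of this deviation (above 10 Hz) are jitter; slow variations (below 10 Hz) are wander.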
Sources of Jitter
- Aging of clock and data recovery circuits,
- Thermal and loading effects,
- Doppler shifts, and
- Multiplexing / De-multiplexing from higher bit rate data streams
Jitter tolerance is a measurement that checks for error-free operation of equipment at a maximum specified level of input jitter. The results of this measurement are used to define equipment specifications in the form of a jitter tolerance mask. To find the maximum jitter tolerated, the amplitude of the jitter tester is increased at each frequency until transmission errors or alarms are detected.
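The sweep procedure just described can be sketched as follows. This is a hypothetical outline only: `dut_fails` stands in for the real error/alarm check on the equipment and is not a GL API call.

```python
# Hypothetical sketch of a jitter tolerance sweep: at a given frequency,
# the injected jitter amplitude is raised step by step until the DUT
# reports transmission errors or alarms. `dut_fails` is a placeholder
# for the real error/alarm check, not a GL API call.

def max_tolerated_ui(freq_hz, dut_fails, step_ui=0.1, max_steps=200):
    """Return the largest amplitude (in UI) the DUT handled without errors."""
    tolerated = 0.0
    for n in range(1, max_steps + 1):
        amplitude = round(n * step_ui, 6)  # avoid float drift in the sweep
        if dut_fails(freq_hz, amplitude):
            break
        tolerated = amplitude
    return tolerated

# Simulated DUT that errors once jitter exceeds 1.5 UI at this frequency:
print(max_tolerated_ui(20.0, lambda f, a: a > 1.5))  # 1.5
```

Repeating this at each mask frequency traces out the equipment's measured tolerance curve, which can then be compared against the standard mask.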
As a signal traverses a network, the jitter generated by each piece of equipment becomes the input jitter to the following equipment. If this jitter is amplified as it passes through the network, then it could exceed the jitter tolerance of subsequent equipment. Jitter transfer is a measure of how much jitter is transferred between input and output of network equipment.
GL's Jitter Generation application offers user-defined jitter masks that can be edited, allowing for jitter transfer measurements with increased jitter amplitude at low frequencies.
Input Jitter and Wander Tolerance (2048 Kbit/s)
The overall specification level of jitter and wander that can be accommodated by a 2048 kbit/s network interface is illustrated in the image below. The peak-to-peak phase amplitude specification above 10 Hz reflects the maximum permissible jitter magnitude in a digital network.
A jitter tolerance mask defines the boundary of jitter that equipment must tolerate. The tables below list the jitter tolerance masks defined by the standard specifications.
Specification of Jitter Tolerance Mask for T1
| Standard | Frequency | Amplitude |
|---|---|---|
| GR-499 Cat 1 | 10 Hz | 5 UI |
| | 500 Hz | 5 UI |
| | 8 kHz | 0.5 UI |
| | 40 kHz | 0.1 UI |
| GR-499 Cat 2 | 10 Hz | 10 UI |
| | 193 Hz | 10 UI |
| | 6.43 kHz | 0.3 UI |
| AT&T Pub 62411 (Dec. 90) | 1 Hz | 138 UI |
| | 4.9 Hz | 28 UI |
| | 300 Hz | 28 UI |
| | 10 kHz | 0.4 UI |
| | 100 kHz | 0.4 UI |
Specification of Jitter Tolerance Mask for E1
| Standard | Frequency | Amplitude |
|---|---|---|
| ITU-T G.823 | 20 Hz | 1.5 UI |
| | 2.4 kHz | 1.5 UI |
| | 18 kHz | 0.2 UI |
| | 100 kHz | 0.2 UI |
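A mask like the ITU-T G.823 table above is an amplitude-versus-frequency template, and a measured jitter point can be checked against it. The sketch below is hypothetical: it assumes the usual log-log interpolation between mask break points, which is a common convention for such templates rather than anything specific to GL's software.

```python
import math

# Hypothetical sketch: checking a measured jitter point against the
# ITU-T G.823 E1 tolerance mask tabulated above. Log-log interpolation
# between break points is an assumption here, not a GL API behavior.

G823_MASK = [(20.0, 1.5), (2400.0, 1.5), (18000.0, 0.2), (100000.0, 0.2)]

def mask_limit_ui(freq_hz, mask=G823_MASK):
    """Interpolate the mask amplitude (UI) at freq_hz, log-log."""
    if freq_hz <= mask[0][0]:
        return mask[0][1]
    if freq_hz >= mask[-1][0]:
        return mask[-1][1]
    for (f1, a1), (f2, a2) in zip(mask, mask[1:]):
        if f1 <= freq_hz <= f2:
            t = ((math.log10(freq_hz) - math.log10(f1))
                 / (math.log10(f2) - math.log10(f1)))
            return 10 ** (math.log10(a1) + t * (math.log10(a2) - math.log10(a1)))

def within_mask(freq_hz, amplitude_ui):
    """True if the measured point lies on or below the mask."""
    return amplitude_ui <= mask_limit_ui(freq_hz)

print(within_mask(1000.0, 1.0))   # True: within the 1.5 UI flat segment
print(within_mask(50000.0, 0.5))  # False: above the 0.2 UI segment
```

A T1 check would work the same way with the GR-499 or AT&T Pub 62411 break points substituted for `G823_MASK`.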
The Jitter Generation application includes built-in input jitter 'amplitude-versus-frequency' templates as per the transmission standards. These serve as a benchmark for the generated jitter. The jitter mask template can be customized with given Frequency (Hz), Amplitude (UI), and Frequency Offset values as shown in the figure below.
Once jitter is applied to the DUT beyond the tolerance level, the DUT starts to introduce frame errors due to the applied data transition jitter. This can be verified using the Bit Error Rate (BER) Tester. The line sync loss and violations are also indicated for each port in the Monitor T1/E1 Line status window as shown in the images below.
Measuring with the Jitter Measurement Application
Generated jitter can be measured using the Jitter Measurement software. The peak-to-peak jitter amplitude value in UI is computed as the highest cumulative jitter value minus the lowest cumulative jitter value, and is displayed under the Statistics and Spectral Analysis tabs.
For T1 systems operating at 1.544 Mbps, 1 UI equals 647 nanoseconds. For E1 systems operating at 2.048 Mbps, 1 UI equals 488 nanoseconds.
The spectrum tab displays the spectral analysis and peak amplitude value for the corresponding frequency.
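The peak-to-peak computation and the UI-to-nanosecond conversion described above can be sketched in a few lines. The cumulative jitter samples below are illustrative values, not analyzer output.

```python
# Hypothetical sketch: peak-to-peak jitter from cumulative jitter samples,
# with UI-to-nanosecond conversion at the T1/E1 line rates quoted above.
# The sample values are made up for illustration.

UI_NS = {"T1": 1e9 / 1.544e6,   # ~647.7 ns per UI at 1.544 Mbps
         "E1": 1e9 / 2.048e6}   # ~488.3 ns per UI at 2.048 Mbps

def peak_to_peak_ui(cumulative_jitter_ui):
    """Highest cumulative jitter value minus the lowest, in UI."""
    return max(cumulative_jitter_ui) - min(cumulative_jitter_ui)

samples = [0.02, -0.05, 0.11, 0.04, -0.08]
pp_ui = peak_to_peak_ui(samples)
print(round(pp_ui, 2))                # 0.19 (UI)
print(round(pp_ui * UI_NS["E1"], 1))  # the same jitter expressed in ns
```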
| Item No. | Item Description |
|---|---|
| XX012 | Pulse Mask Compliance, Jitter Generation, and Jitter Measurement (available as a part of basic applications in the T1 E1 analyzer). Easy, accurate, visual pulse shape and jitter measurement for T1 E1 signals |
| XTE001 | Dual T1 E1 Express (PCIe) Boards (requires additional licenses) |
| HTE001 | Universal T1/E1 Card with Basic Analyzer Software |
| PTE001 | tProbe™ T1E1 Analyzer |