Micrel's Tutorial on System Timing and Minimizing Timing Jitter
In today’s high-speed digital communications industry, timing variations in a signal have become a growing concern as designers push to maximize signal bandwidth while maintaining signal integrity. These timing variations are called jitter: the deviation of a timing event from its ideal location in time. Because large systems contain many high-speed components, a jitter ‘budget’ must be maintained in order to minimize the probability of incurring a bit error. Bit errors are commonly expressed as a bit error ratio (BER), defined as the ratio of errored bits to the total number of bits transmitted. Controlling jitter in the digital signal path plays a very important role in minimizing BER and thereby increasing system reliability. These systems transfer data at rates set by industry standards such as Fibre Channel, Gigabit Ethernet, and SONET, and as the demand for higher frequencies and data rates increases, so does the demand for lower jitter. As a result, it is important for both circuit and board designers to understand jitter, its sources, and its effects on a system.
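To make the BER figure concrete, here is a small illustrative calculation of our own (not taken from the Micrel tutorial; the function names and the example rate are assumptions) showing what a BER target implies about how often errors occur at a given line rate:

```python
# An illustrative calculation (our own sketch, not from the tutorial) of
# what a BER target implies in practice: the average time between bit
# errors on a link running at a given line rate.

def bit_error_ratio(errored_bits: int, total_bits: int) -> float:
    """BER = errored bits / total bits transmitted."""
    return errored_bits / total_bits

def seconds_per_error(ber: float, bit_rate_hz: float) -> float:
    """Mean time between bit errors at the given BER and line rate."""
    return 1.0 / (ber * bit_rate_hz)

# At the 1.25 Gb/s Gigabit Ethernet line rate, a BER of 1e-12 means one
# error roughly every 800 seconds (about 13 minutes).
print(f"{seconds_per_error(1e-12, 1.25e9):.0f} s between errors")
```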
In Part I of this two-part article, we will look at the basics of jitter and how it affects data and clock signals. Part I will also cover the major types of jitter.
The two major components of Total Jitter in a system are Deterministic Jitter (DJ) and Random Jitter (RJ). DJ can be broken down into subtypes, including Intersymbol Interference (ISI), Periodic Jitter (PJ), and Duty Cycle Distortion (DCD). RJ, by contrast, is due primarily to thermal noise and shot noise in semiconductors.
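This summary does not say how the two components combine, but a widely used approach is the dual-Dirac model, which estimates peak-to-peak total jitter at a target BER as TJ(BER) = DJ + 2·Q(BER)·RJ_rms, where Q(BER) is the Gaussian tail quantile. The sketch below is our own illustration of that model (the function names are assumptions, not from the tutorial):

```python
# A minimal sketch (our illustration, not from the Micrel tutorial) of the
# widely used dual-Dirac model for combining jitter components:
#
#     TJ(BER) = DJ + 2 * Q(BER) * RJ_rms
#
# where Q(BER) satisfies BER = 1 - Phi(Q) for the standard normal CDF Phi.
from statistics import NormalDist

def q_scale(ber: float) -> float:
    """Gaussian Q factor for a target bit error ratio."""
    return -NormalDist().inv_cdf(ber)

def total_jitter_pp(dj_pp: float, rj_rms: float, ber: float = 1e-12) -> float:
    """Peak-to-peak total jitter estimate at the given BER.

    dj_pp  -- deterministic jitter, peak-to-peak (e.g., ps)
    rj_rms -- random jitter, one-sigma RMS (same units)
    """
    return dj_pp + 2.0 * q_scale(ber) * rj_rms

# Example: 20 ps of DJ plus 2 ps RMS of RJ at BER = 1e-12 gives roughly
# 20 + 14.07 * 2 = 48.1 ps of total jitter (2*Q is about 14.069 at 1e-12).
print(f"{total_jitter_pp(20.0, 2.0):.1f} ps")
```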
The entire tutorial can be found at: http://www.micrel.com/hbw_news/Jitter_Article_Part%201.pdf