Simulating the Effect of Blockers on Data Converter Performance in Wideband Receivers
Many different standards for wireless communications equipment are in use today. Narrowband standards concentrate higher transmission power in a small slice of bandwidth, while wideband standards spread lower transmission power across a larger bandwidth. Each standard defines minimum performance characteristics for receivers, including specifications such as bandwidth, maximum signal level, and sensitivity. GSM is one narrowband example; its channel bandwidth is 200 kHz. A GSM receiver must have a minimum sensitivity of –104 dBm and be able to tolerate a –13 dBm signal at the antenna. In contrast, CDMA2000 is a wideband standard that uses a 1.25 MHz bandwidth. CDMA2000 receivers must have a minimum sensitivity of –117 dBm/1.25 MHz and tolerate a maximum signal of –30 dBm at 900 kHz offset (1).
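As a back-of-the-envelope check, sensitivity figures like these follow from the thermal noise floor (kTB, about –174 dBm/Hz at room temperature) plus the channel bandwidth. A minimal sketch (the function name and the margin interpretation are illustrative, not from any standard document):

```python
import math

def noise_floor_dbm(bandwidth_hz, ktb_dbm_hz=-174.0):
    """Thermal noise floor in dBm for a given bandwidth (kTB at ~290 K)."""
    return ktb_dbm_hz + 10 * math.log10(bandwidth_hz)

# GSM: 200 kHz channel -> thermal floor near -121 dBm
gsm_floor = noise_floor_dbm(200e3)
# Margin between the floor and the -104 dBm sensitivity spec; this budget
# must cover receiver noise figure plus the SNR the demodulator needs.
gsm_margin = -104 - gsm_floor

# CDMA2000: 1.25 MHz channel -> thermal floor near -113 dBm. The -117 dBm
# sensitivity spec sits *below* this floor; despreading processing gain
# is what makes that possible.
cdma_floor = noise_floor_dbm(1.25e6)
```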
In the block diagram of a receiver (Figure 1), the analog portion of the receive signal chain consists of:
- Antenna. Selective to the spectrum of interest, providing some attenuation of out-of-band interference.
- Down-conversion. Translates the received signal to a lower frequency that can be amplified and converted to digital information. This can be accomplished with one or two mixing stages. A single mixer stage can reduce the signal’s center frequency from a few gigahertz to a couple hundred megahertz. A second mixer stage can translate the signal down to tens of megahertz. Using a single mixer stage saves board space and the expense of the second PLL, VCO, and filter. However, the higher IF requires an amplifier and analog-to-digital converter (ADC) with very good performance at high frequencies.
- VGA. A variable-gain amplifier is used to adjust the gain of the circuit depending on the received signal strength.
- Anti-aliasing filter. Attenuates signals outside the band of interest.
Large-scale blockers interfere with converting the desired signal in two ways:
- An ADC’s input range is specified as some amplitude, typically 1 V or 2 V peak-to-peak, or in terms of dBm. This input range limits how much gain the VGA can apply to the signal before the ADC starts to distort or clip. An rms detector or log amp is frequently used to determine the amplitude of the composite signal, and thus how much gain the VGA should apply in order to use the ADC’s full input range. Large-scale blockers increase the amplitude of the received signal, limiting the amount of gain that can be applied before saturating the input of the ADC. In-band blockers therefore effectively limit the amplitude of the desired signal: if no blocker were present, more gain could be applied to help overcome the noise floor of the ADC.
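This gain-limiting effect can be quantified: the VGA gain ceiling is set by the strongest component of the composite signal, not by the desired signal. A hypothetical sketch (the signal levels, full-scale figure, and back-off are illustrative assumptions, not values from the article):

```python
def max_vga_gain_db(adc_full_scale_dbm, strongest_input_dbm, backoff_db=1.0):
    """Largest VGA gain before the strongest signal component clips the ADC.

    backoff_db leaves a little headroom below full scale for signal peaks.
    """
    return adc_full_scale_dbm - backoff_db - strongest_input_dbm

# Desired signal at -80 dBm into an ADC whose full scale corresponds to +4 dBm
no_blocker = max_vga_gain_db(4.0, -80.0)    # gain limited only by the signal
with_blocker = max_vga_gain_db(4.0, -30.0)  # a -30 dBm blocker caps the gain

# The desired signal arrives at the ADC this much lower with the blocker
# present, so it sits that much closer to the converter's noise floor.
lost_headroom = no_blocker - with_blocker
```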
- Nonlinearities within the ADC will create intermodulation distortion products within the desired signal band, thus negatively impacting spurious-free dynamic range (SFDR) performance. Intermodulation distortion occurs when two frequencies mix together and create signals at the sum and difference of the input frequencies. For example, applying two tones to the input of an ADC, where F1 = 50 MHz and F2 = 52 MHz, will create new signals at 2 MHz (F2–F1), 102 MHz (F1+F2), 54 MHz (2F2–F1), 48 MHz (2F1–F2), and so on. The amplitude of these new signals depends on the linearity of the ADC and the amplitude of the input signals. The same phenomenon happens when a large in-band blocker mixes with another blocker or with the desired signal: new signals are created, and they can be very close to the signal of interest.
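The second- and third-order two-tone products can be enumerated directly. A small sketch reproducing the 50 MHz / 52 MHz example above (the function name is ours):

```python
def imd_products(f1, f2):
    """Second- and third-order two-tone intermodulation frequencies."""
    return {
        "f2-f1": abs(f2 - f1),   # second-order difference product
        "f1+f2": f1 + f2,        # second-order sum product
        "2f2-f1": 2 * f2 - f1,   # third-order product, lands near the tones
        "2f1-f2": 2 * f1 - f2,   # third-order product, lands near the tones
    }

products = imd_products(50, 52)
# -> {'f2-f1': 2, 'f1+f2': 102, '2f2-f1': 54, '2f1-f2': 48}
```

Note that the two third-order products (54 MHz and 48 MHz) fall right beside the original tones, which is why they are the troublesome ones in-band.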
Knowing that these blockers can degrade receiver sensitivity, how does an engineer know whether a design will meet the required specifications? There are a couple of ways to find out (aside from trial and error). The ADC’s signal-to-noise ratio (SNR) and SFDR performance can be used to calculate the effect of in-band blockers on a receiver. However, the ADC’s performance is specified only at certain sample rates and input frequencies, which may not match your application, and those specifications are measured with pure sine-wave inputs rather than the signals received from an antenna. Alternatively, a software package such as Analog Devices’ (ADI) VisualAnalog can simulate ADC performance with a real-world waveform at any sample rate.
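To illustrate the first approach, the ADC noise that falls into one channel can be estimated from the datasheet SNR plus the processing gain of filtering the Nyquist band down to the channel bandwidth. A hedged sketch (the 72 dBFS SNR figure is an illustrative assumption, not a datasheet value):

```python
import math

def adc_noise_in_channel_dbfs(snr_dbfs, sample_rate_hz, channel_bw_hz):
    """Estimate the ADC noise falling into one channel, in dBFS.

    SNR is specified over the full Nyquist band (fs/2); digitally filtering
    down to the channel bandwidth recovers 10*log10((fs/2)/BW) of
    processing gain.
    """
    processing_gain_db = 10 * math.log10((sample_rate_hz / 2) / channel_bw_hz)
    return -snr_dbfs - processing_gain_db

# Hypothetical 14-bit ADC with 72 dBFS SNR sampling at 125 MSPS,
# evaluated over a 1.25 MHz CDMA2000 channel
in_channel_noise = adc_noise_in_channel_dbfs(72.0, 125e6, 1.25e6)
```

Comparing this in-channel noise against the desired signal level at the ADC input (after the blocker has limited the VGA gain) gives the SNR available to the demodulator.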
Figure 2 shows the results of using a VisualAnalog canvas to evaluate the AD9246, a 14-bit, 125 MSPS ADC. By creating a signal similar to a CDMA2000 waveform, with and without an in-band blocker, we can use the tool to analyze the ADC’s performance. The same canvas file (Figure 3) can be easily modified to accept another waveform (W-CDMA, GSM900) or to evaluate a different ADC using another ADC model.
Figure 3. VisualAnalog canvas used to analyze the effect of a blocker on ADC performance.
Wideband communication standards such as CDMA2000 require receivers that can tolerate large in-band blockers. Although it is possible to calculate the effect of the ADC’s SNR and SFDR on receiver sensitivity, the datasheet may not specify ADC performance at the desired sample rate and input frequency. New simulation software makes it possible to quickly and easily evaluate different ADCs with real-world signals, ultimately reducing design risk.
(1) Brannon, Brad, and Bill Schofield. AN-808, “Multicarrier CDMA2000 Feasibility.” Analog Devices, 2006.
(2) Brannon, Brad. “Correlating High-Speed ADC Performance to Multicarrier 3G Requirements.” RF Design, June 2003, pp. 22–28.
Pamela Aparo is a marketing manager in the High Speed Converters Group at Analog Devices, Inc. Pam has worked in marketing and applications roles with ADI for seven years. Prior to ADI, she applied her engineering degree to medical applications working for Baystate Medical Center and OEC Medical Systems. Pam earned a BSEE from the University of Connecticut.

Michael Sink is an applications engineer in the High Speed Converters Group at Analog Devices. He has been with ADI for five years. Michael earned a BSEE from North Carolina State University.

For more information, contact Analog Devices, One Technology Way, Box 9106, Norwood, MA 02062-9106; (800) 262-5643; www.analog.com