EVAL-ADF7021DBZ5 Datasheet, PDF (31/64 Pages) Analog Devices – High Performance Narrow-Band Transceiver IC
Data Sheet
ADF7021
The calibration algorithm adjusts the filter center frequency
and measures the RSSI 10 times during the calibration. The
time for an adjustment plus RSSI measurement is given by
IF Tone Calibration Time = IF_CAL_DWELL_TIME/SEQ_CLK
It is recommended that the IF tone calibration time be at least
500 µs. The total time for the IF filter fine calibration is given by
IF Filter Fine Calibration Time = IF Tone Calibration Time × 10
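As a numerical illustration of the two formulas above, the sketch below assumes SEQ_CLK = 100 kHz and an IF_CAL_DWELL_TIME of 50; both values are example choices, not register defaults.

```python
# Illustrative check of the IF filter calibration timing.
# SEQ_CLK and IF_CAL_DWELL_TIME values here are assumptions for the example.

def if_tone_cal_time(if_cal_dwell_time, seq_clk_hz):
    """IF Tone Calibration Time = IF_CAL_DWELL_TIME / SEQ_CLK."""
    return if_cal_dwell_time / seq_clk_hz

def if_fine_cal_time(tone_cal_time):
    """Fine calibration runs 10 adjust-and-measure cycles."""
    return tone_cal_time * 10

# SEQ_CLK = 100 kHz with a dwell value of 50 gives 500 us per tone,
# meeting the recommended minimum of 500 us.
tone = if_tone_cal_time(50, 100e3)
assert tone >= 500e-6
total = if_fine_cal_time(tone)  # 10 cycles -> 5 ms total fine calibration
```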
RSSI/AGC
The RSSI is implemented as a successive compression log amp
following the baseband (BB) channel filtering. The log amp
achieves ±3 dB log linearity. It also doubles as a limiter, converting
the signal to digital levels for the FSK demodulator. The offset
correction circuit is clocked by a baseband offset clock whose
frequency, set by the BBOS_CLK_DIVIDE bits (R3_DB[4:5]), should be
between 1 MHz and 2 MHz.
The RSSI level is converted for user readback and for digitally
controlled AGC by an 80-level (7-bit) flash ADC. This level can
be converted to input power in dBm. By default, the AGC is on
when powered up in receive mode.
[Figure 45. RSSI Block Diagram: offset correction, cascaded limiting amplifier stages, latch to the FSK demodulator, and RSSI ADC]
RSSI Thresholds
When the RSSI is above AGC_HIGH_THRESHOLD
(R9_DB[11:17]), the gain is reduced. When the RSSI is
below AGC_LOW_THRESHOLD (R9_DB[4:10]), the gain
is increased. AGC_HIGH_THRESHOLD and AGC_LOW_THRESHOLD default to
70 and 30, respectively, on power-up in receive mode. A delay (set by
AGC_CLK_DIVIDE, R3_DB[26:31]) is programmed to allow for settling of
the loop after each gain change. A value of 10 is recommended.
The user has the option of changing the two threshold values
from the defaults of 30 and 70 (Register 9). The default AGC
setup values should be adequate for most applications. The
threshold values must be chosen to be more than 30 apart for
the AGC to operate correctly.
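The threshold behavior described above can be sketched as a simple decision function. The code below is a hypothetical model using the power-up defaults of 30 and 70; the names are illustrative and are not part of any device driver API.

```python
# Hypothetical model of the AGC threshold decision (defaults from the text).
AGC_LOW_THRESHOLD = 30    # R9_DB[4:10] power-up default
AGC_HIGH_THRESHOLD = 70   # R9_DB[11:17] power-up default

# The text requires the two thresholds to be more than 30 apart.
assert AGC_HIGH_THRESHOLD - AGC_LOW_THRESHOLD > 30

def agc_action(rssi_level):
    """Return the gain adjustment implied by the measured RSSI level."""
    if rssi_level > AGC_HIGH_THRESHOLD:
        return "reduce gain"
    if rssi_level < AGC_LOW_THRESHOLD:
        return "increase gain"
    return "hold gain"
```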
Offset Correction Clock
In Register 3, the user should set the BBOS_CLK_DIVIDE bits
(R3_DB[4:5]) to give a baseband offset clock (BBOS CLK)
frequency between 1 MHz and 2 MHz.
BBOS CLK [Hz] = XTAL/(BBOS_CLK_DIVIDE)
where BBOS_CLK_DIVIDE can be set to 4, 8, 16, or 32.
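For example, the divider selection can be checked programmatically. The sketch below assumes a 19.68 MHz crystal, which is a hypothetical value chosen for illustration, not a requirement of the design.

```python
# Choose BBOS_CLK_DIVIDE so that BBOS CLK = XTAL / divider falls in
# the required 1 MHz to 2 MHz range. The 19.68 MHz crystal frequency
# is an assumption for this example.
XTAL_HZ = 19.68e6
ALLOWED_DIVIDERS = (4, 8, 16, 32)

def pick_bbos_divider(xtal_hz):
    """Return the first allowed divider giving a BBOS CLK in 1..2 MHz."""
    for div in ALLOWED_DIVIDERS:
        bbos_clk = xtal_hz / div
        if 1e6 <= bbos_clk <= 2e6:
            return div, bbos_clk
    raise ValueError("no divider gives a BBOS CLK between 1 MHz and 2 MHz")

div, bbos = pick_bbos_divider(XTAL_HZ)  # divider 16 -> 1.23 MHz
```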
AGC Information and Timing
AGC is selected by default and operates by setting the appropriate
LNA and filter gain settings for the measured RSSI level. It is
possible to disable AGC by writing to Register 9 if the user wants to
enter one of the modes listed in Table 14. The time for the AGC
circuit to settle and, therefore, the time it takes to measure the RSSI
accurately, is typically 300 µs. However, this depends on how many
gain settings the AGC circuit has to cycle through. After each gain
change, the AGC loop waits for a programmed time to allow
transients to settle. This AGC update rate is set according to
AGC Update Rate [Hz] = SEQ_CLK [Hz]/AGC_CLK_DIVIDE
where:
AGC_CLK_DIVIDE is set by R3_DB[26:31]. A value of 10 is
recommended.
SEQ_CLK = 100 kHz (set by the SEQ_CLK_DIVIDE bits, R3_DB[18:25]).
By using the recommended setting for AGC_CLK_DIVIDE, the
total AGC settling time is
AGC Settling Time [sec] = Number of AGC Gain Changes/AGC Update Rate [Hz]
The worst case for AGC settling is when the AGC control loop
has to cycle through all five gain settings, which gives a maximum
AGC settling time of 500 µs.
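The worst-case figure above can be reproduced from the two formulas. The sketch below uses SEQ_CLK = 100 kHz and the recommended AGC_CLK_DIVIDE of 10, both taken from the text:

```python
# Reproduce the worst-case AGC settling time from the formulas in the text.
SEQ_CLK_HZ = 100e3     # sequencer clock (100 kHz per the text)
AGC_CLK_DIVIDE = 10    # recommended value (R3_DB[26:31])

agc_update_rate = SEQ_CLK_HZ / AGC_CLK_DIVIDE  # 10 kHz -> 100 us per step
worst_case_settling = 5 / agc_update_rate      # five gain changes -> 500 us
assert abs(worst_case_settling - 500e-6) < 1e-12
```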
Table 14. LNA/Mixer Modes

Receiver Mode                   | LNA_MODE (R9_DB25) | LNA_GAIN (R9_DB[20:21]) | MIXER_LINEARITY (R9_DB28) | Sensitivity (dBm) (2FSK, DR = 4.8 kbps, fDEV = 4 kHz) | Rx Current Consumption (mA) | Input IP3 (dBm)
High Sensitivity Mode (Default) | 0                  | 30                      | 0                         | −118                                                  | 24.6                        | −24
Enhanced Linearity High Gain    | 0                  | 30                      | 1                         | −114.5                                                | 24.6                        | −20
Medium Gain                     | 1                  | 10                      | 0                         | −112                                                  | 22.1                        | −13.5
Enhanced Linearity Medium Gain  | 1                  | 10                      | 1                         | −105.5                                                | 22.1                        | −9
Low Gain                        | 1                  | 3                       | 0                         | −100                                                  | 22.1                        | −5
Enhanced Linearity Low Gain     | 1                  | 3                       | 1                         | −92.3                                                 | 22.1                        | −3
Rev. B | Page 31 of 64