All posts by Simon

PLL characterization – final results for the Micro-Tel SG-811 and Micro-Tel 1295 circuits

After some experimentation, measurements, etc. – as described before, time to wrap it up.

The PLL loop filter output is now connected to the phase lock input (the additional 1 k/100 n low-pass in the earlier schematic has been omitted), with a 330 Ohm resistor in series. This resistor will remain in the circuit, because it’s handy for characterizing the loop, and it provides a bit of protection for the opamp output, in case something goes wrong, to give it a chance to survive.

With the charge pump current adjustments now implemented in the software, here are the results – all pretty stable and constant over the full range.

The SG-811 signal source
micro-tel sg-811 pll bandwith vs frequency

The 1295 receiver
micro-tel 1295 pll bandwidth vs frequency

Micro-Tel SG-811 PLL: frequency response
Gain
sg-811 final gain

Phase
sg-811 final phase

Micro-Tel 1295: frequency response
Gain
1295 pll final gain

Phase
1295 pll final phase

PLL frequency response measurement: a ‘not so fancy’ approach, for every lab

Measuring the gain and phase shift of some device doesn’t seem like a big deal, but still, how is it actually done? Do you need fancy equipment? Or is this something of value for all PLL designers who don’t just want to rely on trial and error?

The answer – it’s actually fairly easy, and can be done in any workshop that has these items around:

(1) A simple function generator (sine) that can deliver frequencies around the bandwidth of the PLL you are working with. The output level should be adjustable; a coarse adjustment (pot) is enough. You will need about 1 Vpp max for most practical cases.

(2) A resistor; its value should be considerably lower than the input impedance of the VCO. Typical VCOs might have several tens of kOhm input impedance. Otherwise, put a unity-gain opamp (e.g., OPA184) between the resistor and the VCO tune input.

(3) A resistor and some capacitors (the values depend a bit on the bandwidth); for general purposes, 10-100 kHz, a parallel configuration of a 100 n and a 2.2 µF cap is just fine, in series with a resistor of a few kOhms. This network is used to feed a little bit of disturbance to the VCO, to see how the loop reacts to it… the whole purpose of this exercise (see the impedance sketch after this list).

(4) Make sure that the loop filter has a low output impedance (opamp output). If your circuit uses a passive network as the loop filter, add a unity-gain opamp to provide a low output impedance.

(5) A scope – any type will do; best take one with an X-Y input.
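To get a feel for the injection network of item (3), here is a minimal impedance check – the 3.3 kOhm series value below is just my own placeholder, not a prescribed value:

```python
# Impedance of the disturbance-injection network vs. frequency.
# Parallel capacitors simply add; the series resistor dominates at
# higher frequencies. R is a placeholder, pick to suit your loop.
import math

R = 3.3e3            # series resistor, Ohm (assumption)
C = 100e-9 + 2.2e-6  # 100 nF || 2.2 uF, F

for f in (10, 100, 1e3, 10e3, 100e3):  # Hz
    xc = 1.0 / (2 * math.pi * f * C)   # capacitive reactance, Ohm
    z = math.hypot(R, xc)              # |R - j*Xc|
    print(f"{f:8.0f} Hz: |Z| = {z / 1e3:6.2f} kOhm")
```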

Quick schematic:
pll gain phase measurement diagram

To perform the actual measurements, the setup is powered up, and phase lock is established by setting the dividers appropriately, as commonly done.
The signals (X: drive = input to the VCO; Y: response = output of the loop filter) are connected to the scope. Set the scope to X-Y mode, AC-coupled inputs, and the SAME scale (V/div) on X and Y.

Next, set the signal generator to a frequency around the range of the expected 0 dB bandwidth (unity-gain bandwidth), and adjust the amplitude to a reasonable value (making sure that the PLL stays perfectly locked!). The amplitude should be several times larger than the background; this will make the measurements easier, and more accurate. If you have a spectrum analyzer, you can check for FM modulation. On the Micro-Tel 1295, which has a small ‘spectrum scan’ scope display, it looks like this:
1295 fm modulated signal during gain-phase test

On the X-Y scope display, depending on where you are with the frequency, it should show the shape of an ellipse, somewhat tilted – examples of the pattern (“Lissajous pattern”) below.

Frequency lower than 0 dB bandwidth – in other words, the loop has positive gain, therefore, Y amplitude (output) will be larger than X (input)
pll gain phase measurement - positive gain (frequency below BW)

Frequency higher than 0 dB bandwidth – in other words, the loop has negative gain, therefore, Y amplitude (output) will be smaller than X (input)
pll gain phase measurement - negative gain (frequency above BW)

And finally, same signal amplitude in X and Y direction.
pll gain phase measurement - 0 dB condition

Sure enough, you don’t need to use the X-Y mode and its elliptic patterns – any two-channel representation of the signals will do, as long as their amplitudes are measured, and the frequency identified at which X and Y have equal amplitude (on the X-Y screen, also check the graticule, because the 45 degree angle is not so easy to judge accurately). That’s the unity-gain (0 dB bandwidth) frequency we are looking for. With little effort, the frequency can be measured to about 10 Hz.
The X-Y method has the big advantage that it relies on the full signal, not just certain points – and triggering on a PLL signal with a lot of noise can be an issue.

Try to keep the amplitude stable over the range of frequencies measured – by adjusting the signal gen.

Ideally, the 0 dB bandwidth is measured at various frequencies over the full band of your VCO, because the bandwidth can change with the tuning sensitivity, etc., of the VCO.

The 0 dB bandwidth is not the only information that can be extracted – the phase shift is also easily accessible. Just measure, at the unity-gain frequency, or any other frequency of interest to you, the lengths of the black and red lines:
pll gain phase measurement - 0 dB condition - phase determination

The phase angle is then calculated as follows: divide the length of the red line by the length of the black line, in this case 4.6/6.9 units, then apply the inverse sine function to get the phase angle: sin^-1(4.6/6.9) = 41.8 degrees. The 0 dB frequency, in this case, was 330 Hz.
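In code, the same arithmetic (a minimal sketch of the example above):

```python
# Phase from the Lissajous ellipse: red line = Y opening at X = 0,
# black line = full Y amplitude; phase = asin(red / black).
import math

red, black = 4.6, 6.9  # screen units, from the example above
phase = math.degrees(math.asin(red / black))
print(f"phase shift = {phase:.1f} degrees")  # -> 41.8 degrees
```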

A quick comparison with the data acquired using a more sophisticated method, an HPAK 3562A Dynamic Signal Analyzer.

Gain: 0 dB at 329 Hz – that’s close!
pll test result - gain

Phase: 38.7 degrees – fair enough.
pll test result - phase

A proper PLL setup should provide at least 20 degrees of phase shift (note that this is not the so-called phase margin, which is a property of the open loop). Closer to 0 degrees, the loop will remain stable, but the result will be a lot of noise (phase noise), oscillation, and finally, occasional loss of lock.

It’s also a good idea to check that the gain function drops off nicely – there are certain cases where multiple 0 dB points exist; you need to look for the 0 dB point at the highest frequency.

Any questions, or if you need something measured, let me know.

Fractional-N PLL for the Micro-Tel 1295 receiver: some progress, more bandwidth, two extra capacitors, and a cut trace for the SG-811

Step 1 – Programming the ADF4157: no big issue – fortunately, all well documented in the datasheet. The 1.25 MHz phase detector frequency selected will allow tuning in integer-only (no fractional divider) 10 MHz steps (considering the :8 ADF5002 prescaler).
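A quick sanity check of the divider arithmetic (just the step-size math; the actual register programming follows the datasheet and is not shown here):

```python
# With the :8 ADF5002 prescaler ahead of the ADF4157 and a 1.25 MHz phase
# detector frequency, each integer-N step moves the LO by 8 * 1.25 MHz = 10 MHz.
F_PFD = 1.25e6   # phase detector frequency, Hz
PRESCALER = 8    # external ADF5002 divider

def divider_for(f_lo_hz):
    """Integer N seen by the ADF4157 for a given LO frequency."""
    return f_lo_hz / (PRESCALER * F_PFD)

for f_lo in (2e9, 10e9, 18e9):
    print(f"LO {f_lo / 1e9:4.0f} GHz -> N = {divider_for(f_lo):.0f}")
```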

One significant difference to the ADF41020 – the ADF4157 uses 16 steps for the charge pump current control (0 = 0.31 mA, to 15 = 5.0 mA).

Step 2 – Checking for lock at various frequencies – in particular, at the low frequencies – the thing is really running at the low edge, 250 MHz input for the ADF4157. However, despite all concerns, no issues; prescaler and PLL are working well even at the low frequency. Quite a bit of noise, though! And no, the photo is not out of focus…
1295 noisy signal

The PLL is locking fine, but still, there is significant noise in the loop, also visible in the 1295 scope display – with a very clean signal supplied to the receiver… bit of a mystery. When the PLL is disengaged, and the 1295 manually tuned – no noise, just some slow drift.

Step 3 – Increased the loop bandwidth to about 8 kHz: even more noise – it seems the PLL is working against a noisy, FM-modulated source… a mystery. Checked all cables; nothing changes when I move them around.

Step 4 – Some probing inside the 1295, and a review of the signal path for the PLL tune and coarse tune voltages. And, big surprise – there is a relay (K1) on the YIG driver board, and this disengages a low-pass in the coarse tune voltage line – a 499 k/22 µF RC, several seconds time constant.
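A quick check of that time constant:

```python
# Time constant and corner frequency of the 499 k / 22 uF coarse-tune low-pass.
import math

R, C = 499e3, 22e-6            # Ohm, F
tau = R * C                    # ~11 seconds
f_c = 1 / (2 * math.pi * tau)  # -3 dB corner, ~14.5 mHz
print(f"tau = {tau:.1f} s, corner = {f_c * 1e3:.1f} mHz")
```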

See the red-framed area:
micro-tel 1295 A3B9 YIG driver loop damping

Tackling this with a low-pass in the coarse tune feed line (from the coarse tune DAC) didn’t change a thing – the noise is getting into the YIG driver from instrument-internal sources, or partly from the opamp (U5, LM308) itself, when it is left running at full bandwidth. As a side comment, note the power amplifier – it is an LH0021CK 1 Amp opamp, in a very uncommon 8-lead TO-3 package. Hope this will never fail.

Usually, I don’t want to modify test equipment of this nature, because there is nothing worse than badly tampered-with high grade test equipment. All conviction aside, two X7R capacitors, 100 n each, were soldered in parallel to the R38 resistor, so there will be some bandwidth limitation of the YIG driver, even with the K1 relay open.
micro-tel A3B9 YIG driver board - modified

With these in place – the noise issue is gone.
1295 clean signal

Now, triggered by this discovery – the SG-811 uses a very similar YIG driver board, which also has a low-pass engaged in CW mode – however, not in the remotely controlled CW mode with externally settable frequency… easy enough: just one of the logic traces cut, and now the filter stays in – I don’t plan on sweeping it with a fast-acting PLL anyway.

Back to the fractional-N loop: after some tweaking, the current loop response seems quite satisfactory. Set to 3 kHz for now, with plenty of adjustment margin, using the 16-step charge pump current setting of the ADF4157. Getting 45 degrees of phase shift (closed loop) at 3 kHz – therefore, it should also work at higher bandwidths. Will see if this is necessary.

PLL gain
1295 fractional-n loop mag

PLL phase
1295 fractional-n loop phase

R820T, RTL2832U SDR USB stick: some more findings about temperature sensitivity.

Another hack you can do with the SDR USB sticks – mount the clock crystal (28.8 MHz) remotely, feed a stable RF signal at any frequency you desire, and use it as a thermometer. This will work great at room temperature and up to about 50 deg C, but beware, the reading will be ambiguous at higher temperatures. Here is a quick experiment:

Signal was supplied at 1000 MHz, from a virtually perfectly stable source. At some point, the crystal was touched with a finger (making sure not to touch any of the traces or components, as this could capacitively affect the oscillator). Surprisingly, the frequency first goes down (-0.6 ppm), and then up (+2.5 ppm).
temp effect (hot to cold to hot) ppm

Why is this so surprising? Let’s have a quick look at typical crystal oscillators. There are several types: AT-cut (the most common), SC-cut, the closely related IT-cut, and others, less common. So far, I have come across AT-cut almost exclusively, for all kinds of cheap clock applications. AT-cut means: inversion point at room temperature, then the frequency decreases a bit with temperature, to about 60 degrees C, then it increases again. See the black curve in this diagram (note that the absolute values are just typical, arbitrary, and change with cut angle tolerance):
r820t quartz ref osc temp dependence
The red line shows the typical characteristic of an SC (or IT)-cut crystal.
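For illustration only, the AT-cut curve is commonly modeled as a cubic in temperature; the coefficients below are made-up ballpark numbers chosen to reproduce the qualitative shape (inversion near room temperature, minimum near 60 deg C), not values from any datasheet:

```python
# Illustrative AT-cut frequency-temperature model: a cubic with its
# inflection near room temperature. Coefficients are hypothetical.
T_INFL = 28.0   # inflection temperature, deg C (assumption)
A1 = -0.3       # ppm/K, linear term (depends on cut angle)
A3 = 1.0e-4     # ppm/K^3, cubic term

def df_ppm(t_c):
    dt = t_c - T_INFL
    return A1 * dt + A3 * dt**3

for t in (0, 25, 40, 60, 80):
    print(f"{t:3d} degC: {df_ppm(t):+7.2f} ppm")
```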

Rather than the expected AT-typical transition through a minimum frequency when going from about 60-70 deg C down to 35 deg C (body temperature), we observe just the opposite behavior (more like SC-cut) – the reference frequency goes through a maximum (which appears as a minimum of the displayed signal frequency, since the signal is a constant 1000 MHz).

So it seems the manufacturer did actually consider the relatively high operating temperature of this device when selecting the crystal, which is running at about 60 degrees C – just about the temperature of the board/crystal.
An AT-cut would have been rather counter-productive, because it is optimized for constant frequency over a broad range of temperatures, say, -10 to 60 deg C. For the SDR USB stick, however, the temperature-related frequency change should be minimal at and around the hot operating condition of the board – I don’t think this is just coincidence; somebody actually put some thought into getting the device frequency stable, without any ovens or other compensation circuits.

R820T, RTL2832U SDR USB stick: gain accuracy tests

The R820T has the nice feature of a built-in pre-amplifier, 0 dB to 49.6 dB nominal gain. Now, the question is, with all these nominal values, what is the actual gain, and how does it change with frequency?

With the established setup – the frequency-stabilized SDR USB stick (28.8 MHz supplied by an HPAK 8662A, at 500 mV level) and the 8642B source – the gain of the R820T was set to the various values, step by step, and the RF input level was varied to keep the SDRSharp FFT peak level at exactly -25 dB. The -25 dB reading can be taken to about ±0.2 dB when looking at the FFT display.
The test was carried out at two frequencies, namely 141 MHz and 1000 MHz. Don’t do such an evaluation anywhere close to multiples of 28.8 MHz – there are some reference-related spurs that can affect the accuracy.
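The “actual gain” arithmetic is plain substitution; here is a minimal sketch with placeholder numbers (not the measured dataset):

```python
# Keep the FFT peak at -25 dB and record the RF input power needed at each
# gain setting; actual gain = input power at the 0 dB reference setting
# minus input power at the setting under test. Numbers are placeholders.
p_in_dbm = {   # nominal gain (dB) -> input power (dBm) for a -25 dB reading
    0.0: -30.0,    # reference setting
    20.7: -51.5,
    49.6: -79.0,
}

p_ref = p_in_dbm[0.0]
for g_nom in sorted(p_in_dbm):
    g_act = p_ref - p_in_dbm[g_nom]
    print(f"nominal {g_nom:5.1f} dB -> actual {g_act:5.1f} dB")
```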

First, the RF input power needed to get a -25 dB reading:
r820t rf input power at -25 dB vs nominal gain
Interestingly, at 0 dB gain, a bit more power is needed at 141 MHz to get the -25 dB reading, which means the R820T is a little bit less sensitive at 141 MHz than it is at 1000 MHz – but only at the 0 dB gain setting. At higher gains, the data more or less coincide.

Note also that the 43.9 dB and 44.5 dB gain settings actually have identical gain! No idea why.

r820t acutal gain vs nominal gain
These are the actual gains, calculated from the above data, vs. the nominal gain. Pretty linear, but clearly some positive deviation at the low gains.

The full dataset:
r820t rtl2832u sdr usb gain check

This is even more clearly visible in the deviation plot:
r820t gain error vs nominal gain

Accordingly, the preamp provides a bit more gain at lower frequencies, say 141 MHz, especially when set to high gain, above 35 dB nominal. Below 35 dB, the gains for 141 and 1000 MHz are virtually identical.

If you have an SDR USB stick of a different type, and want some gains, levels, etc. measured, just let me know! I might be interested.

R820T, RTL2832U: narrow-range linearity

Some more linearity tests, now over a narrow range, with a precisely linear and calibrated source, measured at 1000 MHz; otherwise, everything is the same as in the earlier post.

These tests were done at a 0 dB gain setting, and the input power was varied in 1 dBm steps.
r820t gain linearity (narrow range)

r820t linearity deviation (narrow range)
The results speak for themselves – the SDR USB stick is pretty accurate if you want to do a relative comparison of power levels over a range of 10 dB or so. Accuracy might be in the range of ±0.5 dB, or better, provided that the slope has been determined for the given gain setting (don’t change the gain setting if you need to do accurate measurements). If you want to measure insertion loss, the best way is to vary the source power in calibrated steps (with a 1 dB high precision attenuator), and just use the SDR USB at a constant dB reading, over a narrow range of ±1 dB; then your measurement error will be limited to virtually the precision attenuator error only. For such tests, it is always good practice to have a 6 dB or 10 dB attenuator at the SDR USB input, to avoid artifacts caused by the non-negligible return loss (SWR) of the SDR USB stick input.

An open item – to go further, one would need to check the linearity at various frequencies of interest, etc., but this is all beyond the scope of the SDR USB stick – when it comes to below-0.5 dB accuracy, that’s something better done in a cal lab anyway, and nothing that is needed too frequently out in the field.
For comparison, with a high precision dedicated level measurement receiver (like the Micro-Tel 1295), the achievable relative level accuracy (not linearity) over 10 dB is about 0.03-0.05 dB – over a much wider frequency range. Also note that most commonly available spectrum analyzers (Rigol, etc.) aren’t all that linear; see their specs.

R820T, RTL2832U SDR USB stick: using it as a “poor man’s” spectrum analyzer – level linearity, level accuracy

There are some good reasons to always carry one of the SDR USB sticks around – it’s a great little spectrum analyzer. But hold on, what does it have in common with the purpose-built professional analyzers selling for USD 1k or more? It certainly has one particular advantage: the SDR USB stick is very small, a mere 10 grams, and it only needs about 1 Watt of power. And it covers the full span of frequencies of general interest (at least if you add an upconverter, for low frequency and HF stuff below 24 MHz).

Well, what are some key requirements of a good spectrum analyzer?

(1) No spurs, at least no unpredictable ones. Well, there are spurs, but mainly multiples of 28.8 MHz (the reference), and some spurs related to the sampling frequencies (always check if a spur changes, by trying several sample rates that are not multiples of each other).

(2) Intermodulation distortion. More complicated, will be analyzed later.

(3) Low input return loss (otherwise, amplitudes will be inaccurate). Will be measured later, the VNA rests back home in Germany. But this limitation can be easily overcome by putting a 6 dB attenuator in front of the SDR USB.

(4) Frequency accuracy – this is not great, but stable within a few ppm. If you want to add a precision reference, see earlier post.

(5) Amplitude accuracy – it needs to be very linear (i.e., a 1 dB step in signal strength must convert to a 1 dB step on the readout, same for 10 dB steps, etc.), and this should not vary too much with frequency. Absolute amplitude accuracy (i.e., if 1 dBm is fed into the RF input, it needs to read 1 dBm power) – not applicable to the SDR USB stick; it only shows power in nominal, un-calibrated dB.

Well, let’s tackle item 5, and work out some absolute calibration.

The R820T tuner of the SDR USB stick under consideration here has a built-in preamp, with nominal gains from 0 dB to 49.6 dB. Some gain curves have been reported for other SDR USB sticks elsewhere; let’s do some in-depth analysis.

How to get this measured properly? The setup and method:
With 1.024 MSPS, 65536 FFT bins, an RF frequency of 1000 MHz (HPAK 8642B), the reference at 28.80 MHz – 500 mV (provided by an HPAK 8662A), and gains set to 0 dB, 20.7 dB (about mid-range), and 49.6 dB (max gain), the input RF power (which is calibrated in absolute dBm, and fed to the SDR USB stick by a low-loss cable) is varied in 10 dB steps, and the dB reading taken from the SDRSharp FFT spectrum display. Note that a fully accurate reading of the dBs is only possible if the frequencies (reference and signal) are dead stable; otherwise, everything will be drifting up and down, and the FFT bins won’t be in the same place all the time.

Here are the results:
r820t db output vs rf power input at various gains
Everything is quite linear (a good fit with just a line), but you notice that the slope of the lines changes a bit, depending on the gain setting. In other words, a 1 dBm change will not always result in an exactly 1 dB change on the SDRSharp display; at a high gain setting, it almost fits, at 0 dB, there is only about a 0.93 dB change (readout) for every 1 dBm power change at the input. Well, over 40 dB, that’s an error of about 3 dB – not much, but more than desirable.
r820t level accuracy
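The slope determination is just a least-squares line through readout vs. input power; a sketch with placeholder data (the 0.93 slope mimics the 0 dB gain case):

```python
# Fit readout (dB) vs. input power (dBm) with a line; the slope would be
# 1.000 for an ideally linear receiver. Data below are placeholders.
import numpy as np

p_in = np.array([-70.0, -60.0, -50.0, -40.0, -30.0])     # dBm, 10 dB steps
readout = np.array([-62.1, -52.8, -43.5, -34.2, -24.9])  # SDRSharp dB

slope, intercept = np.polyfit(p_in, readout, 1)
corrected = (readout - intercept) / slope                # slope-corrected, dBm
residuals = readout - (slope * p_in + intercept)
print(f"slope = {slope:.3f} dB per dBm")
print("residuals:", np.round(residuals, 2))
```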

After some more measurements at 38.6 dB nominal gain, the relationship of level slope vs. gain seems pretty clear, at least at 1000 MHz.
r820t linearity (slope) vs nominal gain

After applying the slope correction (comparing a linear fit with the actual measured data), these are the residuals:
r820t linearity deviation (wide range)
Less than 1 dB – that’s within the measurement error of the calibration apparatus!

Next interesting item for practical use: the RF input power needed to get a 0 dB reading – the absolute power calibration for this SDR USB stick. This seems to vary from stick to stick by only 2-3 dB, but I don’t have a big set of sticks, multiple lots, etc. – so this might be shifted depending on the exact device you are using, but the trends should be the same for all R820T sticks.
r820t 0db equivalent rf input power vs gain
According to this diagram, for any measurements above -40 dBm, you need a good set of attenuators to bring the signal level down. In fact, the SDR USB might actually make a very decent substitution-type attenuation test receiver, if you put it in line with a precision attenuator, and only use a few dBs of span of the (well-calibrated) SDR USB to determine the signal levels. I checked quickly for drift of the level calibration vs. R820T temperature – there doesn’t seem to be any strong effect, which is a good sign that there is no need to re-calibrate the levels all the time.

R820T, RTL2832U SDR USB stick – sensitivity, dynamic range

After looking around on the web, there doesn’t seem to be a whole lot of information out there on the sensitivity and dynamic range of these SDR USB devices, at least not for the type I’m using here. Even the R820T datasheet isn’t all that clear – there are various versions of the R820T, using different clock frequencies, with 28.8 MHz being the most popular lately.

Therefore, time for some measurements.

The setup:

(1) HPAK (formerly HP, then Agilent, now Keysight) 8662A Signal Generator as the reference source, 28.800 MHz, 500 mV level.

(2) HPAK 8642B Signal Generator as the test signal source. It has a calibrated output from -140 dBm to +20 dBm, is very clean and free of spurs, and provides up to 2.1 GHz.
Absolute amplitude accuracy is about 1 dB; linearity is considerably better. As it says on the instrument cover – 70 pounds, “two person lift”.
The 8642B is phase-locked to the 8662A clock, via a common 10 MHz reference signal. So even with drift, there can’t be any frequency errors getting in the way of our precision testing.

(3) Some well-shielded test cables, RG223/U, and adapters to link to the MCX connector (use a good test cable, but not your best – most SMA-to-MCX adapters aren’t all that precise, and may damage precision SMA connectors).

(4) The modified SDR USB stick, see earlier post.
r820t rtl2832u sdr usb dut

(5) Laptop PC, running SDRSharp. 1.024 MSPS, all automatic gain and frequency adjustments disabled, I/Q correction enabled.

r820t rtl2832u sdr usb test setup

First, the sensitivity check. Tuned the SDR USB to various frequencies, and measured the input power (dBm) needed to get a -40 dB reading at max gain of the SDR USB (49.6 dB nominal); this is about 15 dB above the noise floor, and still a signal level that is very stable and can be accurately measured. Afterwards, set the gain to 0 dB, and increased the RF input power until a 0 dB reading was obtained – this is the maximum power that can reasonably be fed to the SDR USB (no damage will occur up to +10 dBm; and even +20 dBm doesn’t seem to do much, at least not if only applied for a short time).

Power levels for -40 dB reading at max gain, and 0 dB reading at 0 dB gain:
r820t input sensitivity and max power
Sensitivity is quite constant over a pretty large range – up to 1500 MHz, no problem. The lowest frequency the thing can handle is about 24 MHz (it doesn’t tune any lower). Note that there are some spurious signals present around 28.8 MHz: (internal) reference clock leakage, and its 2nd harmonic.
R820T usb sdr dynamic range and sensitivity

The RF input power needed to get a -40 dB amplitude at max gain of the SDR USB is about -130 dBm – quite remarkable, and still about 15 dB above the noise floor. So the R820T exhibits very high sensitivity, no doubt.
Here is an estimation of the dynamic range – “useful” because it still leaves some margin for noise. For the full dynamic range, add about 15 dB.
r820t sdr usb dynamic range
About 93 dB (108 dB full range, from the noise floor at 49.6 dB gain, to the 0 dB reading at 0 dB gain).

R820T, RTL2832U: SDR USB stick hack – clean and stable reference!

One of the shortcomings of these handy and cheap SDR USB sticks is the offset of the reference, and its drift. First, let’s see what we have: in the meantime, I have two more of these little units to play around with, and all have about 70-100 ppm offset, positive.
The reference is derived from a small 28.8 MHz crystal, and while such crystals are pretty much suitable for clock generation, they do drift with temperature, and the SDR USB stick gets hot during use… Quick look inside (the case can be easily opened, no damage):

r820t sdr usb stick opened
The small silvery metal can is the crystal. Any temperature change will cause this to slightly change frequency.

So, what about the crystal drift? Easy enough: just hooked up one of the sticks to a counter (with a stable timebase; you need to use a high impedance probe, otherwise the probe will pull the crystal frequency), and logged the frequency values for an hour, after a “cold” startup (stick with no case).
The result:
r820t ref clock drift
1 to 2 ppm – that’s not all that bad! Still, in the world of precision oscillators, it’s drifting ridiculously. So for any precise characterization of the SDR USB device, we need to get this under control. Some have tried to replace the crystal with a TCXO, but even with this, it will be challenging to break the 1 ppm (1 kHz at 1 GHz!) mark.

A quick look at the datasheets reveals that pins 8 and 9 of the R820T are the input/output of the xtal drive circuit, and pin 10 forwards the clock signal to the RTL2832U, to save some parts, and cost.

That’s how the oscillator output looks, probed at pin 9 with a 10 Meg probe (0.5 V/div, 10 ns/div):
r820t ref clock signal 0.5 v-divy 10 ns-divx
The signal amplitude is about 1.5 Vpp, with a DC bias of 1.2 V.

Therefore, if we want to substitute the crystal, we have to feed a few dBm of power at 28.8 MHz into pin 8, and leave pin 9 unconnected. The feed line (50 Ohm) requires some adequate termination, and we also need a DC block – a little coupling capacitor.
That’s the little hack:
r820t sdr usb ref clock hack
An SMC connector, terminated with 82 Ohm (which will be in parallel with the impedance of the R820T, hopefully giving about 50 Ohm, or close enough), connected via an 0805 10 nF capacitor to pin 8 of the R820T.
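A quick check of the termination and coupling values – note that the R820T pin impedance below is my guess, purely an assumption:

```python
# 82 Ohm termination in parallel with the (assumed) R820T pin-8 impedance,
# plus the reactance of the 10 nF coupling cap at 28.8 MHz.
import math

R_TERM = 82.0   # Ohm, soldered at the connector
R_PIN = 130.0   # Ohm, assumed R820T input impedance (a guess)
C = 10e-9       # coupling capacitor, F
F = 28.8e6      # reference frequency, Hz

r_par = R_TERM * R_PIN / (R_TERM + R_PIN)  # ~50 Ohm
x_c = 1.0 / (2 * math.pi * F * C)          # ~0.55 Ohm, negligible
print(f"parallel termination ~ {r_par:.0f} Ohm")
print(f"coupling cap reactance ~ {x_c:.2f} Ohm")
```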

As for the new reference source – nothing less than an HPAK 8662A, which is a real marvel of engineering and one of the best sources I can suggest for any test that requires low phase noise close to the carrier. It is stable to better than 0.0005 ppm per day – compare this to the 1 ppm per hour…
Sure, not to mention – the 8662A carries 80 pounds of electronics and a big fan, and uses about 300 Watts of power to keep things clean.

The new clock source for the SDR USB stick:
r820t new clock source 8662a

The reference level needed to drive the R820T oscillator – found just by trial and error: things start to work at about 400 mV, and up to 1 V of signal doesn’t seem to change anything. So I set the level to 500 mV into 50 Ohm, and this seems to work well.

Interestingly, the R820T seems to work from about 28.75 to 28.85 MHz reference, with no change in performance (at least nothing obvious), except, of course, for the frequency shift. At frequencies below about 28.72 MHz, and above 28.89 MHz, the stick crashes – no more data coming.

Some tests: 28.800 MHz reference, 1 GHz signal (note the small frequency shift, which is related to SDRSharp software, not to any hardware offsets)
28800

28.75 MHz reference – still 1 GHz signal
28750

28.85 MHz reference – still 1 GHz signal
28850

Checking the math, this all makes sense – for the 28.75 MHz reference, a reading of 1001.739 MHz would be expected, and 998.267 MHz for the 28.85 MHz reference.
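A minimal sketch of that check – the displayed frequency scales with the ratio of the nominal to the actual reference:

```python
# The software still assumes a 28.8 MHz reference, so all displayed
# frequencies scale by f_nominal / f_actual.
F_NOM = 28.80e6   # reference the software assumes, Hz
F_SIG = 1000.0e6  # actual input signal, Hz

for f_ref in (28.75e6, 28.80e6, 28.85e6):
    displayed = F_SIG * F_NOM / f_ref
    print(f"ref {f_ref / 1e6:.2f} MHz -> displayed {displayed / 1e6:.3f} MHz")
```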

PLL measurements continued… ADF41020 locking the Micro-Tel 1295

With the work on the Micro-Tel SG-811 generator PLL mostly completed, some trials with the Micro-Tel 1295 receiver – this instrument has similar YIGs fitted; the LO just needs to be tuned 30 MHz above the actual frequency, because the 1295 runs on a 30 MHz IF (all diagrams show tuned frequencies, not LO frequencies).

After some crude analysis of the schematics, the 1295 seems to be able to handle a bit more PLL bandwidth – so the target was set more in the 500 Hz to 1 kHz region, and some calculations were carried out with the ADIsimPLL program to determine rough capacitor and resistor values – otherwise, the loop filter is the same as for the SG-811 PLL, also using an OPA284 opamp.

Otherwise, pretty much comparable results (see the earlier post on the SG-811), for example (17.8141 GHz tuned/17.8171 GHz LO frequency, Icp setting 6):

Gain (disregard 1 to 10 Hz)
micro-tel 1295 17814100 kHz cpc6 gain

Phase
micro-tel 1295 17814100 kHz cpc6 phase

After quite a few of these measurements (they don’t actually take too long), the results.
adf41020 pll bw phase margin 1295

Phase margin vs. bandwidth
pm vs bw adf41020 micro-tel 1295

Bandwidth vs. charge pump current Icp setting, at various frequencies
bw vs icp at various frq adf41020 micr-tel 1295 pll

Again, a bandwidth*frequency^0.7 product could be used to get the numbers down to two parameters – slope and intercept of the bandwidth*frequency^0.7 vs. Icp setting curve.
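As a sketch of how such a relation can be used in software (slope and intercept below are hypothetical placeholders, not the fitted values):

```python
# Pick the charge pump current (Icp) setting for a target bandwidth, using
# the empirical BW * f^0.7 = slope * Icp + intercept relation. SLOPE and
# INTERCEPT are hypothetical; use the values fitted to your own data.
SLOPE = 1.2e9      # (Hz * Hz^0.7) per Icp step - placeholder
INTERCEPT = 2.0e8  # Hz * Hz^0.7                - placeholder

def icp_setting(bw_target_hz, f_ghz, n_steps=8):
    """Nearest Icp step (ADF41020: 0..7) for a target loop bandwidth."""
    product = bw_target_hz * (f_ghz * 1e9) ** 0.7
    step = round((product - INTERCEPT) / SLOPE)
    return max(0, min(n_steps - 1, step))

for f in (2, 6, 10, 14, 18):
    print(f"{f:2d} GHz -> Icp setting {icp_setting(600, f)}")
```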
Finally, suitable Icp settings for a 600 Hz target BW:
bw vs frq adf41020 micro-tel 1295 with Icp adjustment

The result seems quite satisfactory: a pretty much constant 600 Hz BW can be achieved over the full 2 to 18 GHz range, at about 47 degrees phase margin. This should allow for stable operation. No locking issues were observed at any of the frequencies, even with full Icp current.