Hello All,
I'm trying to arrive at a reliable method of measuring phase noise for
the testing I'm doing on the 70 cm Eagle Receiver.
I recently purchased an SDR-IQ software defined radio. It makes a great
spectrum analyzer below 30 MHz, with a very low noise floor and great
resolution. I took it to work to show my coworkers, and the lab manager
liked it so much that he purchased its big brother, the SDR-14, which I
borrowed and brought back to my shop. I also sent my Agilent 8566B spectrum
analyzer out for calibration so now I have three pieces of test equipment that
I can use to measure the phase noise of the 70 cm Receiver. My problem is that
I can’t get any two of them to agree...
I’ve fed the same 10 MHz 0 dBm signal into each unit, recorded
all of their settings and created the attached PDF file. As you will see, they
all differ. If you look at the 120 Hz power supply sidebands, the readings
are as follows:
SDR-IQ: -102 dBm
SDR-14: -93 dBm
8566B: -84 dBm
The 8566B automatically calculates RMS noise levels normalized to a 1
Hz noise power bandwidth by correcting for the analyzer’s log amplifier
and detector response and compensating for the resolution bandwidth setting.
It’s capable of accurately measuring noise levels down to 10 dB above the
spectrum analyzer’s own noise floor (-131.6 dBm), and it reads out in 0.1 dB
steps. At a 150 Hz offset it measures -104.0 dBc/Hz. It’s hard to reconcile
this figure with either of the SDR units, or to reconcile any of the power
supply sideband levels.
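In case it helps anyone check my arithmetic, this is roughly the correction I
understand the analyzer to be applying. The RBW, the 1.2x equivalent-noise-
bandwidth factor, and the +2.5 dB log-amp/detector figure below are typical
textbook values I'm assuming for the sketch, not the 8566B's exact internal
numbers (those are in its manual):

    import math

    def noise_dbc_per_hz(marker_dbm, carrier_dbm, rbw_hz,
                         enbw_factor=1.2,         # noise BW ~1.2 x RBW for analog filters (assumed)
                         log_det_correction=2.5): # typical log/envelope-detector under-response, dB (assumed)
        """Convert a raw marker reading on broadband noise to dBc/Hz."""
        enbw = enbw_factor * rbw_hz
        return (marker_dbm
                + log_det_correction        # noise reads low on a log/envelope display
                - 10 * math.log10(enbw)     # normalize to a 1 Hz noise power bandwidth
                - carrier_dbm)              # reference to the carrier -> dBc/Hz

    # Example with made-up numbers: -72 dBm marker on noise, 0 dBm carrier, 100 Hz RBW
    print(noise_dbc_per_hz(-72.0, 0.0, 100.0))   # about -90.3 dBc/Hz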
Both SDR units measure power levels in a very linear way. I’ve
confirmed their accuracy at the levels I’m interested in, around -90 dBm.
Can anyone shed any light on this? No combination of FFT block size,
span, RF or IF gain, or filter bandwidth can make any two of these devices
agree on the noise floor or the amplitude of the power supply sidebands.
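For completeness, here is the bin-bandwidth normalization I'm assuming when I
try to line the FFT noise floors up against the analyzer's noise marker. The
sample rate, FFT size, and window ENBW below are placeholders for
illustration, not the actual SDR-IQ/SDR-14 settings:

    import math

    def fft_bin_noise_dbc_per_hz(bin_dbm, carrier_dbm, sample_rate_hz, fft_size,
                                 window_enbw_bins=1.5):  # ~1.5 bins for a Hann window (textbook value)
        """Convert an FFT-bin noise-floor reading to dBc/Hz."""
        # Bin spacing = sample rate / FFT size; the window widens each bin's
        # effective noise bandwidth by the ENBW factor.
        enbw_hz = window_enbw_bins * (sample_rate_hz / fft_size)
        # Normalize to 1 Hz and reference to the carrier.
        return bin_dbm - 10 * math.log10(enbw_hz) - carrier_dbm

    # Example with placeholder numbers: -130 dBm noise-floor bin, 0 dBm carrier,
    # 100 kS/s span, 32768-point FFT
    print(fft_bin_noise_dbc_per_hz(-130.0, 0.0, 100000.0, 32768))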
73,
Juan – WA6HTP
A man with a watch will always know
what time it is - a man with two watches can never be sure.