Hi Juan,
What you're likely seeing when looking at your low-noise 10 MHz signal is the phase noise floor of each of the three devices you're using for measurement.
For instance, the 8566B has a typical phase noise floor at 100 to 320 Hz offsets of around -95 dBc/Hz, and it's specified at -80 dBc/Hz (yours might be lower). That floor is essentially the noise of the phase-locked YIG first LO in the 8566B, and you won't be able to resolve anything lower than that at those frequency offsets. The -85 dBc spur at 120 Hz could very well be the 120 Hz spur on the 8566B's LOs. I can't recall what the power-line spurious spec is for the 8566B, but it should be in your manual.
I can't speak for the SDRs, not having played with them, but I would guess that their phase noise will be much better than the 8566B's. Hence you would read a different value than you obtained on the 8566B. So what now - right?
Well, from the data you've shown earlier for the 10.7 MHz output of the URx, its phase noise is greater than the 8566B's phase noise floor (not to be confused with its "sensitivity floor"). Hence the reading you get on the 8566B is the combined phase noise of the LOs in the URx and of your 435 MHz signal generator (let's assume the generator is at least 10 dB lower than the URx LOs, so you can ignore it). Now if you take that data on the 8566B and then run the URx 10.7 MHz output into the SDRs, you should get the same phase noise readings (or very close to them).
In other words - if the expected phase noise of the device under test is greater (by at least 10 dB) than the phase noise floor of the measuring device, the measurement is going to be accurate enough for our application.
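To put a rough number on that 10 dB rule: the analyzer displays the power sum of the DUT's noise and its own floor, so the error falls off quickly once the DUT is well above the floor. Here's a quick Python sketch (the -95 dBc/Hz floor is just an assumed illustration):

import math

def measured_level(dut_dbc, floor_dbc):
    # The instrument reads the power sum of the DUT's noise and its own floor.
    return 10 * math.log10(10 ** (dut_dbc / 10) + 10 ** (floor_dbc / 10))

floor = -95.0  # assumed analyzer phase noise floor, dBc/Hz
for dut in (-85.0, -90.0, -95.0):
    m = measured_level(dut, floor)
    print(f"DUT {dut:6.1f} dBc/Hz -> reads {m:6.1f} dBc/Hz (error {m - dut:+.2f} dB)")

At 10 dB above the floor the reading is only about 0.4 dB high; right at the floor it reads 3 dB high.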
Clear as mud - huh? Phase noise measurements are not straightforward in most cases unless you have a dedicated "phase noise analyzer" a la HP or RDL.
Regards...Bill - N6GHz
Juan Rivera wrote:
Hello All,
I'm trying to arrive at a reliable method of measuring phase noise for the testing I'm doing on the 70 cm Eagle Receiver.
I recently purchased an SDR-IQ software defined radio. It makes a great spectrum analyzer below 30 MHz, with a very low noise floor and great resolution. I took it to work to show to my coworkers and the lab manager liked it so much he purchased the big brother of this unit, the SDR-14, which I borrowed and brought back to my shop. I also sent my Agilent 8566B spectrum analyzer out for calibration so now I have three pieces of test equipment that I can use to measure the phase noise of the 70 cm Receiver. My problem is that I can’t get any two of them to agree...
I’ve fed the same 10 MHz, 0 dBm signal into each unit, recorded all of their settings, and created the attached PDF file. As you will see, they all differ. If we look at the 120 Hz power supply sidebands, you get the following:
SDR-IQ: -102 dBm
SDR-14: -93 dBm
8566B: -84 dBm
The 8566B automatically calculates RMS noise levels normalized to a 1 Hz noise power bandwidth by correcting for the analyzer’s log amplifier and detector response and compensating for the resolution bandwidth setting. It’s capable of accurately measuring noise levels down to 10 dB above the spectrum analyzer’s noise level (-131.6 dBm) and reading out in steps of ±0.1 dB. At a 150 Hz offset it measures -104.0 dBc/Hz. Looking at the two SDR units it’s hard to reconcile this measurement with either of them, or to reconcile any of the power supply sidebands.
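For anyone following along, the arithmetic behind that normalization looks roughly like this. The 2.5 dB log-amp/detector correction and the 1.2 x RBW equivalent noise bandwidth below are typical figures for HP analyzers of that era, not numbers pulled from the 8566B manual, so treat them as assumptions:

import math

def dbc_per_hz(marker_dbm, carrier_dbm, rbw_hz,
               nbw_factor=1.2, detector_corr_db=2.5):
    noise_bw = nbw_factor * rbw_hz                # equivalent noise bandwidth, Hz
    return (marker_dbm - carrier_dbm              # referenced to the carrier
            + detector_corr_db                    # log-amp/envelope-detector under-response
            - 10 * math.log10(noise_bw))          # normalize to a 1 Hz bandwidth

# Hypothetical example: -95.7 dBm marker, 0 dBm carrier, 10 Hz RBW
print(f"{dbc_per_hz(-95.7, 0.0, 10.0):.1f} dBc/Hz")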
Both SDR units measure power levels in a very linear way. I’ve confirmed their accuracy at the levels I’m interested in around -90 dBm.
Can anyone shed any light on this? No combination of FFT block size, span, RF or IF gain, or filter bandwidth can make any two of these devices agree on the noise floor or the amplitude of the power supply sidebands.
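For what it's worth, here's my rough understanding of why the FFT settings move the noise floor but not the spurs - maybe someone can confirm. Each FFT bin integrates noise over an equivalent noise bandwidth (ENBW) set by the sample rate, FFT size, and window, so raw dBm noise readings from different devices or settings only line up after normalizing to 1 Hz, while a discrete spur like the 120 Hz sideband shouldn't move. The sample rate and the Hann-window figure of 1.5 bins below are assumptions for illustration:

import math

def bin_enbw_hz(sample_rate_hz, fft_size, window_enbw_bins=1.5):
    # 1.5 bins is the ENBW of a Hann window; other windows differ.
    return window_enbw_bins * sample_rate_hz / fft_size

def to_dbm_per_hz(bin_noise_dbm, enbw_hz):
    # Normalize a per-bin noise reading to a 1 Hz bandwidth.
    return bin_noise_dbm - 10 * math.log10(enbw_hz)

fs = 196078.0  # assumed SDR output sample rate, Hz
for n in (4096, 16384, 65536):
    enbw = bin_enbw_hz(fs, n)
    print(f"N={n:6d}: ENBW = {enbw:6.2f} Hz; "
          f"a -100 dBm bin reads {to_dbm_per_hz(-100.0, enbw):.1f} dBm/Hz")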
73,
Juan – WA6HTP
/A man with a watch will always know what time it is - a man with two watches can never be sure./