If my highly pointed S-band dish is on my roof, tracking the satellite in the sky, and my nearest neighbor is more than 150 feet from my antenna, how can my neighbor's telephone, with very limited range, knock out the signal from Eagle when I am using my SSB preamp?
The answer is relatively easy to compute. The simple communications equation for the power received (PR) at a receiver from a distant transmitter, in dB, is simply:
PR = PT + GT + GR - LI - LS
Where:
PR is the power received
PT is the power of the transmitter (let's say 40 dBm, i.e. 10 W)
GT is the gain of the TX antenna (let's say 8 dB)
GR is the gain of the RX antenna (about 25 dB for a 1 m dish)
LI is incidental losses (usually less than 3 dB)
LS is the space loss due to the distance between TX and RX
Computing space loss is basic physics and boils down to ((4 * Pi * R) / wavelength) squared, or, expressed in dB, LS = 20 * log10((4 * Pi * R) / wavelength).
For a P3E satellite at a range (R) of about 40,000 km, with the wavelength at 2.4 GHz being about 0.12 meters, that loss term works out to about 192 dB. Add it all up and you get a received signal power of about -122 dBm, which is just about exactly the minimum receive signal for a typical FM receiver.
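If you want to check the arithmetic yourself, here is a rough Python sketch of that link budget. The function names are my own, and the inputs are just the assumed values above:

    import math

    def space_loss_db(range_m, wavelength_m):
        # Free-space path loss in dB: 20 * log10(4 * pi * R / wavelength)
        return 20 * math.log10(4 * math.pi * range_m / wavelength_m)

    def received_power_dbm(pt_dbm, gt_db, gr_db, li_db, ls_db):
        # PR = PT + GT + GR - LI - LS
        return pt_dbm + gt_db + gr_db - li_db - ls_db

    # P3E: 40 dBm (10 W) TX, 8 dB TX antenna, 25 dB for a 1 m dish,
    # 3 dB incidental losses, 40,000 km range, ~0.12 m wavelength
    ls = space_loss_db(40_000e3, 0.12)
    pr = received_power_dbm(40, 8, 25, 3, ls)
    print(f"Satellite: LS = {ls:.0f} dB, PR = {pr:.0f} dBm")  # ~192 dB, ~-122 dBm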
OK, now compare this signal to one from the neighbor's 10 milliwatt 2.4 GHz phone about 150 feet away:
PT is now 10 dBm
GT is at worst 0 dB
GR is, say, -10 dB (35 dB worse than the main lobe)
LI is still about 3 dB
But now the space loss LS is over only 150 feet, not 40,000 km, which computes to a loss of only about 74 dB. Adding it all up gives a received power of about -77 dBm.
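The same sketch with the phone's numbers plugged in (again, the 150 ft range, -10 dB off-axis gain, and ~0.12 m wavelength are the assumptions above) lands right at that figure:

    import math

    # Neighbor's phone: 10 dBm TX, 0 dB TX antenna, -10 dB off-axis
    # dish gain, 3 dB incidental losses, 150 ft away at 2.4 GHz
    range_m = 150 * 0.3048                               # 150 feet in meters
    ls = 20 * math.log10(4 * math.pi * range_m / 0.12)   # space loss, ~74 dB
    pr = 10 + 0 + (-10) - 3 - ls                         # ~-77 dBm
    print(f"Phone: LS = {ls:.0f} dB, PR = {pr:.0f} dBm")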
Notice that this neighbor's wireless phone is a full 45 dB STRONGER coming in from the back of the dish than the satellite coming in from the front, even though the dish gives the satellite 35 dB more gain in its main lobe.
For reference, this 45 dB stronger phone signal is the equivalent of a 32 kilowatt neighbor competing with a 1 watt satellite. Even if you use a 3 m dish, that only improves the signal power from the satellite by about 10 dB, so now you have the equivalent of a 10 W satellite signal competing with a 32 kW neighbor. Still the neighbor wins...
Computing it backwards, the neighbor's off-axis signal won't drop below the satellite signal until the phone is more than 5 miles away. (Of course, that assumes line of sight with nothing in the way. Given that each tree is worth about 10 dB of attenuation at S-band, maybe a small one-acre forest would be enough to block his signal.)
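Both of those figures, the 32 kW equivalent and the 5 mile break-even range, fall straight out of the dB arithmetic: every 20 dB of extra path loss is a factor of 10 in range, and every 10 dB is a factor of 10 in power. A quick sketch:

    # The phone needs ~45 dB more path loss to drop to the satellite's
    # level; path loss grows as 20*log10(R), so the range must grow by
    # a factor of 10**(45/20), or about 178x.
    break_even_miles = 150 * 10 ** (45 / 20) / 5280   # ~5 miles
    # And a 45 dB power advantage over a 1 W signal is 10**(45/10) W:
    equivalent_kw = 10 ** (45 / 10) / 1000            # ~32 kW
    print(f"Break-even range: ~{break_even_miles:.1f} miles")
    print(f"Equivalent transmitter: ~{equivalent_kw:.0f} kW")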
Bottom line: looking at the numbers, it does not seem to make sense these days to intentionally design a satellite downlink, meant to be usable by most of the AMSAT membership, that is this susceptible to off-the-shelf consumer devices known to cause interference, which are beyond our control to remedy.
Just my 2 cents. Bob, WB4APR