Croaker90 said:
How do you find out how far an antenna's range will go for receiving frequencies?
I'm trying to find an antenna that will reach about 150 miles, but I don't know what to look for to see if it will go that far.
You don't mention which band you're trying to monitor, but in any case, the antenna isn't the only determining factor in your reception distance. Frequency band, antenna height, transmitter power and terrain all play a part.
1. Frequency band: basically, the higher the frequency, the shorter the distance. VHF and UHF are line-of-sight bands, which means that if the transmitter is below the horizon as seen from your antenna, you're probably not going to hear it.
2. Antenna height: The higher the antenna, the farther away you'll hear stuff.
3. Transmitter power: the less power the transmitter is putting out, the less chance you're going to hear anything.
4. Terrain: You're not going to hear anything from a transmitter on the other side of that 1500-foot mountain. Buildings have an effect too.
All of these things interact with one another to determine reception distance.
If you're into math, try this formula:
DA1 = 1.415 x square root of H1
DA2 = 1.415 x square root of H2
where:
DA1 = distance (in miles) to the radio horizon from your antenna
DA2 = distance (in miles) to the radio horizon from the transmitter antenna
H1 = height of your antenna in feet
H2 = height of the transmitter antenna in feet
Now add DA1 and DA2 to get D1, the theoretical maximum line-of-sight distance between your antenna and the transmitter antenna, and therefore your approximate reception distance.
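If you'd rather let a computer do the arithmetic, here's a short Python sketch of that formula. The 30 ft and 500 ft antenna heights are just example numbers I picked, not anything from a real system:

```python
import math

def radio_horizon_miles(height_ft: float) -> float:
    """Distance in miles to the radio horizon for an antenna height in feet,
    using the rule-of-thumb formula D = 1.415 * sqrt(H)."""
    return 1.415 * math.sqrt(height_ft)

# Example: receiving antenna 30 ft up, transmitter antenna 500 ft up
da1 = radio_horizon_miles(30)   # your horizon, ~7.8 miles
da2 = radio_horizon_miles(500)  # transmitter's horizon, ~31.6 miles
d1 = da1 + da2                  # theoretical max line-of-sight distance
print(f"DA1 = {da1:.1f} mi, DA2 = {da2:.1f} mi, D1 = {d1:.1f} mi")
```

Even with a 500-foot transmitter tower, that example works out to a theoretical maximum of only about 39 miles.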
Remember that D1 is purely an approximation. Factors 1, 3 and 4 above will all reduce your true reception distance. A general rule of thumb is to assume an effective maximum of 20 to 25 miles for scanner reception with an antenna 30 feet up. Your mileage may vary, but 150 miles is pretty much not going to happen unless your antenna is really, really high in relation to the transmitter.
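To see just how high "really high" is, you can turn the formula around and solve for the receiving-antenna height needed to hit a target distance. Again, the 500 ft tower height is just a made-up example:

```python
import math

def required_height_ft(target_miles: float, tx_height_ft: float) -> float:
    """Receiving-antenna height (feet) needed so that DA1 + DA2 = target_miles,
    per the rule-of-thumb radio-horizon formula D = 1.415 * sqrt(H)."""
    da2 = 1.415 * math.sqrt(tx_height_ft)  # transmitter's horizon distance
    da1 = target_miles - da2               # horizon distance you must supply
    if da1 <= 0:
        return 0.0  # transmitter's own horizon already covers the distance
    return (da1 / 1.415) ** 2

# Hypothetical 500 ft transmitter tower, 150-mile target:
print(f"{required_height_ft(150, 500):.0f} ft")
```

That comes out to roughly 7,000 feet: you'd need your antenna on a mountaintop, which is exactly why 150 miles isn't realistic for a typical home setup.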