Multiple Attic Antennas

Status: Not open for further replies.

rbritton1201

Captain1201
Joined
Jul 27, 2020
Messages
407
I have three discone antennas in the attic (receive only), all in a row, spaced about 6 feet apart. I also have a Comet GP-1 VHF/UHF antenna connected to a transceiver (transmit and receive), about 15 feet from the discones and about 12 feet from a television antenna that's also in the attic.

I'm considering purchasing the DPD Productions Omni-X antenna to replace one of the discone antennas, which will be left in place but disconnected. The Omni-X will be installed as far away from all the other antennas as practicable and will be connected to a Uniden SDS200 scanner.

I've tried to position the antennas in the attic far enough apart to avoid RF coupling. But what is the recommended separation between antennas to avoid RF coupling in the attic? I've always heard that 36" is adequate, but there must be a more accurate formula based on frequency. What is it?
 

popnokick

Member
Premium Subscriber
Joined
Mar 21, 2004
Messages
2,911
Location
Northeast PA
The recommended distance is 1/4 wavelength at the lowest frequency planned for use. So at 50 MHz (6 meters) that's about 1.5 meters apart. What's the lowest frequency you want to use / hear with the planned antenna(s)? That will let you determine the greatest separation needed. Higher frequencies = less separation required.
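
If it helps to turn that rule of thumb into numbers, here's a minimal Python sketch (the constants and function name are just illustrative):

Code:
# Quarter-wavelength separation rule of thumb: spacing vs. frequency.
C_M_PER_S = 299_792_458      # speed of light, m/s
FT_PER_M = 3.28084

def quarter_wave_spacing_ft(freq_mhz: float) -> float:
    """Quarter-wavelength separation in feet for a frequency given in MHz."""
    wavelength_m = C_M_PER_S / (freq_mhz * 1e6)
    return wavelength_m / 4 * FT_PER_M

for f in (50, 144, 440):
    print(f"{f} MHz: 1/4 wave is about {quarter_wave_spacing_ft(f):.1f} ft")
# 50 MHz: 1/4 wave is about 4.9 ft
# 144 MHz: 1/4 wave is about 1.7 ft
# 440 MHz: 1/4 wave is about 0.6 ft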
 

AM909

Radio/computer geek
Premium Subscriber
Joined
Dec 10, 2015
Messages
1,474
Location
SoCal
Until prcguy gets here :), the absolute minimum is 1/4 wave to avoid coupling on the receive antennas, which you've achieved down to about 40 MHz. But I'd worry about having a transmit antenna only a couple of VHF-high wavelengths away from all those broadband receive antennas. Whatever you can do to move the transmit antenna away from the rest, horizontally and, even better, vertically, would be good. There's no hard rule: the further away, the less RF you're cramming down those unfiltered receive front ends. Here's a theoretical path loss calculator.
 

jaspence

Member
Premium Subscriber
Joined
Mar 21, 2008
Messages
3,041
Location
Michigan
The quality of the VHF/UHF radio can be important. One of my ham HTs knocks my computer for a loop if I transmit in the same room, even 5 or 6 feet away from the computer, with 5 watts. Some of the CCRs can definitely spoil your day, and your receiver, without good spacing.
 

Ubbe

Member
Joined
Sep 8, 2006
Messages
10,035
Location
Stockholm, Sweden
But what is the recommended separation between antennas to avoid RF coupling in the attic?
I would be more worried about the other antennas blocking radio signals on their way to the receive antenna. Also, antennas placed indoors are more susceptible to receiving reflected signals in addition to the direct one, which makes antenna position crucial. Just moving the antenna a foot could make a huge difference.

Best practice is to spend some time in the attic with the antenna connected to a scanner, listening while you move the antenna through all possible positions.

/Ubbe
 

prcguy

Member
Premium Subscriber
Joined
Jun 30, 2006
Messages
17,173
Location
So Cal - Richardson, TX - Tewksbury, MA
Thanks for the intro, but I feel like brushing snow off my boots and stomping my feet at the door to announce myself.

The answer is vague and variable, depending on what you can live with. One topic is how the radiation pattern will be affected: spacing of 1/4 wavelength, or a little less or more, can turn a pair of antennas into a two-element beam, depending on their size and design. At 1/2 wavelength spacing you can get a figure-8 pattern, at 1 wavelength a cloverleaf pattern, and so on.

6 ft is about 1/2 wavelength at VHF, so there should be some pattern disturbance at VHF, producing a figure-8 pattern. 6 ft is also about 1 wavelength at UHF, so there the pattern would be closer to a cloverleaf with 4 lobes and 4 nulls. The effect would be more pronounced with the antenna 6 ft from a long metal mast; replacing the mast with another antenna of the same kind is not exactly the same, but they will interact in a similar way to some extent.

Another topic is transmit power coupling from one antenna to another and possibly damaging a receiver. For low-gain antennas, I think 6 ft would land in the far field, where you can use a simple path loss calculator. That shows that for two 0 dB gain antennas 2 meters apart you would have only about 22 dB of path loss, or coupling, between the antennas; any actual antenna gain would increase the coupling. 22 dB is a lot of coupling: 50 watts into one antenna would leave about 300 mW at the other antenna, not including feedline loss on either side. 300 mW is not a lot of power, but it could be borderline dangerous for some receivers. I might double the distance from the transmit antenna to the other antennas, which reduces the coupling by about 6 dB and cuts the level by a factor of four; that should be safe for any receiver.
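
To make that arithmetic easy to reproduce, here's a minimal Python sketch of the same free-space path loss estimate (the function names are mine; the 150 MHz / 2 m / 50 W values simply mirror the example above, and antenna gain and feedline loss are ignored):

Code:
import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in meters, frequency in MHz)."""
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.44

def coupled_power_mw(tx_watts: float, distance_m: float, freq_mhz: float) -> float:
    """Approximate power arriving at the other antenna, in mW, for 0 dB gain antennas."""
    tx_dbm = 10 * math.log10(tx_watts * 1000)
    rx_dbm = tx_dbm - fspl_db(distance_m, freq_mhz)
    return 10 ** (rx_dbm / 10)

print(round(fspl_db(2, 150), 1))            # ~22.0 dB path loss at 2 m, 150 MHz
print(round(coupled_power_mw(50, 2, 150)))  # ~317 mW at the other antenna from 50 W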

 

rbritton1201

Captain1201
Joined
Jul 27, 2020
Messages
407
The lowest frequency would be 144 MHz. The wavelength at 144 MHz is 6.83 feet; divided by 4, that's about 1.7 feet. It sounds like the separation I have in the attic is more than adequate according to the formula, though I realize there are always exceptions due to the other variables. The only transmitting antenna up there in the attic, a 2 meter / 70 cm antenna, is probably at least 12-15 feet from the receiving antennas. Max output is usually around 20 watts to make it into the repeater from where my QTH is located. The scanner would be turned off whenever I'm transmitting on 2 meters or 70 cm, a precaution I've always followed with other radios in close proximity while transmitting.
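
For a rough sanity check, here's the same free-space path loss math applied to these numbers; the 144 MHz, 3.7 m (about 12 ft) and 20 W figures come from the post above, and the estimate ignores antenna gain and feedline loss:

Code:
import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    # Free-space path loss in dB (distance in meters, frequency in MHz)
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.44

tx_dbm = 10 * math.log10(20 * 1000)     # 20 W is about 43 dBm
rx_dbm = tx_dbm - fspl_db(3.7, 144)     # path loss about 27 dB at ~12 ft on 2 meters
print(round(10 ** (rx_dbm / 10)))       # roughly 40 mW at the receive antenna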

 