doublescan
Member
I read a lot on the net about antennas, their design and construction, and sometimes I see references to the "optimum height" for a J-pole, dipole, etc. I read last night that the optimum height for a half-wave dipole is one-half wavelength above the ground. Since I'm talking about scanner antennas here, as that is my hobby, is that really something to think about when erecting one of my new metal monstrosities?

On this forum, and most others I've visited, the longstanding opinion is "height is king; the more the better." Isn't it always better to try to get your antennas above the closest obstructions, such as the trees? It seems like I get better range on all my home-builts at the tallest placement, or is reception really just totally weather-dependent?
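For what that half-wavelength rule of thumb works out to in practice, here's a minimal sketch that computes it from frequency. The 155 MHz example frequency is just an assumption to stand in for a typical VHF-high scanner channel; plug in whatever band you actually monitor.

```python
# Half-wavelength height rule of thumb: height = (c / f) / 2
# c = speed of light in m/s, f = frequency in Hz.

C = 299_792_458  # speed of light, m/s

def half_wavelength_m(freq_hz: float) -> float:
    """Return one-half wavelength in meters for the given frequency."""
    return (C / freq_hz) / 2

# Example: 155 MHz, an assumed VHF-high scanner frequency.
h = half_wavelength_m(155e6)
print(f"Half wavelength at 155 MHz: {h:.2f} m ({h * 3.281:.2f} ft)")
```

Note how quickly this shrinks as frequency goes up: at VHF-high it's under a meter, so almost any mast already puts a scanner antenna many wavelengths above ground, which is part of why "height is king" tends to win out for reception.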