LIScanner101
This has probably been beaten to death, and I'm not even sure if I should be posting this here or in the coax forum.
But I see so many conflicting views on using the "right" impedance cable for a scanner - "right" in this case being 50 ohms to match the typical antennas being used (such as a rooftop vehicular whip, a base discone, etc.).
But what if you have a dipole like the Scantenna or the old Monitenna, which always comes with a 300-ohm-to-75-ohm balun?
Do you run 75-ohm coax down to your scanner (which is 50 ohms - mismatch!!)?
Or do you use 50-ohm coax that matches the scanner's impedance (which means you have to terminate the cable with a special F connector made for 50-ohm coax, but then connect it to the 75-ohm balun at the antenna end - mismatch!!)?
To me it seems like you have to "compromise" at either the antenna end or the scanner end.
Does it really make any freakin' difference for scanning?
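For reference, here's a quick back-of-the-envelope sketch in Python (assuming purely resistive 75-ohm and 50-ohm impedances and ignoring balun and cable losses - the numbers are only illustrative, not a measurement of any real setup):

```python
import math

# Rough estimate of the loss from feeding a 50-ohm scanner input with 75-ohm coax.
# Assumes ideal, purely resistive impedances; real antennas, baluns, and scanner
# front ends vary quite a bit with frequency.
z_line = 75.0     # ohms, the coax / balun side
z_scanner = 50.0  # ohms, nominal scanner input impedance

gamma = abs(z_scanner - z_line) / (z_scanner + z_line)  # reflection coefficient
swr = (1 + gamma) / (1 - gamma)                          # standing wave ratio
mismatch_loss_db = -10 * math.log10(1 - gamma ** 2)      # power lost to the mismatch

print(f"Reflection coefficient: {gamma:.2f}")        # about 0.20
print(f"SWR: {swr:.2f}")                             # about 1.5
print(f"Mismatch loss: {mismatch_loss_db:.2f} dB")   # about 0.18 dB
```

If those assumptions are anywhere close, the 75-to-50-ohm step by itself costs only a fraction of a dB, which seems small compared to ordinary cable attenuation at scanner frequencies.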
Any thoughts on this?