One thing you might want to keep in mind, fyrfyterskickash, is that the only difference between transmitting and receiving, besides the power level, is where the signal comes from. Power transfer matters in both directions, and you get the most efficient transfer when the impedances all match. That said, 75 ohm cable does have less loss than 50 ohm cable of the equivalent size. I would go with RG-8 cable for a 50-75 foot run at 800 MHz. If you feel that you have to use 75 ohm, then the smaller RG-6 will work fine. Try to stay away from connector adapters; they will mess up the return loss and increase the overall losses of your antenna system. If you have to buy cable for your antenna, then let your connector requirements dictate the cable you buy. If money is no object, then go with 50 ohm 3/4" hard-line.
The problem with this discussion is that there is a fair amount to know, and a much greater amount to understand.
Yes, a better match transfers more power, but 75 ohm cable on a 50 ohm system is only a 1.5:1 SWR. Hardly enough to worry about.
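For anyone who wants to check that 1.5:1 figure, the standard textbook relations are easy to plug in (a quick Python sketch; nothing here is vendor-specific):

```python
import math

# Standard transmission-line relations for a mismatched load.
# The 75-ohm-cable-on-50-ohm-system case is the one discussed above.

def reflection_coefficient(z_load, z0):
    """Magnitude of the voltage reflection coefficient at the load."""
    return abs((z_load - z0) / (z_load + z0))

def swr(gamma):
    """Standing wave ratio from |reflection coefficient|."""
    return (1 + gamma) / (1 - gamma)

def mismatch_loss_db(gamma):
    """Power lost to the reflection itself, in dB."""
    return -10 * math.log10(1 - gamma ** 2)

g = reflection_coefficient(75, 50)    # (75-50)/(75+50) = 0.2
print(round(swr(g), 2))               # 1.5
print(round(mismatch_loss_db(g), 2))  # 0.18 dB
```

So the raw mismatch only eats about 0.18 dB, which backs up the "hardly enough to worry about" point.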
Of course that ASSumes that the radio and the antenna are actually 50 ohms.
That is a good assumption for well designed transmitters and a fair assumption for relatively narrow-band antennas.
Now, we are talking about relatively inexpensive receivers that may or may not exhibit something 'close' to 50 ohms (I would say that 25 to 100 ohms qualifies as "close" for a receiver).
Next, we are talking about a very wide bandwidth. I would expect the receiver to not be too bad, but the antenna is likely to be all over the place.
So, what is the thought behind trying to match the cable, when the antenna is darn near a wild-ass-guess?
So let's do a little exercise:
850 MHz 50 feet RG-8 perfect match = 3.97 dB loss
850 MHz 50 feet RG-8 75 ohm mismatch = 4.81 dB loss (0.84 dB from mismatch)
850 MHz 50 feet RG-6 75 ohm match = 3.51 dB loss
850 MHz 50 feet RG-6 50 ohm mismatch = 4.30 dB loss (0.81 dB from mismatch)
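The extra loss a mismatched load adds on top of the matched loss can be estimated with the commonly published formula below (a sketch only; the exact dB figures above depend on whose cable attenuation data and calculator you use, so don't expect this to reproduce them exactly):

```python
import math

def total_loss_db(matched_loss_db, gamma):
    """Total line loss including the additional loss caused by standing
    waves.  Commonly published formula; gamma is the magnitude of the
    reflection coefficient at the load."""
    a = 10 ** (matched_loss_db / 10)  # matched loss as a power ratio
    return 10 * math.log10((a ** 2 - gamma ** 2) / (a * (1 - gamma ** 2)))

# 50 ft of cable with ~4 dB matched loss, 75-ohm load on 50-ohm line
# (gamma = 0.2, i.e. a 1.5:1 SWR at the load)
print(round(total_loss_db(3.97, 0.2), 2))
```

With no mismatch (gamma = 0) the formula collapses back to the matched loss, as it should.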
OK, now, why do we worry about transmitters more than receivers?
Part of the signal entering the receiver will be reflected when it encounters an impedance mismatch. This is usually a relatively small amount, but let's pessimistically assume 1/2 of the power is reflected. That reflected signal heads back to the antenna, where some of it is re-radiated (with considerable loss) and some is again reflected back to the receiver, adding an insignificant amount to the original signal. The voltage and power peaks (standing waves) caused by these reflected signals are insignificant relative to the limitations of the cable, connectors, antenna, or receiver front end.
Now for a transmitter.
Part of the signal entering the antenna system will be reflected when it encounters an impedance mismatch. We want this to be a small amount, but again let's assume 1/2 of the power is reflected. That reflected signal heads back to the transmitter, where some of it is reflected back toward the antenna system, but the majority of it must be dissipated in the final stage circuits of the transmitter (with considerable heat generation). In addition, the voltage and power peaks (standing waves) caused by these reflected signals can be very significant with regard to the limitations of the transmitter final circuit elements and even the cable and connectors.
So while a mismatch on a receiver only causes the loss of a small amount of signal, the same mismatch on a transmitter can cause damage to the transmitter and/or antenna system.
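The "1/2 of the power" used above is a deliberately pessimistic illustration; the actual reflected fraction follows directly from the SWR (sketch):

```python
def reflected_power_fraction(swr):
    """Fraction of forward power reflected at a mismatch with the given SWR."""
    gamma = (swr - 1) / (swr + 1)
    return gamma ** 2

# A 1.5:1 SWR (the 75-on-50 case) reflects only 4% of the power;
# it takes a 3:1 SWR to reflect 25%.
print(round(reflected_power_fraction(1.5), 2))  # 0.04
print(round(reflected_power_fraction(3.0), 2))  # 0.25
```

Even 4% of a transmitter's output coming back into the finals matters in a way that 4% of a received signal never will.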
On top of this, many transmitters are designed to roll back their output power when a high SWR is detected, further reducing the transmitted signal.
So, while Jack seems to KNOW something about the topic, it appears he does not UNDERSTAND all the issues and how they interrelate. (Sorry dude, I gave you several opportunities to become educated)
