Coax Loss

Status
Not open for further replies.

prcguy

Member
Joined
Jun 30, 2006
Messages
15,368
Location
So Cal - Richardson, TX - Tewksbury, MA
Satellite signals after down conversion can run from 950 MHz up to 2150 MHz. For DirecTV you may find some signals down to 250 MHz.

As far as RG-6, that's 75 ohm cable, so there will be some loss due to the impedance mismatch. When used for satellite reception you are not receiving at 3 GHz; you are transporting signals that have already been down converted. YMMV
 

KMG54

Active Member
Premium Subscriber
Joined
Apr 24, 2011
Messages
1,257
SolidSignal.com: Perfect Vision quad-shield solid-copper RG-6, 1000 foot spool, $129.
 

zz0468

QRT
Banned
Joined
Feb 6, 2007
Messages
6,034
Simple electronics.
3 dB of loss = loss of 1/2 of your signal.
So, say you start at 100% signal:
3 dB = 50% of 100
6 dB = 25% of 100
9 dB = 12.5% of 100
12 dB = 6.25% of 100
So with 12 dB of loss you only have 6.25 percent of your original 100.
As stated here, the higher the frequency, the more the loss.
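The halving rule above is just the special case of the general dB formula; here's a quick sketch in plain Python using the figures from the post:

```python
# Sketch of the dB arithmetic above: fraction of power remaining
# after a given attenuation, using 10^(-dB/10).
def power_remaining(loss_db: float) -> float:
    """Fraction of input power left after loss_db of attenuation."""
    return 10 ** (-loss_db / 10)

for db in (3, 6, 9, 12):
    print(f"{db} dB loss -> {power_remaining(db) * 100:.1f}% remaining")
```

Strictly, 3 dB is 50.1% rather than exactly half, but the rule of thumb is plenty close.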

Ok, now let's put these losses into proper perspective.

For the OP, as has been stated, the coax losses are the same for TX as for RX. But looking at it another way, if you have 3 dB of loss and 100 watts, you'll get 50 watts out the other end, and 50 watts of heat in the coax. If you have 3 dB and 1 watt, you'll get 0.5 watts at the other end. 50 watts will make the coax noticeably warm; a half watt will not. But it's the same 3 dB.
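A minimal sketch of that power split, using the 100 watt / 3 dB figures above:

```python
# Sketch: how input power divides between delivered output and
# heat dissipated in the coax, for a given line loss in dB.
def split_power(p_in_watts: float, loss_db: float):
    delivered = p_in_watts * 10 ** (-loss_db / 10)
    heat = p_in_watts - delivered
    return delivered, heat

out_hi, heat_hi = split_power(100.0, 3.0)  # ~50 W delivered, ~50 W of heat
out_lo, heat_lo = split_power(1.0, 3.0)    # ~0.5 W delivered, ~0.5 W of heat
```

Same 3 dB either way; only the absolute wattage turned into heat changes.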

Now, consider a receiver. Receiver sensitivity is generally rated in voltage, not watts. A 10 dB loss in power is one tenth the power, but it takes 20 dB of loss to get one tenth the voltage. How does that play out in the real world? Well, a 0.5 uV receiver is... decent. That's -113 dBm. Add 3 dB of line loss, and you get 0.355 uV. On the end with a 100 watt transmitter, you've just turned 50 watts of power into heat. But on the receiver end, you've only lost about 0.145 microvolts. Now, most modern receivers are much better than 0.5 uV in sensitivity. Let's say they're more like 0.2 uV. If you've got a signal generator and a receiver to play with, set it for 0.5 uV, and then reduce the signal by 3 dB. You'll scarcely notice the difference. As you get closer to the receiver threshold, that 3 dB will mean more, but by the time 3 dB really matters, it's already too noisy to be usable.
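Those dBm-to-microvolt figures can be checked with a short conversion sketch (this assumes a 50 ohm system, which is what makes -113 dBm come out to 0.5 uV):

```python
import math

def dbm_to_microvolts(dbm: float, impedance_ohms: float = 50.0) -> float:
    """Convert a power level in dBm to RMS microvolts across the impedance."""
    watts = 10 ** (dbm / 10) / 1000.0        # dBm -> watts
    return math.sqrt(watts * impedance_ohms) * 1e6

print(dbm_to_microvolts(-113))       # ~0.50 uV
print(dbm_to_microvolts(-113 - 3))   # ~0.355 uV after 3 dB of line loss
```

Note that a 3 dB power loss only scales the voltage by 1/sqrt(2), which is why the voltage drop looks so small.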

So, 3 dB can be a lot of heat waste on the transmitter end, and barely noticeable on the receiver end, but it's still the same 3dB.

In the end, you want to chase after every bit of loss you can, because it's additive. 3 dB of line loss can add to a dB or two from poorly installed connectors, which can add to a few dB from a lousy antenna location... Before you know it, 3 dB of line loss has turned into 10 dB for the entire antenna system, and that gets to be a big hit to performance if signals are marginal to start with. For a scanner operating in FM or P25, when everything else is up to par, it doesn't make sense to spend an extra $50 on feedline when it's only going to save you 1 or 2 dB of loss. I guarantee you won't notice it unless you're working weak signal modes where the S/N ratio is already approaching zero. Then, and only then, does it make a difference in a receive installation.
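Because dB losses are additive, that system budget can be tallied directly; a sketch with hypothetical figures in the spirit of the examples above:

```python
# Sketch: dB losses in an antenna system add directly.
# All individual figures here are hypothetical illustrations.
losses_db = {
    "feedline": 3.0,
    "poor connectors": 2.0,
    "lousy antenna location": 5.0,
}

total_db = sum(losses_db.values())
remaining_fraction = 10 ** (-total_db / 10)
print(f"Total loss: {total_db:.1f} dB "
      f"({remaining_fraction * 100:.1f}% of signal power remains)")
```

With those numbers the total comes to 10 dB, i.e. only a tenth of the original signal power reaches the receiver.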
 