Simple electronics.
3 dB of loss = loss of half your signal.
So, say you start at 100% signal:
3 dB = 50% of 100
6 dB = 25% of 100
9 dB = 12.5% of 100
12 dB = 6.25% of 100
So with 12 dB of loss, you only have 6.25 percent of your original 100.
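The halving-per-3-dB table above follows from the definition of the decibel: the fraction of power remaining is 10^(−dB/10). A quick sketch (the function name is mine, just for illustration):

```python
# Fraction of power remaining after a given loss in dB.
# Note: 3 dB is *approximately* half (10^-0.3 = 0.5012), which is
# why the 50% / 25% / 12.5% / 6.25% figures above are rounded.
def power_remaining(loss_db):
    return 10 ** (-loss_db / 10)

for loss in (3, 6, 9, 12):
    print(f"{loss} dB loss -> {power_remaining(loss) * 100:.1f}% remaining")
```

Each extra 3 dB cuts whatever is left in half again, which is why the percentages fall off so fast.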
As stated here, the higher the frequency, the greater the loss.
Ok, now let's put these losses into proper perspective.
For the OP, as has been stated, the coax losses are the same for TX as for RX. But look at it another way: if you have 3 dB of loss and 100 watts, you'll get 50 watts out the other end, and 50 watts of heat in the coax. If you have 3 dB and 1 watt, you'll get 0.5 watts at the other end. 50 watts will make the coax noticeably warm; a half watt will not. But it's the same 3 dB.
Now, consider a receiver. Receiver sensitivity is generally rated in voltage, not watts. A 10 dB loss in power is one tenth the power, but it takes 20 dB of loss to reach one tenth the voltage. How does that play out in the real world? Well, a 0.5 uV receiver is... decent. That's -113 dBm. Add 3 dB of line loss, and you get 0.355 uV. On the end with a 100 watt transmitter, you've just turned 50 watts of power into heat. But on the receiver end, you've only lost about 0.145 microvolts. Now, most modern receivers are much better than 0.5 uV in sensitivity. Let's say they're more like 0.2 uV. If you've got a signal generator and a receiver to play with, set it for 0.5 uV, and then reduce the signal by 3 dB. You'll scarcely notice the difference. As you get closer to the receiver threshold, that 3 dB will mean more, but by the time 3 dB really matters, the signal is already too noisy to be usable.
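The -113 dBm and 0.355 uV figures above come straight from the standard conversions: power in a 50 ohm load is V²/R, and a dB loss scales voltage by 10^(−dB/20) rather than 10^(−dB/10). A sketch under those assumptions (function names are mine):

```python
import math

R = 50.0  # ohms, the usual receiver input impedance assumed above

def uv_to_dbm(microvolts):
    # P = V^2 / R, then convert watts to dBm (dB relative to 1 mW)
    watts = (microvolts * 1e-6) ** 2 / R
    return 10 * math.log10(watts / 1e-3)

def apply_loss_to_voltage(microvolts, loss_db):
    # A power ratio of 10^(-dB/10) is a voltage ratio of 10^(-dB/20)
    return microvolts * 10 ** (-loss_db / 20)

print(f"0.5 uV = {uv_to_dbm(0.5):.1f} dBm")            # about -113 dBm
print(f"after 3 dB: {apply_loss_to_voltage(0.5, 3):.3f} uV")  # about 0.354 uV
```

Note that the same 3 dB that costs the transmitter 50 watts only shaves about 0.15 uV off the received signal.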
So, 3 dB can be a lot of heat waste on the transmitter end, and barely noticeable on the receiver end, but it's still the same 3 dB.
In the end, you want to chase after every bit of loss you can, because it's additive. 3 dB of line loss can add to a dB or two from poorly installed connectors, which can add to a few dB from a lousy antenna location... Before you know it, 3 dB of line loss has turned into 10 dB for the entire antenna system, and that gets to be a big hit to performance if signals are marginal to start with. For a scanner operating in FM or P25, when everything else is up to par, it doesn't make sense to spend an extra $50 on feedline when it's only going to save you 1 or 2 dB of loss. I guarantee you won't notice it unless you're working weak signal modes where the S/N ratio is already approaching zero. Then, and only then, does it make a difference in a receive installation.
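The additivity point is the whole reason dB is convenient: losses in dB simply add, while the underlying power ratios multiply. A quick sketch with a hypothetical breakdown (the numbers are made up to match the 10 dB example above):

```python
# Hypothetical loss budget for a mediocre antenna system.
# Losses expressed in dB add; the equivalent power ratios multiply.
losses_db = {
    "feedline": 3.0,
    "poor connectors": 2.0,
    "lousy antenna location": 5.0,
}

total_db = sum(losses_db.values())
fraction_remaining = 10 ** (-total_db / 10)
print(f"total loss: {total_db:.0f} dB, "
      f"signal remaining: {fraction_remaining * 100:.0f}%")
```

A 10 dB total means only 10% of the signal survives, which is why a marginal signal disappears even though no single contributor looked that bad.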