The 'best' cable to use depends on the length of the run and the frequency you want to use it at. If you narrow that down to a 100-foot run, it still depends on the highest frequency you'll use. After that it's a matter of comparing the total losses and then balancing them against cost. If the difference in loss between any two cables amounts to less than 3 dB, you will never be able to tell the difference. Fractions of a dB of loss are meaningless in themselves. You have to add all those losses up for the entire system for them to make any difference.
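To make that comparison concrete, here's a small sketch that scales a cable's per-100-foot loss figure to a run length and applies the 3 dB rule of thumb. The dB/100 ft numbers are illustrative ballpark values I've plugged in, not manufacturer specs; look up the actual chart for your cable and frequency.

```python
def total_loss_db(loss_per_100ft, length_ft):
    """Matched-line loss in dB for a run of the given length."""
    return loss_per_100ft * length_ft / 100.0

run_ft = 100
# Assumed/illustrative loss figures at some VHF frequency:
cable_a = total_loss_db(4.5, run_ft)   # a smaller, lossier coax
cable_b = total_loss_db(1.5, run_ft)   # a low-loss coax

diff = cable_a - cable_b
print(f"Cable A: {cable_a:.1f} dB, Cable B: {cable_b:.1f} dB, "
      f"difference: {diff:.1f} dB")
# Under ~3 dB of difference, you won't be able to tell by ear.
print("noticeable" if diff >= 3.0 else "not noticeable")
```

Note the loss scales linearly with length in dB, so a 50-foot run has half the dB loss of a 100-foot run of the same cable.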
Where did that "3 dB" figure come from? It's a general guess if you're gauging loss by ear. If you're gauging it with laboratory gear, reduce it to something slightly lower.
Loss per dollar isn't a very good way of picking feed line, but it's probably a very common one.
- 'Doc