No. Well, you could pick an arbitrary figure and re-express this in dB above or below that figure, but it would be meaningless. Receiver sensitivity is always expressed in microvolts, but it means almost nothing.
Not trying to be argumentative or anything, but receiver sensitivity is not “always expressed in microvolts,” and sensitivity does mean something. No, I am not saying sensitivity is the most important parameter of receiver performance, but it is important; I would certainly prefer a 0.1 µV receiver over a 4 mV receiver, assuming the same standard is used for both measurements. Of course other parameters, such as dynamic range, image rejection, and selectivity, are more important in the big scheme of things, particularly with regard to modern communications receivers. Yes, by convention receiver sensitivity in communications receivers is most often expressed in microvolts, but not always. Sure, you seldom see anything else, particularly in the hobby market and in communications, but there are several disciplines that will almost never use µV for receiver sensitivity.
It is not wrong, at all, to express receiver sensitivity several different ways, as long as the standard remains the same or is stated (e.g. 10 dB S/N in X Hz BW). And stepping away from the hobby and communications receiver world, it is less frequently expressed in microvolts. In the world I work in, receiver sensitivity, indeed any receiver-centric power level such as noise floor (but not noise figure), MDS, sensitivity, etc., is almost always expressed in dBm. If I asked most of my techs to give me the sensitivity of the system receiver in microvolts they might give me a blank stare, but ask them to measure it in dBm and they would be on board. While most of my engineers could quickly do the conversion from dBm to volts, a few might have to think about it first, simply because it is not the norm for us; they would still probably measure it in dBm and then convert to µV. Transmitter values, in the same environment, are most often expressed in either dBm or dBW (locally dBm is most frequent, but dealing with other entities we often have to express in dBW), and only occasionally converted to watts.
Other units of receiver sensitivity I have seen used would be dBf (dB relative to a femtowatt) and dBp (dB relative to a picowatt). For people not familiar with this, 0.22 µV = 0 dBf = -30 dBp = -120 dBm (in a 50 Ohm system). So you can see that dBf might be applicable when you do not want large values when talking about receiver MDSs or sensitivities. dBf grew in popular usage a couple of decades ago when makers of stereo receivers decided that big negative numbers (like -120 dBm) and fiddly numbers full of zeros and decimal points (like 0.22 µV) did not have a favorable marketing image, so they started expressing the sensitivity of stereo FM receivers in dBf, resulting in claims along the lines of 10 dBf for 0.7 µV or 7 dBf for 0.5 µV (true values are often rounded off in marketing for a cleaner look).
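If anyone wants to verify those equivalences themselves, here is a minimal sketch (my own helper function, nothing standard; it assumes a 50 Ohm system and RMS voltages):

```python
import math

R = 50.0  # assumed system impedance in ohms

def microvolts_to_db(uv, ref_watts):
    """Convert an RMS voltage in microvolts across R ohms to dB re ref_watts."""
    watts = (uv * 1e-6) ** 2 / R          # P = V^2 / R
    return 10 * math.log10(watts / ref_watts)

# 0.22 uV works out to roughly -120 dBm, 0 dBf, and -30 dBp:
print(microvolts_to_db(0.22, 1e-3))   # dBm (re 1 milliwatt): about -120.1
print(microvolts_to_db(0.22, 1e-15))  # dBf (re 1 femtowatt): about -0.1
print(microvolts_to_db(0.22, 1e-12))  # dBp (re 1 picowatt):  about -30.1
```

(The exact voltage for 0 dBf in 50 Ohms is closer to 0.2236 µV, which is why the results land a shade below the round figures.)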
I am trying my hand at understanding values within radio specifications.
My Icom R75 has a sensitivity of 0.16 µV in SSB with preamp one on. Is there a method to convert that to dB? And at what point does noise make the lower number worthless? What I mean is, when looking at sensitivity, what is the usual noise level (even in the quietest RF situation) below which more sensitivity is not that important?
This question has already been answered, but I will also try to explain it, possibly to give a little different view of it. Always, always, pay attention to the units used. These are the letters after the “dB”. dBm, dBf, and dBµV are each based on a different reference (dBm is referenced to milliwatts, dBf to femtowatts, and dBµV to microvolts) and cannot be interchanged, although conversion between them is often easy. As Ed_Seedhouse has already pointed out, any ratio can be expressed in “dB”, but the number itself is meaningless without the reference.
One of our engineers used to drive this point home by making all the techs work out everything in dB for practice, even the price of lunch or a car in dB$ or dBcents. For example, lunch that cost $7 could be said to cost 8.45 dB$, but a $20,000 car would only be 43.01 dB$.
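That exercise is trivial to reproduce; any quantity expressed against a stated reference works. A quick sketch (assuming a $1 reference, which is what makes the figures above come out right):

```python
import math

def to_db(value, reference=1.0):
    """Express value relative to reference in decibels (power-style ratio)."""
    return 10 * math.log10(value / reference)

print(to_db(7))      # lunch: ~8.45 dB$
print(to_db(20000))  # car:   ~43.01 dB$
```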
We have to start with the basic question: what do we mean by receiver sensitivity? It is not simply the weakest signal a receiver can detect, but rather the smallest signal level that, when received, will produce a specific output from the receiver. Not only will the receiver detect the signal, it will reproduce it with a defined signal-to-noise ratio. Since bandwidth plays into all of this, any statement of sensitivity must include, at a minimum, the signal level, the signal-to-noise ratio used as the standard, and the bandwidth used. For example, 0.16 µV (10 dB S/N at 500 Hz BW).
You cannot convert 0.16 µV to dB, but you can convert it to some specific scale based on a reference point. In order to do so you must keep in mind the power and voltage relationships. No real need to work the formulas by hand here; the easiest approach is to just grab a chart (this chart is for a 50 Ohm system), or use the conversion sketched below it:
http://www.hawaiirepeaters.net/dBm-to-Microvolts.pdf
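If you would rather compute than read the chart, the relationship behind it (for a 50 Ohm system) boils down to dBm ≈ 20 × log10(µV) − 107. A small sketch of both directions (my own helper names, nothing official):

```python
import math

def uv_to_dbm(uv, r=50.0):
    """Microvolts (RMS) across r ohms -> dBm."""
    watts = (uv * 1e-6) ** 2 / r
    return 10 * math.log10(watts / 1e-3)

def dbm_to_uv(dbm, r=50.0):
    """dBm -> microvolts (RMS) across r ohms."""
    watts = 1e-3 * 10 ** (dbm / 10)
    return math.sqrt(watts * r) * 1e6

print(uv_to_dbm(0.16))  # ~ -122.9, i.e. about -123 dBm
print(dbm_to_uv(-133))  # ~ 0.05 uV
```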
If you look at that chart you will see that your 0.16 µV is about -123 dBm. But that signal level was what was required to produce, say, a 10 dB SNR (or whatever the standard used), so the noise floor of your R75 is probably more like -132 or -133 dBm (about 0.05 µV). This means you might be able to detect a signal a good bit weaker than the 0.16 µV that is the stated sensitivity value.
At what point more sensitivity becomes useless is going to depend on MANY variables: local noise sources, natural noise, thermal noise, antenna performance, etc. There is no one number. A value for this at 2.5 MHz will not be the same as the value for 25 MHz, even at the same location.
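One of those variables, thermal noise, can at least be put on a rough numeric footing. This is just the standard room-temperature kTB relationship, not anything specific to the R75: the thermal floor is about -174 dBm in each hertz of bandwidth, plus the receiver's own noise figure:

```python
import math

def thermal_noise_floor_dbm(bw_hz, noise_figure_db=0.0):
    """Approximate room-temperature thermal noise floor (kTB) in dBm."""
    return -174 + 10 * math.log10(bw_hz) + noise_figure_db

print(thermal_noise_floor_dbm(500))  # ~ -147 dBm in a 500 Hz bandwidth
```

At HF the noise actually arriving on the antenna (atmospheric and man-made) is usually well above that figure, which is part of why the useful amount of sensitivity changes with frequency and location.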
Now it makes sense. In reality this is probably something I should not be too concerned with when it comes to specs. In an optimal environment (which I do not have) these stats hold water. Real-world performance cannot (really) be measured by these numbers alone (unless they are really horrible, and that will show on paper).
These receiver performance numbers only define the potential capability of the receiver. They are important to know but are not the entire story by a long shot. However, better numbers are never a bad thing unless you have to give up some other aspect of receiver performance to achieve them.
For example, if you build a receiver with 0.02 µV sensitivity, that is potentially outstanding. Maybe not everyone can use it, because of the noise at their location or because of natural atmospheric noise issues. For people in exceptionally quiet locations, or with specific antenna constraints, or who use the receiver as part of a more complex system (such as the final IF of a downconverter), this sensitivity might be a trait that could be exploited. But if in achieving this performance you end up with a limited dynamic range, it just means the receiver will become overloaded on strong signals. Of course there are ways to address this; I only mention it because it is part of the entire package.
A good, basic, unscientific test for a receiver involves removing the antenna. With the receiver attached to your antenna, tune to a frequency with no signal on it and no signal near it. Try something at a high frequency if talking about HF, maybe around 29 MHz. Disconnect the antenna from the radio. If the noise level goes DOWN when the antenna is off the radio, then your radio probably has adequate sensitivity for your specific installation; you are already detecting at a level that is at or below your site's noise floor. If you do NOT detect a change in noise level, it means your receiver, and not noise, might be the limitation. Yeah, I know, there are lots of potential problems with this test, but it does tell you whether or not you could benefit from more sensitivity.
Like the Bonito RadioJet has a sensitivity of 0.03 microvolts with a noise floor of 137 dB. This tells me (if my thinking is right) that the sensitivity measurement is moot because the internal noise floor is actually higher than the rated sensitivity. Is that thinking right?
This sensitivity and noise floor are probably the result of a misunderstanding. -137 dBm is about 0.032 µV.
This web page is probably where you are getting these numbers:
Bonito RadioJet - 24 Bit High Performace IF-Reciver
It says “MDS -134 dBm = 0.03 µV sensitivity MDS = (Minimum Detectable Signal 3 dB above noise)”
The problem here is the use of a non-standard measurement and description. It is describing an MDS (Minimum Detectable Signal) as being something 3 dB above the noise floor. First, it is saying an MDS of -134 dBm is equal to 0.03 µV sensitivity, and that is simply wrong: -134 dBm is about 0.045 µV, while 0.03 µV is closer to -137 dBm. It might, maybe, be equal to a 0.03 µV noise floor, but if so this is a very odd way to express it. If their noise floor is truly -137 dBm, as is indicated by their statements (the -134 dBm MDS minus the defined 3 dB), the sensitivity might be more like 0.10 µV (10 dB S/N @ 500 Hz), but I use that number only as a possibility; it is not a calculation or a measurement.
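You can check their arithmetic with the same 50 Ohm conversion used earlier in the thread (again, my own helper, not theirs):

```python
import math

def dbm_to_uv(dbm, r=50.0):
    """dBm -> microvolts (RMS) across r ohms."""
    return math.sqrt(1e-3 * 10 ** (dbm / 10) * r) * 1e6

print(dbm_to_uv(-134))  # ~0.045 uV: what -134 dBm actually converts to
print(dbm_to_uv(-137))  # ~0.032 uV: their "0.03 uV" matches the noise floor, not the MDS
```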
Personally, I would have serious doubts about any performance claims on that web site, given that whoever wrote that material seemingly does not understand the basics. They are defining their stated sensitivity as the same as their noise floor, and a pretty ambitious noise floor at that.
T!