...Now if this isn't confusing I don't know what is. My results may explain why I don't see the sensitivity issues that others see.
I don't think so. I think you all are barking up the wrong tree when it comes to determining why you're so unhappy with the performance of the radio.
The difference between the OP's sensitivity measurements and yours is only a small handful of dB. If you were to compare SINAD measurements on any given receiver and vary the signal generator output by 3-4 dB, in a typical FM receiver you would barely be able to discern the difference. Operationally, all things being equal, you would be almost unable to tell the difference between a pair of receivers operated side by side with a 4 dB difference in sensitivity, unless the signals are already extremely weak and not saturating the receiver.
In a typical public safety system, signal levels on the street in the primary service area are going to be in the tens of microvolts, well above the saturation level of an FM receiver, so several dB of variation in the sensitivity of individual receivers will be virtually undetectable.
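To put some rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The sensitivity figures (0.30 uV vs. 0.45 uV) and the 20 uV street-level signal are illustrative assumptions, not anyone's actual measurements:

[code]
import math

def db_ratio(v_ref_uv, v_uv):
    """Voltage ratio expressed in dB (20 * log10)."""
    return 20 * math.log10(v_uv / v_ref_uv)

# Two hypothetical 12 dB SINAD sensitivity figures, in microvolts (assumed).
good_rx = 0.30
worse_rx = 0.45
print(f"Sensitivity difference: {db_ratio(good_rx, worse_rx):.1f} dB")          # ~3.5 dB

# A street-level signal in a system's primary service area (assumed).
street = 20.0  # microvolts
print(f"Margin above the 0.30 uV radio: {db_ratio(good_rx, street):.1f} dB")    # ~36.5 dB
print(f"Margin above the 0.45 uV radio: {db_ratio(worse_rx, street):.1f} dB")   # ~33.0 dB
[/code]

With 33 dB or more of margin either way, both radios are long since fully quieted, which is why the difference disappears in practice.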
Of course, if you're trying to DX systems from well outside their intended coverage area, then maximizing performance becomes more of a concern. But I would still submit that the handful of dB in measured differences between your radios and the OP's would not be a significant factor unless the signal is actually arriving at your receiver's antenna terminal at less than half a microvolt, in which case you're going to need additional measures (e.g. a better antenna, a preamp) to make it comfortable to listen to. Hobbyists need to adjust their expectations accordingly. Professionals know when to simply not bother.
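On the "additional measures" point, a preamp helps because it sets the system noise figure ahead of the receiver. Here's a minimal sketch of the standard Friis cascade formula; the 8 dB receiver noise figure and the 1.5 dB / 15 dB preamp numbers are assumptions picked only to illustrate the effect:

[code]
import math

def cascade_nf_db(stages):
    """Friis formula for cascaded stages; each stage is (noise_figure_dB, gain_dB)."""
    total_f = 1.0
    cum_gain = 1.0
    for nf_db, gain_db in stages:
        f = 10 ** (nf_db / 10)
        total_f += (f - 1) / cum_gain
        cum_gain *= 10 ** (gain_db / 10)
    return 10 * math.log10(total_f)

# Made-up numbers: a scanner front end alone vs. the same front end behind a low-noise preamp.
scanner_alone = [(8.0, 0.0)]                 # ~8 dB NF receiver (assumed)
with_preamp = [(1.5, 15.0), (8.0, 0.0)]      # 1.5 dB NF, 15 dB gain preamp ahead of it (assumed)

print(f"Scanner alone: {cascade_nf_db(scanner_alone):.1f} dB system NF")
print(f"With preamp:   {cascade_nf_db(with_preamp):.1f} dB system NF")
[/code]

Under those assumed numbers the system noise figure drops from about 8 dB to about 2 dB, which buys far more at sub-half-microvolt levels than quibbling over a few dB of receiver-to-receiver sensitivity spread.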
I didn't perform any tests for internal noise, but I'd be willing to bet it's present.
I'll guarantee it's present, and that statement is much closer to where I think you'll find the problem.
Somewhere in the thread, someone mentioned older scanners with 1 microvolt sensitivity kicking butt compared to the 436. The reason is the synthesizer's phase noise. Phase noise is always present, but in a synthesizer it becomes one of the parameters an engineer must use to balance cost against performance. The speed at which a scanner can switch channels is important to users, and that's another tradeoff. Synthesizer design becomes a balance of cost and complexity versus phase noise, step size, and scan speed. In a consumer-grade scanner, cost and speed are the winners, and phase noise performance is the losing parameter. The synthesizer/VCO assembly alone in a commercial/public-safety-grade radio probably costs more than your entire 436, and therefore that radio can both scan fast and have good phase noise performance.
The end result of this poor phase noise will be the perception that the receiver's sensitivity is bad, and even strong signals in full saturation will never completely quiet, as compared with something like a commercial radio or an older crystal-controlled scanner. The noise of the synthesizer is heterodyned down to the IF along with the desired signal and is superimposed on the resulting IF signal that gets demodulated. The perceived difference in performance between individual 436s could be a few dB of difference in phase noise (an acceptable variation in a production run), or individual preference in just how much quieting is acceptable, along with that small handful of dB in actual measured sensitivity. And so long as that last item is within specification, the user has no grounds for a beef against Uniden.
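Here's a very simplified way to see why that noise puts a ceiling on quieting no matter how strong the signal gets. It assumes the synthesizer's phase noise is flat across the demodulation bandwidth and ignores FM improvement and de-emphasis, and both phase noise levels are made-up illustrative figures, so treat the absolute numbers loosely:

[code]
import math

def quieting_ceiling_db(phase_noise_dbc_hz, noise_bw_hz):
    """Integrated LO phase noise across the demod bandwidth, taken as the floor
    under the recovered signal (the carrier is 0 dBc by definition)."""
    integrated_noise_dbc = phase_noise_dbc_hz + 10 * math.log10(noise_bw_hz)
    return -integrated_noise_dbc

nbfm_bw = 12500  # Hz, rough NBFM channel/demod bandwidth (assumed)

cheap_synth = -95        # dBc/Hz, made-up figure for a fast, low-cost synthesizer
commercial_synth = -120  # dBc/Hz, made-up figure for a commercial-grade synthesizer

print(f"Cheap synthesizer:      quieting tops out near {quieting_ceiling_db(cheap_synth, nbfm_bw):.0f} dB")
print(f"Commercial synthesizer: quieting tops out near {quieting_ceiling_db(commercial_synth, nbfm_bw):.0f} dB")
[/code]

Because the phase noise sidebands scale up and down with the carrier itself, adding signal strength never improves that ratio, which is exactly the "never quite fully quiets" behavior.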
Note that the synthesizer phase noise performance is not listed in the radio's spec sheet.
So, in the end, what I suspect is that within these newer scanners, the synthesizer phase noise performance is somewhat worse than in previous scanners. Check things like channel step size, scan rate, and available frequency ranges; if any of those parameters are substantially better than in previous versions, it's a fair bet that the phase noise is substantially worse, because otherwise the scanner would have become prohibitively expensive. The fact is, while most hobbyists don't know it, it's synthesizer phase noise that creates the perception that commercial radios are more sensitive and better performing than scanners. That, and the third-order intercept point of the receiver. The actual sensitivity in microvolts is quite comparable.
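On the intercept point, the classic two-tone approximation shows why a stout commercial front end seems to "hear" better in a dirty RF environment. The -30 dBm interferer level and both intercept points below are assumptions for illustration only:

[code]
def imd3_level_dbm(p_in_dbm, iip3_dbm):
    """Two-tone approximation: each third-order product sits at 3*Pin - 2*IIP3
    (input-referred), so it rises 3 dB for every 1 dB of input."""
    return 3 * p_in_dbm - 2 * iip3_dbm

p_in = -30.0  # dBm, two strong off-channel signals (assumed)

for name, iip3 in [("scanner front end (IIP3 -10 dBm, assumed)", -10.0),
                   ("commercial front end (IIP3 +10 dBm, assumed)", 10.0)]:
    imd = imd3_level_dbm(p_in, iip3)
    print(f"{name}: IMD3 product at {imd:.0f} dBm "
          f"({p_in - imd:.0f} dB below the interferers)")
[/code]

For scale, 0.3 uV into 50 ohms is roughly -117 dBm, so the -70 dBm product from the weaker front end is anything but weak, while the -110 dBm product from the stronger one is down near the noise.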
And not to get nit-picky, but if the Uniden spec is .3 uV (and not .300 uV), then your radios meet this spec in all but one test (820.5 MHz), as anything from .3 to .399 would be within spec... but I'm not here to argue.
It would be worthwhile for someone to compare the actual specs and make the necessary measurements to see whether the radios actually meet them. I suspect even the "bad" ones do. Even the OP's measurements are close enough that I wouldn't quibble about it; the differences are within the range of acceptable measurement error.
I noticed that among the specs published for the 436 and 536 are "signal to noise" specifications. That represents an opportunity to include the effects of synthesizer performance in evaluating the radio. On VHF and 800 MHz, it's around 40 dB. That's "adequate" but certainly not a spectacular number.