nd5y
Member
I worked for a cellular manufacturer back in the 1990s in the repair depot. The "customers" were cellular carriers that sent in large quantities of cellphones for warranty repair, not individuals. So, my question is this: how much difference between individual radios is "normal"?
We ran the same automatic alignment and pass/fail test on each unit that they did on the production lines in Japan and Mexico (before everything moved to China). This was with old analog single-band phones, so there were additional things that could affect sensitivity compared to a scanner. The receiver tests were done three times while transmitting at high power: at the low end of the band, the middle of the band, and the high end of the band.
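For anyone curious what that kind of test loop looks like, here's a rough sketch in Python. Everything in it is an assumption for illustration, not the actual factory procedure: the channel numbers, the 12 dB SINAD criterion, the -116 dBm limit, and the simulated phone standing in for real test equipment.

```python
# Rough sketch of a three-point receive-sensitivity pass/fail test.
# Channel numbers, the 12 dB SINAD criterion, the limit, and the
# simulated phone are all assumptions for illustration only.
# (The real test also kept the transmitter keyed at high power while
# the receiver was measured; this toy model skips that.)
import random

CHANNELS = {"low": 991, "mid": 383, "high": 799}  # assumed test channels
LIMIT_DBM = -116.0                                # assumed pass/fail limit

class SimulatedPhone:
    """Stand-in for the device under test; its true sensitivity varies
    a few dB from unit to unit, like real production units."""
    def __init__(self):
        self.true_sens_dbm = -119.0 + random.uniform(-3.0, 3.0)

    def sinad_db(self, channel, rf_level_dbm):
        # Crude model: exactly 12 dB SINAD at the unit's true
        # sensitivity, improving dB-for-dB as the RF level goes up.
        return 12.0 + (rf_level_dbm - self.true_sens_dbm)

def measure_sensitivity(unit, channel):
    """Step the RF generator level down until SINAD drops below 12 dB;
    report the last level that still passed."""
    level = -100.0
    while unit.sinad_db(channel, level) >= 12.0:
        level -= 0.5
    return level + 0.5

def receiver_test(unit):
    results = {name: measure_sensitivity(unit, ch)
               for name, ch in CHANNELS.items()}
    return all(s <= LIMIT_DBM for s in results.values()), results

passed, results = receiver_test(SimulatedPhone())
print("PASS" if passed else "FAIL", results)
```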
Actual measured specs can vary quite a bit from unit to unit. If I remember right, there could be 6 dB or more difference in receive sensitivity. Some units were way more sensitive than normal, some would barely pass, and some were just lemons.
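To picture what a spread like that does to a production lot, here's a toy simulation with a made-up distribution and the same assumed -116 dBm limit, counting how many units land well inside spec, barely pass, or fail outright:

```python
# Hypothetical illustration of unit-to-unit spread: 1000 simulated
# sensitivities with roughly a 6 dB total spread, checked against an
# assumed limit. The distribution and limit are made up for illustration.
import random

random.seed(1)
LIMIT_DBM = -116.0
lot = [random.gauss(-118.5, 1.0) for _ in range(1000)]

# Remember more negative dBm = more sensitive, so s <= limit passes.
hot    = sum(s <= LIMIT_DBM - 2.0 for s in lot)          # well inside spec
barely = sum(LIMIT_DBM - 2.0 < s <= LIMIT_DBM for s in lot)
lemons = sum(s > LIMIT_DBM for s in lot)                 # fail outright
print(f"total spread: {max(lot) - min(lot):.1f} dB")
print(f"hot: {hot}  barely passing: {barely}  lemons: {lemons}")
```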
Electronic devices aren't all exactly the same because the components aren't all exactly the same, and sometimes the same value/type of components from different vendors are used.
I have no idea what Uniden's manufacturing environment is like. I don't know whether they align and test every unit or just load firmware with average values, and I don't know what their product specifications and pass/fail tolerances are. You would need to test both scanners with actual test equipment to know what the difference really is. Then you would need to know Uniden's specs for the particular model and production lot, and whether Uniden pulls any shenanigans: if a lot of units start failing on the production line, engineering/management might change the specs so more units pass without addressing the underlying problem.
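If you did get both scanners on a bench, a sensible comparison would be repeated sensitivity measurements on each, then checking whether the average difference is bigger than the measurement scatter. A sketch with made-up readings:

```python
# Hypothetical comparison of two units. The readings are invented;
# the point is comparing the average difference to the repeatability.
from statistics import mean, stdev

scanner_a = [-119.2, -118.8, -119.0, -119.1, -118.9]  # dBm, assumed
scanner_b = [-116.1, -116.4, -115.9, -116.2, -116.0]  # dBm, assumed

diff = abs(mean(scanner_a) - mean(scanner_b))
noise = max(stdev(scanner_a), stdev(scanner_b))
print(f"average difference: {diff:.1f} dB, repeatability ~{noise:.2f} dB")
# A ~3 dB difference against ~0.2 dB scatter would be a real difference
# between units -- whether it's out of spec is another question entirely.
```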