Thanks to both of you. (RWier & Troymail)
Regarding the DSP Levels and Site Data Decode Thresholds.... Is there a best practice to determine these settings? Restated: the default setting for DSP is 64 or so. Lowering it gets me a solid "T"; raising it gets me a flashing "T". (When monitoring a single talk group.....not when scanning.)
The defaults for the Site's Data Decode on the primary are between 75 and 95..... BUT between 88 and 95 for the three other sites.
Again, is there any prescribed method to determine the best settings, or is it a trial-and-error scenario?
Thanks!
I started out testing different DSP (64) settings, making changes one at a time. I didn't know if one direction was best, so I alternated, such as: 65, 63, 66, 62, 67, 61. After making each setting change, I sent the new programming to the scanner, then listened to a single, powerful site (multi-site, simulcast system) that is normally very active, for as long as it took to subjectively decide "better or worse".
Somewhere around 70, it became clear that going up was worsening reception. Going lower, I wasn't sure of any difference until about 55; from there, reception gradually improved as the settings went lower, and I didn't notice degradation again until ~45. So I took the average of ~70 and ~45 and set the DSP at ~57. I believe charting these results would have plotted as a very shallow "sine wave".
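For anyone following along, the arithmetic in that step is just splitting the difference between the two edges of the band that sounded acceptable. A tiny hypothetical sketch (nothing here talks to the scanner; the function name is my own, and the numbers are the ones from my testing):

```python
# Hypothetical illustration: take the band edges found by ear and split the difference.
def band_midpoint(upper_edge, lower_edge):
    """Return the midpoint of the range where reception sounded acceptable."""
    return (upper_edge + lower_edge) / 2

# DSP sounded worse above ~70 and again below ~45, so:
print(band_midpoint(70, 45))  # 57.5, which I rounded down to 57
```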
Next, leaving the DSP at 57, I started raising the upper DD setting (95) by 1. I could detect no difference up to 99, so I reset it to 95 and started lowering the lower setting (75) 5 at a time, such as: 70, 65, 60, and so on. I could detect no difference until somewhere around 30-35, where I thought there was a small improvement, so I continued downward. I finally settled on what I judged was a barely detectable DD best (a very low-profile "sine wave") at 15.
I then went back to the DSP and started working both ways (up/down) from 57, by 1. Wow, the profile of the "sine wave" had heightened noticeably: there were noticeable changes with steps of 2. I easily found the sweet spot at 52. When I sent DSP 52 and DD 15/95 to the scanner, the listening results were simply unbelievable.
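The overall pattern above (adjust one setting at a time, hold the others fixed, walk away from the starting value in each direction until things get worse, then repeat with the next setting) can be sketched on paper like this. To be clear, this is only a hypothetical illustration: the `score` function is a made-up stand-in for "listen and judge by ear", and its fake peak at 52 is chosen purely so the example matches my result.

```python
# Hypothetical sketch of the one-setting-at-a-time search described above.
# score() stands in for "listen and judge by ear"; here it's a made-up
# quality curve peaking at DSP 52, purely for illustration.
def score(dsp):
    return -abs(dsp - 52)  # higher is better; fake sweet spot at 52

def sweep(start, step, judge, limit=30):
    """Walk up, then down, from `start` one `step` at a time; keep the best value."""
    best, best_score = start, judge(start)
    for direction in (+1, -1):
        value = start
        for _ in range(limit):
            value += direction * step
            s = judge(value)
            if s > best_score:
                best, best_score = value, s
            else:
                break  # stop a direction once results start getting worse
    return best

print(sweep(57, 1, score))  # finds 52 with this fake score curve
```

In practice each "judge" call took minutes of listening, which is why the real search took hours rather than milliseconds.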
Before doing this testing, the longest the scanner would "lock on" a transmission was about 10 seconds, with most dropping out before 5 seconds. Now, there seemed to be no time limit on the length of lock-on. A few minute-plus transmissions were decoded without missing a single syllable!
What's missing in the above narration is the time involved! Spread over about 30 days, the above testing probably took more than 30 hours.
Over the 3+ years since then, I have read maybe two posts here describing similar testing, and they came up with considerably different results. About the same number reported simply using my numbers and enjoying the improvement. The "sweet spot" probably depends on many factors, such as the TRS system, location, antenna, etc.
That being said, I have determined, to my own satisfaction, that the primary determining factor in successful monitoring of any system (digital, trunked, analogue, simulcast, multi-site, etc.) is the strength and quality of the signal, measured where the antenna attaches to the scanner. This can be tested OBJECTIVELY using the "E" analysis features of the HP-1 (E).