I've never seen the need to use frequency readouts finer than half a kilohertz. I just don't see the point in it.
Seeing a report of a station on 7345.217 kHz just seems pointless. I'm sure that as SDR technology progresses, receivers will be able to read out the kilohertz to another hundred decimal places.
And I'm also sure there will be SWLs who will use such readouts in their reports.
"Wow, I just got WWV at 10000.012313337293023949283 kHz."
It's only a matter of time.
I think the OP was asking a general question about the format of the frequency readout, not trying to comment on how many digits a signal should be reported to.
This issue of frequency reporting resolution has many facets.
The first is: regardless of how many digits your frequency display has, how accurate is the calibration of that display, and how are you determining the signal's frequency? Just because your display resolves to the Hz does not mean it is accurate to the Hz. However accurate the readout was when the receiver was first aligned, unless it is locked to a reference it will drift. Ovens and similar measures can reduce this, but eventually the readout will be off by a significant amount.

You can also calibrate or normalize on the fly, that is, periodically check the frequency readout against a known signal such as WWV. Even then there is still the problem of confirming it down to the Hz level; typically something like the detected audio and an oscilloscope will address that issue. If you do not lock your RX to a reference source, and you have not recently confirmed the calibration of your frequency display, then reporting to the Hz is a waste of time.
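A minimal sketch of that on-the-fly normalization, in Python: measure a station of precisely known frequency (WWV at 10 MHz), derive the readout's scale error, and remove it from later readings. The measured value and the assumption that the error comes from the receiver's reference oscillator (and therefore scales with frequency) are mine, not from the post.

```python
# Normalize a frequency readout against a known reference (WWV at 10 MHz).
# Assumes the error is a reference-oscillator offset, so it scales with
# frequency; the measured value below is a hypothetical example.

WWV_TRUE_HZ = 10_000_000.0       # WWV carrier, known good
wwv_measured_hz = 10_000_012.0   # hypothetical readout on this receiver

scale = wwv_measured_hz / WWV_TRUE_HZ   # readout = true_freq * scale
ppm_error = (scale - 1.0) * 1e6         # about 1.2 ppm in this example

def corrected(readout_hz: float) -> float:
    """Remove the measured reference error from a raw readout."""
    return readout_hz / scale
```

With that 1.2 ppm error, a raw readout of 7,345,217 Hz actually corresponds to roughly 7,345,208 Hz, so a display with Hz resolution can still be off by several Hz until you correct it.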
And how are you detecting the signal's frequency? If it is an AM station, are you simply tuning for a peak? If it is an SSB station, are you tuning for the best-sounding audio to your ear? If it is a CW station, are you tuning by ear to the CW pitch you like? All of these are common practices, and perfectly adequate for most listeners, but none of them will yield accuracy down to the Hz except by random luck. They work well for listening, not for pinning a frequency down to a tight standard.
Next is: do you need that kind of resolution? Leaving hams out of the equation here, since this is posted in the Monitoring segment of the forums. For the typical SWL doing BC stations there is probably no need to go finer than about 0.1 kHz. An argument could be made that even 1 or 5 kHz is adequate, but 0.1 kHz can help ID obscure stations with known transmitter offsets.

But what about the utility listener? Many times ute stations do not fall on even 1 kHz frequency steps. Some users can be identified by their chosen frequency offset, such as one network's common use of XXXX.6 kHz frequencies. Some digital modes must be tuned to within a few tens of Hz at worst, meaning you must get the frequency correct to get the decode, and to get the frequency correct you must have it reported to you correctly.

And then there is the identification of oddities and unknowns. A couple of years ago I found a periodic tone on a frequency, just a short tone every few seconds. A month or so later I found another tone with a different duration, repetition period, and frequency. After that I found a couple more, each of different lengths and periods. At first glance in the log they are unrelated: no correlation in times of day, pulse durations, pulse repetition periods, or basic frequency. However, looking closer at the frequencies revealed that each one was 67 Hz high, making a good possibility that they are all related in some way, possibly from the same transmitter.
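That kind of pattern is easy to check for in a log once you have accurate frequencies. A short Python sketch, reading "67 Hz high" as 67 Hz above the nearest whole kilohertz (my interpretation); the frequencies below are made-up examples, not the actual log entries:

```python
# Scan a log of "unknown tone" frequencies for a shared offset from the
# nearest whole kHz. Frequencies are hypothetical examples in Hz.

logs_hz = [6_783_067, 8_992_067, 11_175_067, 15_016_067]

def offset_from_khz(freq_hz: int) -> int:
    """Signed offset, in Hz, from the nearest whole kilohertz."""
    off = freq_hz % 1000
    return off - 1000 if off > 500 else off

offsets = {offset_from_khz(f) for f in logs_hz}
# A single shared offset (here, {67}) hints the signals may be related.
```

Note this only works if the logged frequencies are genuinely accurate to a few Hz; with a drifting, uncalibrated readout the shared 67 Hz offset would be smeared out and invisible.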
So yes, there are times when great frequency accuracy, real accuracy and not just a lot of digits on the readout, is of use. Does the average BC SWL need it? Probably not, but it does not hurt at all to have the capability. And there are certainly types of monitoring that can benefit from high-resolution frequency reporting.
T!