How does a receiver compensate for frequency deviation?

Status
Not open for further replies.

RFFR

Member
Joined
Jul 19, 2017
Messages
26
So let's say that I'm using DMR or P25 with some cheap handheld radios. Let's say the authorized frequency is 150.55 MHz (totally pulling that out of the air). Because the handhelds may not be calibrated and there may be frequency deviation, let's assume when the handheld transmits, it actually transmits on something like 150.5525 MHz. How does the receiver, whether base station or a handheld operating in direct mode, receive the signal? Does the receiver "listen" within a certain bandwidth and just treat any signal within that bandwidth as a valid signal?

I've been using RTLSDR and HackRF to analyze local spectrum and I'm finding some frequencies that don't correlate to any FCC records, but seem to be just off by a few Hz or even a few kHz. I posted here before and I was informed that actual center freqs can deviate because of cheap crystals or other factors. So, I'm trying to determine how receivers accommodate that.

Thanks
 

KE5MC

Member
Premium Subscriber
Joined
Dec 19, 2002
Messages
1,235
Location
Lewisville, TX
I would say the receiver tolerates frequency errors rather than compensating for them. The receiver's bandwidth is set to the bandwidth of the transmitted signal based on FCC requirements, plus some added bandwidth so nothing is missed and a small amount of frequency error is tolerated. Digital modes would have tighter tolerance on transmit than analog. Analog would start to sound bad but remain usable, where digital would corrupt sooner and nothing would decode.

Just my thoughts without digging into the specifications and alignment procedures for a particular system and radio.

Mike
 

jim202

Member
Joined
Mar 7, 2002
Messages
2,731
Location
New Orleans region
There are a number of things that can control the frequency your radio receiver thinks it is receiving. The first is the local oscillator in your receiver. The second is the IF of the receiver. Then there is even the variation of the voltage feeding the local oscillator. A good receiver will have a voltage regulator to keep voltage changes from causing frequency errors. It should also have some sort of temperature compensation to try to keep the receiver on frequency. Both high temps and low temps can throw off the receiver frequency control.

Any drift or deviation in the above can affect what frequency your receiver is actually trying to listen to. The higher the frequency you're trying to receive, the more critical any drift or deviation from the designed center of operation becomes.

Even the age of the receiver's different components can have some effect on the frequency accuracy. The lower the cost of your receiver, the more frequency error you can expect. Most of the lower-cost receivers do not provide any correction of the local oscillator frequency to bring the receiver back onto the correct received frequency.

This may not be what you're wanting to hear, but it's the battle a receiver constantly fights. The better the design, the more stable your receiver will be under different conditions.
 

jonwienke

More Info Coming Soon!
Joined
Jul 18, 2014
Messages
13,416
Location
VA
Does the receiver "listen" within a certain bandwidth and just treat any signal within that bandwidth as a valid signal?

Yes. FM Narrow (and most digital) uses a 12.5 kHz wide channel. The nominal frequency is the center of that channel. Any signal within that 12.5 kHz slice of spectrum will be received by the radio. If a signal is off by 1 kHz, it generally doesn't matter--it will still fall within the 12.5 kHz channel.
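
To put rough numbers on that, here's a quick Python sketch of the half-channel check (the 12.5 kHz figure is from above; the function and values are just for illustration, and this ignores the signal's own occupied bandwidth, which comes up later in the thread):

```python
# Is an off-frequency carrier still inside its 12.5 kHz channel?
CHANNEL_WIDTH_HZ = 12_500  # FM narrow / most digital voice

def within_channel(nominal_hz: float, actual_hz: float) -> bool:
    """True if the carrier sits within half a channel of nominal."""
    return abs(actual_hz - nominal_hz) <= CHANNEL_WIDTH_HZ / 2

# The OP's example: licensed on 150.5500 MHz, actually on 150.5525 MHz.
print(within_channel(150.5500e6, 150.5525e6))  # True: 2.5 kHz < 6.25 kHz
```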
 

wa8pyr

Technischer Guru
Staff member
Lead Database Admin
Joined
Sep 22, 2002
Messages
7,008
Location
Ohio
So let's say that I'm using DMR or P25 with some cheap handheld radios. Let's say the authorized frequency is 150.55 MHz (totally pulling that out of the air). Because the handhelds may not be calibrated and there may be frequency deviation, let's assume when the handheld transmits, it actually transmits on something like 150.5525 MHz. How does the receiver, whether base station or a handheld operating in direct mode, receive the signal? Does the receiver "listen" within a certain bandwidth and just treat any signal within that bandwidth as a valid signal?

I've been using RTLSDR and HackRF to analyze local spectrum and I'm finding some frequencies that don't correlate to any FCC records, but seem to be just off by a few Hz or even a few kHz. I posted here before and I was informed that actual center freqs can deviate because of cheap crystals or other factors. So, I'm trying to determine how receivers accommodate that.

If I read your post correctly, I think what you're after is how frequency drift (i.e., an off-frequency transmitter) affects reception, so here goes. . . .

First, with Frequency Modulation, the frequency you're receiving will vary by up to X amount (typically 75 kHz for FM broadcast, 5 kHz for land mobile FM, and 2.5 kHz for land mobile narrow-band FM) from the assigned frequency. So, a transmitter on 154.400 MHz with 2.5 kHz narrowband deviation swings between 154.3975 and 154.4025 MHz.
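
In code, that swing is just center plus or minus the peak deviation (a Python sketch with the narrowband numbers above, purely illustrative):

```python
# Band edges swept by an FM carrier: center +/- peak deviation.
def swing_edges(center_hz: float, deviation_hz: float):
    return center_hz - deviation_hz, center_hz + deviation_hz

lo, hi = swing_edges(154.400e6, 2_500)  # narrowband land mobile FM
print(f"{lo / 1e6:.4f} to {hi / 1e6:.4f} MHz")  # 154.3975 to 154.4025 MHz
```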

If the transmitter is on or very close to the assigned frequency, the signal will be within the passband of the receiver (for example, 2.5 kHz either side of the assigned frequency for 2.5 kHz deviation) and the receiver will be able to demodulate it without trouble. However, the farther off frequency the transmitter is, the farther outside the receiver's passband it will be, leaving less signal to demodulate and a more garbled result. The receiver can't compensate for this.

In a way, you could think of it like a flashlight (the transmitter) shining through a hole the same size as the flashlight (the passband filter in the receiver). When the flashlight is correctly lined up with the hole (on frequency), all of the light (signal) gets through. As the flashlight is moved farther to one side or the other (off frequency) less of the light (signal) gets through.

With digital signals, there is an acceptable bit error rate for proper decoding of the signal. The farther off frequency the transmitter is, the greater the bit error rate, and consequently the more garbled the signal will be.

FCC rules allow a certain amount of drift; it happens naturally as components in the transmitter age. At one time transmitter alignment was required annually (it still is depending on the radio service), but now as a practical matter a radio only gets aligned if it gets out of tolerance. "Best practices" typically call for periodic alignment no matter what (at least every other year is a good start), but it doesn't always get done.
 

paulmohr

Member
Joined
Jul 12, 2017
Messages
170
Location
Adrian MI
I am going to go with what Jon said, and here is an easy way to see what he is talking about:

I am not familiar with HackRF, but I am going to assume it is similar to SDR# and SDR Console. Open the program and find where to switch the modes. Should be FMN, AM, CW, FMW and so on. Now look at the screen where it shows the frequency peaks and the tuning bar. It is the bar that runs from the top to the bottom that you use to "lock" on to a frequency. Now toggle the modes from FMN to FMW and CW. Watch that bar as you switch them. FMW will be very wide, FMN will be narrow and CW will be very narrow. THAT is the chunk of the frequency spectrum it looks for the signal on.

Now lock onto a frequency and shift the tuning bar from left to right slightly. You will see that as long as that signal is within that bar, it will receive it, even if the peak is not dead center.

I am not sure about this, but I would think that not having the peak dead center, or in your case the receiver not being calibrated properly, would probably affect how well it receives the signal. So by not having the radio calibrated properly you would still get the frequency from the other radio, but it may affect your range. This is a total guess on my part though; I am not completely sure how all this stuff works either, lol.
 

RFFR

Member
Joined
Jul 19, 2017
Messages
26
Yes. FM Narrow (and most digital) uses a 12.5 kHz wide channel. The nominal frequency is the center of that channel. Any signal within that 12.5 kHz slice of spectrum will be received by the radio. If a signal is off by 1 kHz, it generally doesn't matter--it will still fall within the 12.5 kHz channel.

Thanks for the reply. So if the channel bandwidth is 12.5 kHz, does that mean that a frequency will still "work" anywhere within 6.25 kHz above or below the designated center freq, and that it just might not be as strong the further away it is?
 

jonwienke

More Info Coming Soon!
Joined
Jul 18, 2014
Messages
13,416
Location
VA
A typical transmission will use most, but not all, of the channel width, say 10 kHz. If it drifts too far off-center, it will start bleeding over into the adjacent channel. This will degrade signal quality because not all of it is within the intended channel, and it will start interfering with the adjacent channel. In this example, as long as the center freq is within 1.25 kHz of nominal [(12.5 - 10) / 2 = 1.25], everything is still OK. But if you deviate more than that, signal quality will drop and interference will rise quickly.
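
In Python terms, that margin works out like this (same illustrative figures as above):

```python
# Max center-frequency error before the signal leaves its channel.
def drift_margin_hz(channel_width_hz: float, occupied_hz: float) -> float:
    return (channel_width_hz - occupied_hz) / 2

print(drift_margin_hz(12_500, 10_000))  # 1250.0 Hz = 1.25 kHz either way
```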
 

RFFR

Member
Joined
Jul 19, 2017
Messages
26
A typical transmission will use most, but not all, of the channel width, say 10 kHz. If it drifts too far off-center, it will start bleeding over into the adjacent channel. This will degrade signal quality because not all of it is within the intended channel, and it will start interfering with the adjacent channel. In this example, as long as the center freq is within 1.25 kHz of nominal [(12.5 - 10) / 2 = 1.25], everything is still OK. But if you deviate more than that, signal quality will drop and interference will rise quickly.

Interference with the adjacent channel would depend on an adjacent channel being in use, correct? So if nothing was transmitting anywhere near the intended channel, the signal might be degraded, but no interference would occur. Am I understanding that correctly?
 

RFFR

Member
Joined
Jul 19, 2017
Messages
26
First, with Frequency Modulation, the frequency you're receiving will vary by up to X amount (typically 75 kHz for FM broadcast, 5 kHz for land mobile FM, and 2.5 kHz for land mobile narrow-band FM) from the assigned frequency. So, a transmitter on 154.400 MHz with 2.5 kHz narrowband deviation swings between 154.3975 and 154.4025 MHz.

Thank you for the reply. So if I'm seeing power identified as a frequency that is perhaps 2.5 kHz above/below my intended center freq, there is a possibility of it being correct but just suffering from frequency drift? Should it be higher than that, say 7 kHz, is it just better to assume that I'm looking at an entirely different center frequency, which itself may or may not be experiencing frequency drift?
 

WA0CBW

Member
Premium Subscriber
Joined
Dec 8, 2011
Messages
1,635
Location
Shawnee Kansas (Kansas City)
Also look up the Code of Federal Regulations (47 CFR 22.355, or 90.213 for Part 90 radios) for how close to the assigned frequency a transmitter must be to remain in compliance. It is usually measured in "ppm," or parts per million. For example, if the requirement is 10 parts per million and the frequency is 150.00 MHz, then the tolerance would be 10 times 150, or +/- 1500 hertz. How far the equipment can drift off frequency and still be understood is dependent on the design of the radio.
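
The ppm arithmetic is simple enough to sketch (example numbers from above):

```python
# ppm tolerance in absolute hertz: error = frequency * ppm / 1e6.
def tolerance_hz(freq_hz: float, ppm: float) -> float:
    return freq_hz * ppm / 1e6

print(tolerance_hz(150.00e6, 10))  # 1500.0 -> +/- 1500 Hz
# Handy shortcut: MHz times ppm gives Hz directly (150 * 10 = 1500).
```
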
BB
 

jonwienke

More Info Coming Soon!
Joined
Jul 18, 2014
Messages
13,416
Location
VA
Thank you for the reply. So if I'm seeing power identified as a frequency that is perhaps 2.5 kHz above/below my intended center freq, there is a possibility of it being correct but just suffering from frequency drift?

Yes. But keep in mind that if you are using a cheap SDR dongle that does not have a TXCO, it's likely that the error is the SDR being off-frequency, rather than the transmitter. Cheap SDRs are typically 20-30PPM off-frequency, and I've seen as much as 60PPM. If you get one with a TXCO, then they are typically within 1PPM of nominal frequency.
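
For a feel of what those ppm figures mean in absolute terms, a throwaway sketch (the 1/30/60 ppm values are the ones quoted above):

```python
# Absolute tuning error for typical dongle ppm specs.
def ppm_error_hz(freq_hz: float, ppm: float) -> float:
    return freq_hz * ppm / 1e6

for ppm in (1, 30, 60):
    print(f"{ppm:2d} ppm at 150 MHz -> {ppm_error_hz(150e6, ppm):5,.0f} Hz")
# 60 ppm at 150 MHz is 9,000 Hz -- most of a narrowband channel,
# and entirely on the receive side.
```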

Also, you can't assume that no interference is occurring just because nobody is transmitting on an adjacent channel. Common courtesy (and laws and regulations) says you should not splatter your signal across multiple channels.
 

wa8pyr

Technischer Guru
Staff member
Lead Database Admin
Joined
Sep 22, 2002
Messages
7,008
Location
Ohio
Thanks for the reply. So if the channel bandwidth is 12.5 kHz, does that mean that a frequency will still "work" anywhere within 6.25 kHz above or below the designated center freq, and that it just might not be as strong the further away it is?

Not really. Authorized channel bandwidth and occupied bandwidth are different things, and both can vary depending on the frequency band and transmitter power.

It depends upon the receiver you're using (scanners have less filtering and tend to have slightly wider bandwidth than "real" radios), but if the transmitter is much more than 2 kHz off frequency, you can expect the signal you're receiving to start getting pretty nasty.

Channel spacing varies depending upon the band (for VHF it's 7.5 kHz and UHF it's 12.5 kHz) and refers to how far apart the center frequencies of two adjacent channels are. For example, 460.125 and 460.1375 are adjacent UHF channels on 12.5 kHz channel spacing.

However, that doesn't mean each channel actually has 12.5 kHz of bandwidth to use. The channel bandwidth authorized by FCC rules is approximately 10-15% less (it depends on the band) to prevent interference to/from other users on adjacent channels.

Occupied bandwidth is a more realistic number; narrow band transmitters use 2.5 kHz deviation, which means the frequency varies no more than 2.5 kHz above and below the licensed frequency. This gives an occupied bandwidth more like 5 kHz (maximum has to be less than 8.5 kHz).

FCC Part 90 rules (90.213) require a transmitter in the VHF band to be stable to within 2.5 parts per million (about 375 Hz at 150 MHz), and in the UHF band it's 1.0 part per million (about 450 Hz at 450 MHz). Once it gets outside those parameters it's going to cause interference and won't be properly received.
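
To put the earlier 2.5 kHz / 7 kHz question in these terms, here's a hypothetical triage helper (the function and thresholds are mine for illustration, not from the rules):

```python
# Rough triage of an observed offset against a ppm tolerance and the
# channel spacing discussed above. Purely illustrative thresholds.
def classify_offset(freq_hz: float, offset_hz: float, ppm: float,
                    spacing_hz: float) -> str:
    tolerance = freq_hz * ppm / 1e6
    if abs(offset_hz) <= tolerance:
        return "within tolerance: plausibly the licensed frequency"
    if abs(offset_hz) < spacing_hz / 2:
        return "out of tolerance but nearest this channel: drift, or your SDR"
    return "closer to another channel: likely a different assignment"

print(classify_offset(150.55e6, 2_500, 2.5, 7_500))  # drift, or your SDR
print(classify_offset(150.55e6, 7_000, 2.5, 7_500))  # different assignment
```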
 

jonwienke

More Info Coming Soon!
Joined
Jul 18, 2014
Messages
13,416
Location
VA
Thanks for the reply. So if the channel bandwidth is 12.5 kHz, does that mean that a frequency will still "work" anywhere within 6.25 kHz above or below the designated center freq, and that it just might not be as strong the further away it is?

No. The receiver would be missing the half of the signal outside the channel, and you'd have severely distorted audio at best.
 

majoco

Stirrer
Joined
Dec 25, 2008
Messages
4,283
Location
New Zealand
Great group of replies, but in general they haven't answered your initial question:
How does the receiver compensate for frequency deviation?

Now, I suppose you mean the transmitter not being accurately on its assigned frequency (though it may also be 'drift' in your receiver); 'deviation' is the frequency change produced by the transmitted audio modulation, which is what we are trying to receive.

The receiver works basically by changing the incoming frequency (that you have dialled in) down to a different constant frequency where most of the amplification is done. The frequency is changed by mixing it with another frequency generated in your radio, also controlled by the dial frequency, but this frequency may not be as tightly controlled as the one in the transmitter.

The amplified signal goes to a detector appropriate to the type of modulation you have selected (AM, FM, FMN and so on), and then on to the audio amplifier and speaker for you to hear.

But this is where it gets tricky... a little bit of the pre-detector signal goes to a 'frequency discriminator', where it is applied to a pair of tuned circuits, one tuned slightly above the frequency and one slightly below - the high one gives out a positive voltage and the low one a negative voltage, and the two are added together. So if the signal is right on frequency, the output is zero; if it's high, the output is positive; and of course if it's low, the output is negative.

This voltage is fed back to the oscillator that changes the incoming frequency - the oscillator is designed so that this voltage can shift its frequency slightly, bringing the frequency discriminator's output voltage back to zero.

The whole thing is called 'automatic frequency control' and has been around for years, ever since broadcast VHF FM receivers suffered from 'drift' - you'll often see a button with "AFC" on it. Some even have a tuning indicator meter which is centre-zero and swings from one side to the other as you tune through a station. Pushing the AFC button should bring the meter to the centre.
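
A toy version of that feedback loop in Python (all numbers invented; a real AFC works on voltages inside the radio, not a software variable):

```python
# Toy automatic frequency control: a discriminator measures how far the
# signal sits from channel center, and the LO is nudged to cancel it.
def afc(tx_offset_hz: float, loop_gain: float = 0.5, steps: int = 8) -> None:
    lo_correction_hz = 0.0
    for _ in range(steps):
        error_hz = tx_offset_hz - lo_correction_hz  # discriminator output
        lo_correction_hz += loop_gain * error_hz    # steer the LO
        print(f"residual error: {error_hz:7.1f} Hz")

afc(1_500)  # a transmitter 1.5 kHz high gets pulled in step by step
```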
 

paulears

Member
Joined
Oct 14, 2015
Messages
786
Location
Lowestoft - UK
I'm amazed nobody has mentioned how inaccurate SDR receivers are? I have three, ranging from 10 quid to about forty quid, and they all miss the mark accuracy-wise; while one is always a bit high, the others are random. The SDR software even has an adjustment box so you can tweak things until they read the right frequency. One of mine is over 12.5 kHz away from the real frequency, so until I realised that, I couldn't understand why some busy frequencies it could hear didn't pop up on my proper radios.

Real radios have amazingly good frequency stability nowadays.

Their ability to tolerate off-frequency signals in practice really depends on the receiver bandwidth. Going back a few years to when we channelised in 25 kHz steps, we had some problems when 12.5 kHz channels started being used. As we are using deviation from a centre frequency to carry the message, rather than amplitude modulation, a 12.5 kHz radio, with its reduced deviation, just sounds quieter on a radio set up for 25 kHz. However, the bandwidth usually meant that a 12.5 kHz radio on the next channel up, with its lower deviation, still fell within the passband, so a radio tuned to 165.100 MHz heard the radio on 165.1125 MHz. On some radios it was perfectly clean; on others it distorted on loud speech, as the signal crept into the 'too far away' area. On some radios, as soon as the signal went too far away, it simply cut out. Icom amateur sets didn't do this, but their commercial radios had much tighter filtering, and if the signal went outside the passband, it cut out. The Chinese radios so popular now have frequency steps that go down to 5 kHz, but the filters are quite wide, so they cannot reject strong signals 'next door'. More expensive radios can - which makes them better for professional users, but for amateur users, not just the hams, being able to hear things off frequency isn't always bad.

Most radios now have good enough filtering that a 12.5 kHz-capable radio can be tuned to the 5 and 6.25 kHz steps one away - up or down - but most cut off anything more than 10 kHz away.

SDR receivers have almost no filtering at all, so they are very open - add to this their dreadful accuracy and you see issues that are not really there. A repeater on 453.2375 MHz locally shows as 453.2382 MHz on my frequency counter, but I suspect, based on its spec, that it is really on .2375 and my frequency counter is simply not accurate enough. My best SDR says it is on 453.248 MHz, which most people would round up to 453.250 - a proper legit channel, but it's not!

A proper radio MUST be precise in frequency - there are specifications set down by authorities in individual countries worldwide - so it makes sense to trust the radio and assume any errors shifting things 5 kHz away from the real frequency are NOT coming from the transmitter, but from inaccurate monitoring equipment. If you have local transmitters that you think are accurate - like the coastguard - see what the SDR reports as their frequency, and then adjust the offset so it reads the proper frequency, like 156.800 MHz. If the coastguard appear to be on 156.807, I would assume this is NOT them, it's you! If I set an offset on my SDRs, by the next day that offset will be wrong again, so redo the calibration against a known frequency whenever you need your results to be accurate.
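
The arithmetic behind that calibration is easy to sketch (a hypothetical helper; 156.800 MHz is the known-good marine channel from the example above):

```python
# Derive an SDR's ppm correction from a reference of known frequency.
def ppm_correction(true_hz: float, measured_hz: float) -> float:
    """Positive result: the SDR reads high and must be tuned down."""
    return (measured_hz - true_hz) / true_hz * 1e6

print(round(ppm_correction(156.800e6, 156.807e6), 1))  # about 44.6 ppm
# Enter this in the SDR software's ppm box, then re-check later: cheap
# oscillators wander with temperature, so the figure won't stay put.
```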
 

N4GIX

Member
Premium Subscriber
Joined
May 27, 2015
Messages
2,124
Location
Hot Springs, AR
I'm amazed nobody has mentioned how inaccurate SDR receivers are? I have three, ranging from 10 quid to about forty quid, and they all miss the mark accuracy-wise; while one is always a bit high, the others are random.

Not all SDR receivers are created equal. For example, my Elecraft KX3 is much more stable than my Collins KWM2A. In fact it is as stable and accurate as my service monitor! :lol:

Now granted, my Elecraft KX3 costs quite a bit more than "a few quid..." :D

Base price for a KX3 begins at $1049 assembled. With the options I added it comes to just shy of $2400. :cool:
 

budevans

Member
Joined
Feb 2, 2009
Messages
2,175
Location
Cleveland, Ohio
So let's say that I'm using DMR or P25 with some cheap handheld radios. Let's say the authorized frequency is 150.55 MHz (totally pulling that out of the air). Because the handhelds may not be calibrated and there may be frequency deviation, let's assume when the handheld transmits, it actually transmits on something like 150.5525 MHz. How does the receiver, whether base station or a handheld operating in direct mode, receive the signal? Does the receiver "listen" within a certain bandwidth and just treat any signal within that bandwidth as a valid signal?

I've been using RTLSDR and HackRF to analyze local spectrum and I'm finding some frequencies that don't correlate to any FCC records, but seem to be just off by a few Hz or even a few kHz. I posted here before and I was informed that actual center freqs can deviate because of cheap crystals or other factors. So, I'm trying to determine how receivers accommodate that.

Thanks

Do the radio makers still use PLL (phase-locked loop) chips in conjunction with a VCO (voltage-controlled oscillator) to handle frequency stability?
 

jonwienke

More Info Coming Soon!
Joined
Jul 18, 2014
Messages
13,416
Location
VA
I'm amazed nobody has mentioned how inaccurate SDR receivers are?

I guess you missed this:
"But keep in mind that if you are using a cheap SDR dongle that does not have a TXCO, it's likely that the error is the SDR being off-frequency, rather than the transmitter. Cheap SDRs are typically 20-30PPM off-frequency, and I've seen as much as 60PPM. If you get one with a TXCO, then they are typically within 1PPM of nominal frequency."
 

GTR8000

NY/NJ Database Guy
Database Admin
Joined
Oct 4, 2007
Messages
15,482
Location
BEE00
Yes. But keep in mind that if you are using a cheap SDR dongle that does not have a TXCO, it's likely that the error is the SDR being off-frequency, rather than the transmitter. Cheap SDRs are typically 20-30PPM off-frequency, and I've seen as much as 60PPM. If you get one with a TXCO, then they are typically within 1PPM of nominal frequency.

TCXO (Temperature-compensated crystal oscillator), not "TXCO".
 