TV Broadcast Question

BinaryMode

Blondie Once Said To Call Her But Never Answerd
Joined
Jul 3, 2023
Messages
887
Location
75 parsecs away
With the advent of digital modulation for television, why on earth did they not choose to use frequency hopping spread spectrum?

Whether FHSS or DSSS using at least 2.5 MHz of bandwidth, you'd think it would be a lot more beneficial in terms of minimizing interference and whatnot. Also, in my opinion it should be in the VHF band, since that band seems to perform quite well within foliage and whatnot.

I've been of the opinion for a long time that public safety should use spread spectrum as well. There should be a 2.5 MHz or more segment for public safety in the VHF and UHF bands for FHSS or DSSS. Maybe use Codec 2.
 

G7RUX

Active Member
Joined
Jul 14, 2021
Messages
581
Well, there's a fair bit to unpick here but the main take-home is that it is unnecessary, spectrally inefficient and leads to complex and expensive receivers and transmitter networks.

The main issue with TV broadcasting that affects the quality of the received service is usually multipath propagation.

Transmitting TV in the VHF bands would take up most of the available spectrum and would lead to issues with reuse distance.

There are many more reasons in play but these are some headlines.
 

krokus

Member
Premium Subscriber
Joined
Jun 9, 2006
Messages
6,144
Location
Southeastern Michigan
The amount of data required for the digital TV formats takes up the whole bandwidth of the signal. Granted, they can update the codecs, but that is still a lot of data.
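To put a rough number on it, here's a back-of-the-envelope sketch (my own arithmetic using the commonly cited 8VSB parameters, so treat the result as approximate) of how much payload an ATSC 1.0 channel actually carries in its 6 MHz:

```python
# Rough sketch of ATSC 1.0 (8VSB) net payload in a 6 MHz channel.
# Parameters are the commonly cited ones; treat the result as approximate.
symbol_rate = 10.762e6       # 8VSB symbols per second
bits_per_symbol = 3          # 8 levels -> 3 bits per symbol
trellis_rate = 2 / 3         # inner trellis code rate
rs_rate = 187 / 207          # outer Reed-Solomon (207,187) code rate
sync_overhead = 312 / 313    # roughly one sync segment per 313-segment field

payload_bps = symbol_rate * bits_per_symbol * trellis_rate * rs_rate * sync_overhead
print(f"Approximate ATSC 1.0 payload: {payload_bps / 1e6:.2f} Mbps")  # about 19.4 Mbps
```

That roughly 19 Mbps is what one HD program plus a few subchannels already consumes, so there isn't spare capacity to trade away for spreading.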
 

gmclam

Member
Premium Subscriber
Joined
Sep 15, 2006
Messages
6,425
Location
Fair Oaks, CA
What are you trying to accomplish by using spread spectrum? krokus stated the main point - bandwidth. TV signals are wide bandwidth and tend not to leave "spaces".

I find that you've put "digital television" into a single bucket. It's hardly that simple. Are you asking about ATSC 1.0, ATSC 3.0, protocols used by the cable TV industry, or protocols used outside of the USA? They vary.

ATSC 1.0 was defined a very long time ago by today's standards. The whole 8VSB modulation method is the subject of lots of scrutiny. The modulation aspect of ATSC 3.0 seems to address those issues. The problem with 3.0 is that DRM has been included in the overall protocol. It needs to be killed if so-called NextGen TV is ever going to get off the ground (presently it is on the same path as CD stores).
 

prcguy

Member
Premium Subscriber
Joined
Jun 30, 2006
Messages
16,625
Location
So Cal - Richardson, TX - Tewksbury, MA
It takes a certain amount of bandwidth to provide a certain level of video quality no matter what the format: AM modulation, QPSK, ATSC, etc. The current ATSC TV channels take up the same 6MHz bandwidth as the original AM modulated channels but with higher quality and a bunch of lower quality channels tossed into the same BW. FHSS or DSSS will not improve that. All terrestrial TV transmitters have brick-wall bandpass filters that allow them to meet FCC specs and not go outside their channel allocation.

Once you go digital, future technology will help squeeze in more channels or higher quality within the same allotted bandwidth, and that's usually due to improvements in video compression. Video compression is the sole reason DirecTV and Dish Network exist: they can squeeze a bunch of channels into one carrier where it used to take a full 36MHz satellite transponder just to deliver one standard definition NTSC video, and DirecTV sticks five HD videos within the same BW. I forget how many SD videos they can fit into their 27MHz transponders, but it started out with about 11 when DirecTV first launched, and as video compression technology improves the satellite carriers buy all new racks of equipment every few years, spending gabillions of $$ just to stick one or two more channels into a transponder. Now I think they can fit about 16 SD videos in the same 27MHz as they started with in 1994, using the latest video compression technology.
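For illustration only (these are assumed, round link parameters, not DirecTV's actual numbers), the arithmetic works out roughly like this:

```python
# Hypothetical illustration: how many SD streams fit in one satellite transponder
# as the video codec improves. Link parameters below are assumptions, not DirecTV's.
def streams_per_transponder(symbol_rate_msps, bits_per_symbol, fec_rate, per_stream_mbps):
    """Usable transponder throughput divided by the bitrate of one video stream."""
    throughput_mbps = symbol_rate_msps * bits_per_symbol * fec_rate
    return throughput_mbps, int(throughput_mbps // per_stream_mbps)

# Assume ~20 Msym/s QPSK (2 bits/symbol) with rate-3/4 FEC in a 27 MHz transponder.
for codec, mbps in [("mid-1990s MPEG-2 encoder", 2.7), ("newer compression", 1.8)]:
    total, n = streams_per_transponder(20, 2, 0.75, mbps)
    print(f"{codec}: ~{total:.0f} Mbps usable -> about {n} SD streams")
```

Same transponder bandwidth, more streams; the gain comes entirely from the encoder, not from how the carrier is spread.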

The same may be true for ATSC but I'm not as familiar with that format as satellite DVB-S2, and it might not allow for the same type of video compression upgrades. But I'm certain you can fit more channels or more quality within the 6MHz ATSC TV channels now compared to when the format was launched, using newer video compression technology. And I'm also certain you can't fit the current lineup of TV channels in the 6MHz ATSC format into a 2.5MHz FHSS or DSSS channel, as those formats have little to do with packing more information into a finite amount of BW.
 
Last edited:

BinaryMode

Blondie Once Said To Call Her But Never Answerd
Joined
Jul 3, 2023
Messages
887
Location
75 parsecs away
It's funny Wi-Fi performs better than digital TV.

Seriously, lose a tiny bit of signal on OTA TV and it's garbage pixel crap. Meanwhile, I can use spread spectrum in Wi-Fi (that's what it is) and stream my rear end off, watching copious amounts of great programming all the while surfing the Internet...

Bottom line is the whole OTA TV thing is a MASSIVE engineering joke and it seems like it's getting worse!

Speaking of low IQ and engineering disasters, I accidentally just hit the forward button in this FireFreak browser, making all of my typed text go away. Me being the forward-thinking, intuitive person I am, I installed a text logger extension that saves my typed text. So all is well. But it raises questions on two points:

1) Why is common sense out the window with things in this day and age? I often ask simple things like, "why did you code it like that?! You took your ten little fingers and coded it like that!" Or, "why are you going 45 on a 35 MPH designated road thought up by traffic engineers?" You know, the little Neanderthal low IQ chimps that people often imitate and the things they do. Never mind the social media pant-hoot.

2) Why in hell do browsers today, which constantly get updated like every week it seems (for what, I don't know. What exactly are they improving that requires constant updates?), not have a built-in feature with a boolean option to log your text for such a disaster?

To tie my rant together with the subject matter: I think how the FCC does things, and how digital TV is and has been rolled out, notwithstanding APCO and P25, is kinda, well, stupid. In my opinion of course. Then again, I'm not some RF engineering genius or anything like that either. I just know that spread spectrum technology (invented by a woman during WWII) is a good sound solution when it comes to interference resolution, jamming mitigation and perhaps multipath signal mitigation. And no, it's not super expensive with today's Moore's law of miniaturization and IC improvements over the years. We're down to what now? Some 8 nm at the transistor level just in a computer CPU? Look how everyone has a Dick Tracy watch nowadays... Especially since that technology can be packed into a Motorola DTR radio or Nextel i355 phone of the not-so-distant past for Direct Connect, of which I have over 15 of those phones for SHTF comms. Never mind the advent of SDR technology. The DTR and i355 both use spread spectrum.

No, in my opinion digital TV should be spread spectrum, because as it stands right now you either get the signal full on or you don't. There's no wiggle room, which I believe DSSS or FHSS could offer. Again, look at Wi-Fi's spread spectrum scheme as an example. Now imagine sending that kind of spread spectrum out at broadcast TV wattage. The bandwidth for 802.11g is 20 MHz. And it performs better than digital TV, it seems. Then again people do have issues, and that's a whole other animal, never mind the 100 mW or so of wattage and consumer grade junk they're using. I use a commercial grade AP (Access Point) in the house made by Grandstream (has a built-in controller). So there's the major difference in case anyone wants to make a point about dismal home Wi-Fi. The point is that it seems like Wi-Fi technology can stream and do more, better, than digital TV. Again, in my own opinion. And the point is that Wi-Fi helps accomplish that through spread spectrum.

Anyway, time to squeeze a squid for more ink...
 

prcguy

Member
Premium Subscriber
Joined
Jun 30, 2006
Messages
16,625
Location
So Cal - Richardson, TX - Tewksbury, MA
Spread spectrum for WiFi is a good thing; for one, it allows lots of users to occupy the limited BW with less interference to each other. It's got some built-in security. But none of those things are needed for OTA TV, because there is no competition for the same TV channel by other users in a given area; it belongs to the licensed TV station in that area. OTA TV doesn't need to be secure; it's offered for free. Otherwise spread spectrum doesn't have any benefit I can see for OTA TV. Another digital format/video compression scheme, modulation or FEC might work better, but the BW will still be whatever you can fit into 6MHz.
 

BinaryMode

Blondie Once Said To Call Her But Never Answerd
Joined
Jul 3, 2023
Messages
887
Location
75 parsecs away
I wasn't talking about security at all. Never mentioned it. My premise was what I wrote, and I'll quote:



Yours Truly said:
I just know that spread spectrum technology (invented by a woman during WWII) is a good sound solution when it comes to interference resolution, jamming mitigation and perhaps multipath signal mitigation.


I just feel in my heart of hearts that perhaps, just perhaps if the signal were spread over a bandwidth in FHSS or DSSS fashion the TV signal would be a bit more reliable, no?
 

prcguy

Member
Premium Subscriber
Joined
Jun 30, 2006
Messages
16,625
Location
So Cal - Richardson, TX - Tewksbury, MA
I wasn't talking about security at all. Never mentioned it. My premise was what I wrote, and I'll quote:






I just feel in my heart of hearts that perhaps, just perhaps if the signal were spread over a bandwidth in FHSS or DSSS fashion the TV signal would be a bit more reliable, no?
Probably not. And TV antennas are designed to cover specific 6MHz chunks of frequencies; if spread spectrum were used and the BW was wider, current TV antennas may not cover what's needed, resulting in degraded reception. If there were a benefit to using spread spectrum within the existing TV channel BW or for satellite TV it would have been considered or implemented, but it hasn't been.

However, it's great for two-way comms, providing a level of security that would be hard for the average person to defeat.
 

Citywide173

Member
Feed Provider
Joined
Feb 18, 2005
Messages
2,168
Location
Attleboro, MA
Seriously, lose a tiny bit of signal on OTA TV and it's garbage pixel crap. Meanwhile, I can use spread spectrum in Wi-Fi (that's what it is) and stream my rear end off, watching copious amounts of great programming all the while surfing the Internet...
If you had your TV within sight of the antenna tower, what do you think your loss rate would be? Unless you live basically under the transmitter, this statement is apples and oranges. What happens to your wifi signal if you go up the block from your router? Subsequently, what happens to the quality of any of those videos that you are streaming?
 

G7RUX

Active Member
Joined
Jul 14, 2021
Messages
581
A simple reason why "wifi performs better than digital TV" is that for video streaming over IP you will almost certainly be using a system with buffering...

As I and others have already pointed out, using spread spectrum techniques for TV transmission would not give sensible advantages.

One thing which is often misunderstood, misquoted and mis-named is spread spectrum itself...this is where the occupied bandwidth is wider than that occupied by the baseband data and this would be disadvantageous for TV transmission. In the UK for example we use a raster of 8 MHz channels; each used to contain one service on a PAL-encoded analog(ue) video system but with the digital TV system in use now provides around 24-40 Mbps of data capacity in the OFDM systems currently used, with this system providing very good handling of multipath interference. Given that acceptable quality MPEG4/H264/H265 coded video can be achieved with a bearer of 5-10 Mbps it is possible to fit half a dozen or so services into a single 8 MHz channel fairly comfortably. So, 40 Mbps in an 8 MHz channel so a "spreading ratio" of around 0.2.

Contrast this with GPS L1 C/A where the channel width is 24 MHz or so and the message bitrate is 50 bps...a 50 bps data stream is spread to take up 24 MHz channel width...so a "spreading ratio" of around 480000.
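To put those two cases side by side, here is a quick sketch using the round figures above (my own arithmetic, not from any spec):

```python
# "Spreading ratio" = occupied bandwidth / baseband data rate,
# using the round numbers quoted above.
def spreading_ratio(bandwidth_hz, data_rate_bps):
    return bandwidth_hz / data_rate_bps

dvb_t = spreading_ratio(8e6, 40e6)   # 8 MHz channel carrying ~40 Mbps
gps = spreading_ratio(24e6, 50)      # ~24 MHz occupied, 50 bps navigation message
print(f"DVB-T:      {dvb_t:.1f}      (channel is narrower than the data rate)")
print(f"GPS L1 C/A: {gps:,.0f} (data spread nearly half a million times wider)")
```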

Now, spread spectrum techniques can have advantages in dealing with multipath propagation but FHSS and DSSS don't really have much of an advantage here. THSS does nothing for multipath mitigation and CSS is extremely effective with multipath but is less resistant to channel noise, so interference can cause issues.

For this reason the DVB-T and -T2 specs use OFDM with a guard interval so multipath which arrives *within* the guard interval, typically from 1/4 to 1/32 of a symbol length, is *constructive* to the received and decoded signal and you can choose the length of this interval to cope with the typical multipath delays you see in an area.
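As a rough worked example (assuming the 8K-mode useful symbol length of 896 µs for an 8 MHz channel; treat the exact figures as approximate):

```python
# Guard-interval tolerance for DVB-T 8K mode in an 8 MHz channel.
# Echoes arriving within the guard interval add constructively to the decoded signal.
C_KM_PER_US = 0.3            # light travels ~0.3 km per microsecond
useful_symbol_us = 896       # 8K-mode useful symbol length, 8 MHz channel

for denom in (4, 8, 16, 32):
    guard_us = useful_symbol_us / denom
    max_excess_path_km = guard_us * C_KM_PER_US
    print(f"guard 1/{denom:<2}: {guard_us:6.1f} us -> echoes up to ~{max_excess_path_km:5.1f} km of extra path")
```

So a network operator can pick the guard fraction to match the multipath (or single-frequency-network) delays expected in the coverage area.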

Spread spectrum systems just aren't necessary for digital TV over terrestrial transmission.
 

gmclam

Member
Premium Subscriber
Joined
Sep 15, 2006
Messages
6,425
Location
Fair Oaks, CA
Interesting to combine these topics. LOL. Back in the day we had (E)PROMs. If they were erasable at all, it was via ultraviolet light. And when it came to programming them, where was the data coming from (a floppy disc?)? People who write code (myself included) were meticulous about getting it correct the first time because it was a literal hassle to update.

Now we have Flash devices which, while not invincible, can be re-programmed umpteen times. And we have high speed data connections with which to get the programming information for them. Companies these days seem to be more focused on getting something out the door, whether it works or not; rather than fully vet the code they are releasing ("let the consumer find our bugs").

Then there is some quirky idea that because the version number is higher it is an "update". It is NOT. It's a version change which may remove or alter features and functions of the prior version. Ironically, I rarely see true bugs fixed.

I presently operate one separate PC for each major Windows O/S since NT 4.0. The earlier the version, the better they run: faster, with lower memory requirements, just for starters. When I see Microsoft/etc. push some required reboot/"update", I find it not good. "So, you guys still can't get that code right?" Oh sure, there are patches and virus issues, but those should be addressed at the root level and not by forcing every PC in the world to reboot at the whim of a single company.
 

gmclam

Member
Premium Subscriber
Joined
Sep 15, 2006
Messages
6,425
Location
Fair Oaks, CA
I don't have an exact date of when the 525/60 TV system went into operation (for the public). Adding color (when the NTSC color standard was adopted) took place circa 1953. As of June 2009, that system was still in use in the USA. Even after the June 2009 shutdown, older NTSC (and B&W) sets could still function with the use of a converter. It's no accident that a lot of true engineers kept this system going and kept it backward compatible with virtually every TV made since the beginning of the industry.

ATSC 1.0 was created by the Grand Alliance (a consortium of 7 companies) that had a stake in what they were creating. Too much to get into here. Design by committee, with a little of this and a little of that. Enabling a digital protocol that could be used for higher definition and/or more channels and certainly more features. And a heck of a lot better than the prior analog HDTV system that had been approved for the USA but never implemented.

Just like you indicated where things these days seem to be constantly changed/"updated", ATSC is no different. It hadn't been on the air for 10 years before people were complaining about it, whether trying to receive signals on a mobile device (hand-held or automotive, for example), or just wanting to implement a better compression protocol. I think it was "hard-wired" to MPEG-2 because of those 7 companies that created it (each looking out for their own interests).

We've jumped over ATSC 2.0 and have 3.0 on the air in many markets across the USA. Fortunately 1.0 is still on the air too. The reasons I hate ATSC 3.0 have nothing to do with the technology (better modulation technique, allows other compression protocols, etc.), but with the addition of DRM. Digital Rights Management has NO PLACE in Free Over-The-Air television.

And before 3.0 has even become the default standard, we're talking about spread spectrum and other changes. What a mess.
 

BinaryMode

Blondie Once Said To Call Her But Never Answerd
Joined
Jul 3, 2023
Messages
887
Location
75 parsecs away
A simple reason why "wifi performs better than digital TV" is that for video streaming over IP you will almost certainly be using a system with buffering...

As I and others have already pointed out, using spread spectrum techniques for TV transmission would not give sensible advantages.

One thing which is often misunderstood, misquoted and mis-named is spread spectrum itself...this is where the occupied bandwidth is wider than that occupied by the baseband data and this would be disadvantageous for TV transmission. In the UK for example we use a raster of 8 MHz channels; each used to contain one service on a PAL-encoded analog(ue) video system but with the digital TV system in use now provides around 24-40 Mbps of data capacity in the OFDM systems currently used, with this system providing very good handling of multipath interference. Given that acceptable quality MPEG4/H264/H265 coded video can be achieved with a bearer of 5-10 Mbps it is possible to fit half a dozen or so services into a single 8 MHz channel fairly comfortably. So, 40 Mbps in an 8 MHz channel so a "spreading ratio" of around 0.2.

Contrast this with GPS L1 C/A where the channel width is 24 MHz or so and the message bitrate is 50 bps...a 50 bps data stream is spread to take up 24 MHz channel width...so a "spreading ratio" of around 480000.

Now, spread spectrum techniques can have advantages in dealing with multipath propagation but FHSS and DSSS don't really have much of an advantage here. THSS does nothing for multipath mitigation and CSS is extremely effective with multipath but is less resistant to channel noise, so interference can cause issues.

For this reason the DVB-T and -T2 specs use OFDM with a guard interval so multipath which arrives *within* the guard interval, typically from 1/4 to 1/32 of a symbol length, is *constructive* to the received and decoded signal and you can choose the length of this interval to cope with the typical multipath delays you see in an area.

Spread spectrum systems just aren't necessary for digital TV over terrestrial transmission.

I take it the U.S. is different from the UK? It seems like your system is better. Do they use CRC in any of this?


I presently operate one separate PC for each major Windows O/S since NT 4.0. The earlier the version, the better they run: faster, with lower memory requirements, just for starters. When I see Microsoft/etc. push some required reboot/"update", I find it not good. "So, you guys still can't get that code right?" Oh sure, there are patches and virus issues, but those should be addressed at the root level and not by forcing every PC in the world to reboot at the whim of a single company.

Yep, see my post here where I believe I talked about that. Also, this is why I just strip down Windows 10. Right now I use maybe 2.5 GB of memory. Unless of course I open FireFreak or play a game.

There are actually more CVEs in Windows 10 than Windows 7. And if I'm not mistaken, Windows 7 has been out longer than 10. Everyone (not me) thought Windows 10 would be the "last" OS. I laughed and said no. Shortly after, *surprise surprise*, Windows 11. From 8 to 11, all are meant for a tablet. It's like Redmond tossed the PC user right out the windows.
 
Last edited:

prcguy

Member
Premium Subscriber
Joined
Jun 30, 2006
Messages
16,625
Location
So Cal - Richardson, TX - Tewksbury, MA
Interesting to combine these topics. LOL. Back in the day we had (E)PROMs. If they were erasable at all, it was via ultraviolet light. And when it came to programming them, where was the data coming from (a floppy disc?)? People who write code (myself included) were meticulous about getting it correct the first time because it was a literal hassle to update.

Now we have Flash devices which, while not invincible, can be re-programmed umpteen times. And we have high speed data connections with which to get the programming information for them. Companies these days seem to be more focused on getting something out the door, whether it works or not; rather than fully vet the code they are releasing ("let the consumer find our bugs").

Then there is some quirky idea that because the version number is higher it is an "update". It is NOT. It's a version change which may remove or alter features and functions of the prior version. Ironically, I rarely see true bugs fixed.

I presently operate one separate PC for each major Windows O/S since NT 4.0. The earlier the version, the better they run: faster, with lower memory requirements, just for starters. When I see Microsoft/etc. push some required reboot/"update", I find it not good. "So, you guys still can't get that code right?" Oh sure, there are patches and virus issues, but those should be addressed at the root level and not by forcing every PC in the world to reboot at the whim of a single company.
This is a topic for another thread, but I had the Apple crowd bugging me for a good 30 years to get an Apple computer because bla, bla, bla. I ignored them until 2014 when they sat me down at work, stuck a MacBook in front of me, and I watched it performing some stuff while simultaneously doing some Windoz stuff faster than any Windoz machine I ever used. It was awesome and I ordered a MacBook Pro the next day and never looked back.
 

BinaryMode

Blondie Once Said To Call Her But Never Answerd
Joined
Jul 3, 2023
Messages
887
Location
75 parsecs away
If you had your TV within sight of the antenna tower, what do you think your loss rate would be? Unless you live basically under the transmitter, this statement is apples and oranges. What happens to your wifi signal if you go up the block from your router? Subsequently, what happens to the quality of any of those videos that you are streaming?


Reread what I said again please. You make it sound as if I don't know anything....

Yours Truly said:
No, in my opinion digital TV should be spread spectrum, because as it stands right now you either get the signal full on or you don't. There's no wiggle room, which I believe DSSS or FHSS could offer. Again, look at Wi-Fi's spread spectrum scheme as an example. Now imagine sending that kind of spread spectrum out at broadcast TV wattage. The bandwidth for 802.11g is 20 MHz. And it performs better than digital TV, it seems. Then again people do have issues, and that's a whole other animal, never mind the 100 mW or so of wattage and consumer grade junk they're using. I use a commercial grade AP (Access Point) in the house made by Grandstream (has a built-in controller). So there's the major difference in case anyone wants to make a point about dismal home Wi-Fi. The point is that it seems like Wi-Fi technology can stream and do more, better, than digital TV. Again, in my own opinion. And the point is that Wi-Fi helps accomplish that through spread spectrum.
 

davidgcet

Member
Premium Subscriber
Joined
Aug 17, 2010
Messages
1,342
Your wifi is bidirectional, with error correction and buffering. A TV station pumping out 100 kW ERP will talk out a LOT further than a home client could talk back to it, so requiring bidirectional OTA equipment would likely lessen the effective coverage area. And there was a huge stink when former VHF stations swapped to UHF and lost fringe zones in 2009; even some UHF stations were no longer being picked up in places analog had covered for decades. I knew the station engineers for two local stations here back then, and they were flooded with people griping because they had to upgrade to higher-gain antennas to keep getting them, but then lost other stations because the beamwidth was narrower, so it was harder to split an azimuth and get other areas.
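A crude free-space sketch of that asymmetry (made-up but plausible numbers, not any real station):

```python
import math

# Free-space path loss comparison: a high-power broadcast downlink vs. a
# low-power consumer device trying to talk back over the same path.
# All numbers are hypothetical, just to show the scale of the asymmetry.
def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def received_dbm(tx_erp_watts, distance_km, freq_mhz):
    tx_dbm = 10 * math.log10(tx_erp_watts * 1000)  # watts -> dBm
    return tx_dbm - fspl_db(distance_km, freq_mhz)

distance, freq = 50, 600   # 50 km path at 600 MHz (UHF TV range)
print(f"100 kW ERP station at the home:  {received_dbm(100_000, distance, freq):6.1f} dBm")
print(f"100 mW device back at the tower: {received_dbm(0.1, distance, freq):6.1f} dBm")
# The return path is roughly 60 dB (a million times) weaker.
```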
 

Citywide173

Member
Feed Provider
Joined
Feb 18, 2005
Messages
2,168
Location
Attleboro, MA
Reread what I said again please. You make it sound as if I don't know anything....
Even with increased power, the fact is that the further from the transmitter you are, the more signal degradation will occur, whether due to foliage, atmospherics, physical structures, etc. You could not possibly have a high enough output power to overcome some of these things without exceeding maximum permissible levels (IDLH). Additionally, as @davidgcet pointed out, error correction would have to be introduced for an optimal signal, similar to your router. What would the transmit power out of the TV set have to be for bi-directional communication? It would also require another change in equipment, from the current model TV to a TV capable of the technology involved, and I don't think consumers (or manufacturers for that matter) want to go through that again.
 

G7RUX

Active Member
Joined
Jul 14, 2021
Messages
581
I take it the U.S. is different than the UK? It seems like your system is better. Do they use CRC in any of this?
The US and UK/EU systems are different in general, yes, although various systems in the US do use DVB-T.

CRC is a very crude form of error detection for such systems, and DVB-T uses much more powerful and capable error detection and correction, which is where a lot of the advantages stem from. DVB-T uses an outer interleaver, a Reed-Solomon coding block and also punctured convolutional coding, which all together mean that the FEC system is very capable indeed.
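As a small illustration of the outer code's overhead (the RS(204,188) parameters are the DVB-T ones; the arithmetic is just mine):

```python
# DVB-T outer code: each 188-byte MPEG transport packet gets 16 Reed-Solomon
# parity bytes, giving an RS(204,188) block that can correct up to 8 byte errors.
n, k, t = 204, 188, 8
print(f"Outer code rate: {k}/{n} = {k / n:.3f} ({n - k} parity bytes, corrects {t} bytes per packet)")

# Combined with the punctured convolutional inner code, the overall FEC rate is:
for inner_rate in (1/2, 2/3, 3/4, 5/6, 7/8):
    print(f"inner rate {inner_rate:.3f} -> overall {inner_rate * k / n:.3f}")
```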
 