LAPD - 12-Hour Communications Outage

Status
Not open for further replies.

KMA367

Member
Premium Subscriber
Joined
Nov 21, 2002
Messages
1,040
Location
Redwood Coast, N Calif
Anyone notice this?

Apparently Mt Lee had a complete power outage caused when the back up generator was being tested.

Councilman calls for official's firing after LAPD communications outage - The Daily Breeze

Yes, "only" the voice radio from the Dispatch Centers was out for that period, and they put the system into what sounded like either a Level 3 or Level 2 "Fallback" mode, exactly as it is supposed to happen in that situation. I have no idea (and apparently he doesn't either) what this councilman means by "911 calls had to be answered manually," (how else does one answer a phone, robotically?) but the power outage had NO effect on the telephone system for either 9-1-1 or 7-digit calls. They were answered as usual by the Emergency Operators. I also have no idea why "some divisions recalled all officers from the field." The station, vehicle, and portable radios were all functional.

The RTOs (Radio Dispatchers) immediately relay call information to their respective station's Area Command Center radio console - I think they use MC3000s - via open phone lines they establish. The calls are voice broadcast from the station on either the regular or "Fallback" radio frequency for the division.

It's a very complex system of systems, and any system crash is of course a serious matter, which I'm sure a lot of people at LAPD, General Services, the Information Technology Agency, and elsewhere started looking into as soon as it started. Certainly there is a degradation of capabilities with a major outage like this, and for that reason LAPD has about a half dozen "fallback" levels and operational modes depending on which frequencies, systems, or subsystems go down completely or partially.
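Since those fallback levels keep coming up, here's a tiny sketch of the general idea of mapping "what's broken" to an operating mode. To be clear, the actual LAPD level definitions aren't spelled out here, so every name, number, and rule below is invented purely for illustration.

```python
# Hypothetical illustration only: LAPD's real fallback level definitions are not
# described here, so these names, levels, and rules are made up to show the idea
# of mapping subsystem outages to an operating mode.
from dataclasses import dataclass

@dataclass
class SystemStatus:
    dispatch_center_voice: bool   # can the dispatch centers key the repeaters?
    repeaters: bool               # are the division repeater sites on the air?
    mdc_data: bool                # is the mobile data channel up?

def fallback_level(s: SystemStatus) -> int:
    """Return a made-up fallback level: 0 = normal, higher = more degraded."""
    if s.dispatch_center_voice and s.repeaters and s.mdc_data:
        return 0   # normal operations
    if s.repeaters and not s.dispatch_center_voice:
        return 2   # stations broadcast calls themselves on the division frequency
    if not s.repeaters:
        return 3   # base/simplex only
    return 1       # partial degradation (e.g., data down, voice up)

# The outage described above: centers' voice path down, field radios still working.
print(fallback_level(SystemStatus(dispatch_center_voice=False, repeaters=True, mdc_data=True)))
```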

Councilman/Reserve Officer Englander would never have made it as a cop in the 1960s and early 70s when there were no MDCs, no repeaters, no "backup" frequencies or dispatch capabilities, no 9-1-1, no portable radios, and no computers but rather teletypes and pneumatic tubes to R&I Division for access to suspect/vehicle information.

But somebody's gotta be fired. That'll keep anything from ever going wrong again.
 
Last edited:

Confuzzled

Member
Joined
Aug 16, 2008
Messages
704
Councilman/Reserve Officer Englander would never have made it as a cop in the 1960s and early 70s when there were no MDCs, no repeaters, no "backup" frequencies or dispatch capabilities, no 9-1-1, no portable radios, and no computers but rather teletypes and pneumatic tubes to R&I Division for access to suspect/vehicle information.

Yeah, I love some of the opening scenes of 'Adam-12' where they type the calls on a piece of paper and send them down a mechanical track to the radio operators.
 

karldotcom

Member
Joined
Apr 21, 2003
Messages
1,850
Location
Burbank, CA
If 154.830 Tac-1 was still working...


Who knows... maybe there were notifications made to LAPD, or maybe not. It is always Ready, Shoot, Aim when these things go down.
 

SCPD

QRT
Joined
Feb 24, 2001
Messages
0
Location
Virginia
Another one of those damned-if-you-do, damned-if-you-don't situations. If the generator had not been tested, which involves some risk, and then it did not work when a widespread power outage occurred, people would demand some firings as well. I don't understand how good, intelligent people get elected and somehow lose their brains.

I don't understand the issue raised concerning running wants and warrants. The MDCs handle that better than voice does; they put all the information in front of the officers quickly, without the delay of asking someone else to do it. Officers without MDCs, such as foot-beat officers, would be at a disadvantage unless the division stations can do that for them.

I wonder if there is one documented case of a problem caused by the delay during this event.
 
Last edited:

njemt7212

Member
Joined
Sep 27, 2009
Messages
39
Location
NOVA
Maybe someone with knowledge of generators can help me here: how does a test of a backup generator knock out the main power feeds? Did it backfeed into the system?

I think a simple test would be to kill power to the systems and let the backup generator kick in; if it doesn't kick in, turn main power back on.

Besides, there is always more to a story than the media reports and the PIO lets out.
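For what it's worth, here's roughly what that "kill power and see if the generator picks up" test looks like in sketch form. The SimulatedSite class is just a stand-in for real switchgear so the logic can run; it isn't any vendor's controller API, and the timing numbers are arbitrary.

```python
import time

class SimulatedSite:
    """Stand-in for real switchgear: the generator picks up a few seconds after utility drops."""
    def __init__(self, generator_healthy: bool, start_delay_s: float = 3.0):
        self.generator_healthy = generator_healthy
        self.start_delay_s = start_delay_s
        self.utility_on = True
        self._utility_dropped_at = None

    def open_utility_breaker(self):
        self.utility_on = False
        self._utility_dropped_at = time.monotonic()

    def close_utility_breaker(self):
        self.utility_on = True

    def generator_is_carrying_load(self) -> bool:
        if self.utility_on or not self.generator_healthy:
            return False
        return time.monotonic() - self._utility_dropped_at >= self.start_delay_s

def test_backup_generator(site, pickup_timeout_s: float = 10.0) -> bool:
    site.open_utility_breaker()                  # simulate a utility failure
    deadline = time.monotonic() + pickup_timeout_s
    while time.monotonic() < deadline:
        if site.generator_is_carrying_load():    # transfer happened and generator took the load
            return True
        time.sleep(0.5)
    site.close_utility_breaker()                 # generator never picked up: restore mains
    return False

print(test_backup_generator(SimulatedSite(generator_healthy=True)))   # True after a few seconds
print(test_backup_generator(SimulatedSite(generator_healthy=False)))  # False after the timeout
```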
 

zerg901

Member
Premium Subscriber
Joined
Apr 19, 2005
Messages
3,725
Location
yup
A. What exactly went down?

Z. I bet if you search at Dispatch Magazine you will find a dozen articles mentioning "crash," "failure," and "generator."
 

902

Member
Joined
Nov 7, 2003
Messages
2,625
Location
Downsouthsomewhere
Maybe someone with knowledge of generators can help me here: how does a test of a backup generator knock out the main power feeds? Did it backfeed into the system?

I think a simple test would be to kill power to the systems and let the backup generator kick in; if it doesn't kick in, turn main power back on.

Besides, there is always more to a story than the media reports and the PIO lets out.
I've had generator issues before (I'm NOT in LA and have no particular insight into what happened, nor do I want to second-guess anyone - these are only my own experiences). It might seem straightforward, but it's not. If you lose power and have no transitory power source (a UPS or batteries), or if that transitory power can carry your system for 30 minutes and it takes you an hour to get things back in operation, equipment goes down.

When things restart, some of them need to sync up. The biggest ones are the master oscillators for simulcasting; they take hours to stabilize to the point where there isn't garbling/phase distortion in analog or excessive BER in digital. Sometimes it's the order in which things come up that goes out of kilter. A base station might come up before the controller, sense that it's on its own, and start operating independently; when the controller comes up, it might error out. Network synchronization and timing could be another issue, especially in systems that have a number of different items to sync.

It's not like the Micor days, where the power goes out, the base runs on a battery and you hear a beep every couple of seconds, then the power comes back and the beep goes away. The new crop of stuff can be very delicate.
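To make the "order things come up in" point concrete, here's a little sketch of a restart that enforces dependencies and waits for each piece to stabilize before starting the next. The item names, ordering, and timeouts are invented for illustration; this is not any specific vendor's boot sequence.

```python
import time

# Invented example ordering: each item lists the thing that must be stable first.
STARTUP_ORDER = [
    ("gps_reference", None),                 # timing source comes up first
    ("master_oscillator", "gps_reference"),  # needs time to settle for simulcast
    ("site_controller", "master_oscillator"),
    ("base_stations", "site_controller"),    # otherwise they come up standalone
]

def wait_until_ready(name, ready_check, timeout_s=60.0, poll_s=1.0):
    """Poll until the item reports stable, or give up after timeout_s."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if ready_check(name):
            return True
        time.sleep(poll_s)
    return False

def restart_site(start, ready_check):
    ready = set()
    for item, needs in STARTUP_ORDER:
        if needs is not None and needs not in ready:
            raise RuntimeError(f"cannot start {item}: {needs} never stabilized")
        start(item)
        if not wait_until_ready(item, ready_check):
            raise RuntimeError(f"{item} did not stabilize in time")
        ready.add(item)

# Trivial demo with stand-in start/ready functions.
up = set()
restart_site(start=up.add, ready_check=lambda item: item in up)
print("site restarted in dependency order")
```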

The other thing is that it's much better to go off-line under somewhat controlled circumstances than it is for it to have happened during a major event, like a blackout. At least the powers that be at some level can look at what happened and choose actions to prevent it from happening again... or not. YMMV. Certain parts of me hurt from just thinking about Harry's axiom - yes, it's too easy to cheap out and underspec something or buy something that's not exactly what's needed and then point fingers at the guy who's stuck holding the bag without the right stuff or the right budget and can't keep it all from going down. But, like I said, I'm not there and I'm sure they did it right. Right, Harry?
 

jim202

Member
Joined
Mar 7, 2002
Messages
2,730
Location
New Orleans region
Maybe someone with knowledge of generators can help me here: how does a test of a backup generator knock out the main power feeds? Did it backfeed into the system?

I think a simple test would be to kill power to the systems and let the backup generator kick in; if it doesn't kick in, turn main power back on.

Besides, there is always more to a story than the media reports and the PIO lets out.


Having been on the work side of generators for some 18 years in the cellular field, and also involved with a number of public safety agencies, I can tell you generators can cause a multitude of issues.

The biggest issue is the automatic transfer switch freezing in the normal position and refusing to move to the generator position. This happens because the generator isn't exercised under load every week: the relay either rusts in the normal position and can't move, or the coil used to energize it went open and no one knew it. Regular exercise under load would have caught the problem before you were left without a working generator system. The second issue that can happen is the transfer relay getting stuck partway between the normal and generator positions. Now you have the generator running, but you can't get the power to the load. You have to shut the generator down and kill the street power to the feed so you can get in there and try to get the relay moving freely again.
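As a purely illustrative aside, the two failure modes above boil down to "commanded one position, observed another." A toy check might look like the sketch below; the position names are invented and this is not a real ATS controller interface.

```python
from enum import Enum

class AtsPosition(Enum):
    NORMAL = "utility"        # load fed from the street
    EMERGENCY = "generator"   # load fed from the generator
    MIDWAY = "between"        # worst case: load connected to neither source

def diagnose_transfer(commanded: AtsPosition, observed: AtsPosition) -> str:
    """Compare where the switch was told to go with where it actually sits."""
    if observed == commanded:
        return "transfer OK"
    if observed == AtsPosition.NORMAL:
        return "switch frozen in NORMAL (seized mechanism or open transfer coil?)"
    if observed == AtsPosition.MIDWAY:
        return "switch stuck partway: generator running but the load is dark"
    return "unexpected position"

print(diagnose_transfer(AtsPosition.EMERGENCY, AtsPosition.MIDWAY))
```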

In all the cellular switching offices, we had a primary and a secondary generator. We also installed a manual transfer switch as a third fallback: if all else failed, we could roll in a trailer-mounted generator and run cables through the wall in a 6-inch pipe we had installed just for that purpose. That let us connect the trailer-mounted generator to the manual transfer switch, which was on the load side of the automatic transfer switch.

My guess is that the technical powers that be didn't have a backup plan, or else they had a meltdown of the electrical switching equipment. It can happen. In that case, you just have to rip out the damaged parts and replace them.

We will have to wait and see what comes out as an after action report on the problem.
 
Last edited:

SCPD

QRT
Joined
Feb 24, 2001
Messages
0
Location
Virginia
The other thing is that it's much better to go off-line under somewhat controlled circumstances than it is for it to have happened during a major event, like a blackout. At least the powers that be at some level can look at what happened and choose actions to prevent it from happening again... or not. YMMV. Certain parts of me hurt from just thinking about Harry's axiom - yes, it's too easy to cheap out and underspec something or buy something that's not exactly what's needed and then point fingers at the guy who's stuck holding the bag without the right stuff or the right budget and can't keep it all from going down. But, like I said, I'm not there and I'm sure they did it right. Right, Harry?

This strikes a chord with me as a former federal employee. Local folks would say the Forest Service wasted money and a lot could be cut, and Congress would respond in kind. Of course it was up to the agency what to cut; neither the public nor Congress had any idea. Then the cuts would be made, sometimes in the fire budget. Those cuts could include not having the fire prevention folks in patrol engines (I had a 200-gallon rig when I was in prevention). Then something occurs where a prevention tech arrives on scene 20 minutes before the second-in resource and loses the initial attack because all he has in the truck is a couple of 5-gallon backpack pumps. The fire then grows to several thousand acres. The result is the public and Congress criticizing the Forest Service with comments about incompetence, lack of foresight, and how someone should be fired.

The very scenario I described above happened to me twice in four years: I knocked a fast-growing fire down enough with my Type VI engine (200 gal.) to hold it until the next incoming units arrived, not a moment too soon. If I had been driving a dry unit (no engine), large, destructive fires of 5,000+ acres would have occurred in each of those cases, and they would have cost enough to prove the old adage "penny wise, pound foolish." Congress rarely accepts the blame, and the public accepts its views rather than look at how an agency is put into these situations.

The public's memory, and that of Congress, is so short-term that when they say these things to you, it is as if they are in a different universe.

If that is what is going on here, I would think the councilman is showboating to get noticed. Unfortunately, this can result in dedicated, high-performing people bearing the brunt.
 
Last edited:

2wayfreq

Member
Joined
Jun 8, 2004
Messages
470
Location
Arizona
Assuming LAPD uses Quantars: I know the standard Quantar power supply (non-24 V) does not tolerate power surges or spikes very well, and they take it out. I've probably sent about 8 or 9 in for repair over the years.
 

zz0468

QRT
Banned
Joined
Feb 6, 2007
Messages
6,034
"They should have checked with the department first and there is no way they would have allowed them to work on Mount Lee until the Valley Dispatch Center's problem was fixed," Englander said.

That's the only thing Mitch Englander said that made any sense.

Most people, city politicians included, have no clue how complex these systems can be. The fact that they test the generators at all is praiseworthy. But sometimes these tests uncover unforeseen problems; that's the reason for the tests in the first place.

But if they were already dealing with an outage elsewhere, they probably should have held off on the test. SOP should be a call to the affected dispatch centers, and let THEM give the all clear to run a test.

This isn't the sort of thing someone should lose their job over, though. He who does nothing, makes no mistakes.
 

stevef68

Newbie
Joined
May 29, 2010
Messages
1
I think the real problem here is why LA has a single point of failure. The actual story does mention ANOTHER comm center being down at the same time due to an electrical problem, but having ALL your comm relays in one building never seems like a good idea...
 

karldotcom

Member
Joined
Apr 21, 2003
Messages
1,850
Location
Burbank, CA
 

WayneH

Forums Veteran
Super Moderator
Joined
Dec 16, 2000
Messages
7,521
Location
Your master site
I deal with communications facilities with backup generators on a daily basis (similar to jim202) and am the one who does the maintenance and repair of +24 and -48 V power plants. There's little info to go on in this situation, so this is more or less speculation on my part, but all backup generators should be exercised on a weekly basis and transferred under load (all controlled by the transfer switch). That tests the generator's ability to do its job during a loss of power. Somewhere like Mt Lee, which looks like a hub site, it's imperative that you make sure this equipment works.

If someone was sent up there to test the generator, it was probably because it failed to exercise (it eventually happens at any site). I can only guess that when someone went up there to figure out what was wrong, they screwed something up and dropped power to everything. There's probably a single power plant - bad for "hub" sites - which should have been powered by backup batteries at all times. Maybe they ran them down and couldn't transfer back. Who knows. There are a lot of variables, but since it was working beforehand, human intervention made it worse. Power plants typically run with little maintenance, so comm techs end up more experienced with the RF part of the facilities, and some don't like working around power because of shocks, shorts, or whatever.

When a remote comm site serves as a hub and is properly designed, you have your transport off one power plant and your radio gear (microwave excluded, as it's "transport") off another, so if you do have problems, one doesn't affect the other. Each plant should have its own bank of batteries, and those batteries should carry the load long enough to get a replacement generator there when the on-site unit fails. The batteries are always connected, so regardless of what your AC power is doing, they're powering the plant's load.
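The battery-reserve point is really just arithmetic: usable amp-hours divided by the plant load gives the hours you have to get a generator on site. The numbers below are made-up examples, not anything to do with Mt Lee.

```python
def reserve_hours(capacity_ah: float, load_amps: float, usable_fraction: float = 0.8) -> float:
    """Rough battery reserve time: usable amp-hours divided by the DC plant load."""
    return (capacity_ah * usable_fraction) / load_amps

# Example: a 1000 Ah battery string carrying a 60 A plant load.
print(f"about {reserve_hours(1000, 60):.1f} hours of reserve")   # about 13.3 hours
```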

Taking all the M/W down at Mt Lee would be a serious blow to the network. All the other repeaters will still operate, but their links are now gone, so they operate independently. If Mt Lee was the voter/comparator site, or the primary link to the voter through M/W, it makes matters worse: whether or not the other sites have connections to each other, they no longer have anywhere to send their audio for distribution throughout the linked network (and that could include the dispatch center). Taking repeaters down is one thing, but taking down M/W is a whole different matter. Compare it to cutting the phone line to your house versus cutting the link between telephone switches.
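For anyone unfamiliar with the voter/comparator idea mentioned above, here's a minimal sketch of what it does and why losing the hub links hurts: each receive site sends its copy of the audio with a quality figure, and the comparator keeps the best one; with the links down there's nothing to compare, so each site can only work standalone. The site names and quality numbers are made up for illustration.

```python
from typing import Dict, Optional, Tuple

def vote(site_audio: Dict[str, Tuple[bytes, float]]) -> Optional[bytes]:
    """site_audio maps site name -> (audio_frame, signal_quality); return the best frame."""
    if not site_audio:   # links to the comparator are down: nothing to vote on
        return None
    best_site = max(site_audio, key=lambda s: site_audio[s][1])
    return site_audio[best_site][0]

# Normal day: three sites heard the unit; the comparator keeps the cleanest copy.
print(vote({"SiteA": (b"frame", 0.92), "SiteB": (b"frame", 0.71), "SiteC": (b"frame", 0.40)}))
# Hub down: nothing reaches the comparator.
print(vote({}))
```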

I've never seen what an LAPD comm site looks like, but the power plant could be a dilapidated pile of junk. Most modern plants will not let you accidentally disconnect the batteries. So ultimately the transfer switch got screwed up by human intervention, disconnecting both the generator and utility power, which ran the batteries down and killed the facility.
 

902

Member
Joined
Nov 7, 2003
Messages
2,625
Location
Downsouthsomewhere
I think the real problem here is why LA has a single point of failure. The actual story does mention ANOTHER comm center being down at the same time due to an electrical problem, but having ALL your comm relays in one building never seems like a good idea...
They probably don't have any single point of failure, but sometimes the planets line up, especially when different groups do things. I'm not LA, but in my former job we had two radio vendors, two telephone vendors, our internal facilities maintenance people (who reported to someone else and couldn't be bothered with what I needed from them), our IT people (who also reported to someone else and knew better than the system manager), etc. A lot of times a problem I saw and started to react to was just one of them working on something they had been directed to work on, independently of me; they never felt they had to tell anyone outside their sphere. In that respect, I suppose my former organization's failure was not equipment but organizational culture. That was addressed when the one idiot who used to do all of it retired and they replaced him with a director at $30k more than he made (and actually gave him some authority) and two technicians at $10k less than he made. Oh well...

I'd wait for the hotwash before I poked a stick at LA. Seems like the "perfect storm." Good thing it happened then instead of during an earthquake or some other critical event.
 

Thayne

Member
Joined
May 1, 2002
Messages
2,145
I think the real problem here is why LA has a single point of failure. The actual story does mention ANOTHER comm center being down at the same time due to an electrical problem, but having ALL your comm relays in one building never seems like a good idea...

When I was still working for a local government entity in 1995, they had a data center with two different feeds from the utility company that would automatically switch if one went down, plus a 100 kW Onan genset that would handle it if both power feeds were down. To make a long story short, one of the feeds went down and it did not switch to the other primary feed; the generator then started, but the transfer switch failed and stuck in the open position (not connected to anything). At that point another electrician and I were called out and started the procedure to manually operate the transfer switch. Luckily we had the cover bolted down when we attempted to close it to the emergency side manually, because when we did, lots of "dragon breath" erupted from the transfer switch, blowing all three 400-amp, 600-volt fuses. When it was all over, the whole cabinet containing the transfer switch was just a mass of black with lots of melted copper and aluminum in it. So yes, sometimes the planets do align. I was just glad all I got out of it was a few holes burned in my clothes.
 

brscomm

Member
Premium Subscriber
Joined
Dec 19, 2002
Messages
44
Location
Bubbaville these days.
It would seem something faulted rather badly: according to the story, they started the generator and something occurred that took out utility power to the site. I'd lean toward a catastrophic failure of the transfer switch based on the limited information available.

Should they have tested the site with Valley dispatch having problems? No way. Should they have had clearance from the dispatch supervisors? Yes.

I would love to see the incident report on this. It could be enlightening for other PS agencies.
 