When government organizations learn how to learn
Over the weekend, the New York Times ran a front-page article that was not only timely in light of the recent terror attacks in Paris and elsewhere, but also enlightening about how government organizations can get better by consciously seeking to learn, over time, how to improve their structures and processes.
The article, With Permanent Squad, New York Police Step Up Fight on Terrorism, discusses the emergence of a new unit, the Critical Response Command, which was rolled out just last week. And it tells a fascinating story about the impetus that eventually led to the creation of this new unit.
After the Charlie Hebdo attacks in Paris last January, "top officials with the New York Police held a drill to test the city's response to such an attack and found they had a problem. As officers converged on 42nd Street in Manhattan near the United Nations headquarters, some took nearly an hour to get there. Others, coming from plainclothes organized-crime and narcotics units in far-flung precincts in Brooklyn and Queens, did not look the part."
The reporter quoted John J. Miller, NYPD's deputy commissioner for intelligence and counterterrorism, as saying, "One of the observers said they look a little too much like the bad guys."
In other words, the NYPD started by examining how it would likely perform in responding to a similar attack, where speed of response was critical. And after comparing desired to actual performance, officials concluded they came up short.
What a learning organization does next is not to stay tied to how it has always done business, but instead to figure out how to change and do the job better. The NYPD has now decided that its earlier approach of putting regular officers on periodic counterterrorism details, as well as its more recent tactic of dual-hatting plainclothes organized crime/anti-narcotics officers to anti-terrorism duty, was insufficient. The new command, which will soon grow to 500, will do this job full-time; its officers "will roam the city, anywhere from Times Square to Main Street in Flushing, Queens, appearing in different configurations and at different times of day," to act as a deterrent to would-be attackers, officials said, and as a kind of standing force ready to race to the scene of a strike in minutes.
A related example of NYPD learning around terrorism involved the need, which grew out of the 2008 Mumbai attacks, for more officers with high-powered semi-automatic rifles who could respond to multiple, simultaneous attacks. The first approach to this problem was to arm and train officers from the Organized Crime Control unit with such rifles. However, it was discovered that these officers typically didn't have those weapons close at hand (their existing jobs often required them to blend into the surroundings). One police manager noted, "Do they have the guns with them? No, the guns are back at the office. Are the guns loaded? No. The test was run, it was suboptimal."
Now the department has reacted to the shortcoming by learning: it has created a Strategic Response Group, 700 officers who carry these weapons with them at all times.
In both instances, what the NYPD has done reminds me of the famous "after-action" reviews invented by the U.S. Army. These are done, with various degrees of formality, after training events and actual battles. They are specifically directed at learning, not punishing or second-guessing.
The three basic questions in any after-action review trace the same learning cycle the NYPD has used for its anti-terrorism patrols: What was supposed to happen? What actually happened? And what can be improved?
After-action reviews are used by many private-sector companies. Many more government organizations outside the military should be doing the same thing – this method, after all, was initially developed in government!
So what distinguishes a learning organization from one that is bad at learning how to do a better job? I would suggest two differences. One is a willingness to gather information about how current processes are working, so one knows where there are shortcomings. A second is not being so stuck in the "way we've always done things around here" that one is unwilling to try new ways.
Finally, I would argue it is no coincidence that this kind of learning is especially prevalent in the NYPD, whose COMPSTAT system made the department famous for its disciplined attention to crime statistics as a way of seeing where problems lay and of putting pressure on commanders to improve. The NYPD's aggressive use of performance measurement made the organization more sensitive to gaps between desired and actual performance -- the first stage in any serious learning process.
So, just as some sort of after-action reviews would be helpful for many government organizations outside the military, aggressive managerial use of performance measurement is a good way to ingrain a culture in government that embraces improvement through learning.
With the Thanksgiving holiday, today will be my only blog post for the week. Best holiday wishes to all.
Posted by Steve Kelman on Nov 23, 2015 at 12:03 PM