Jerry Ratcliffe

Hey chief, stop looking for the silver bullet

I recently had the pleasure of watching a good friend talk to a room full of police leaders about evidence-based policing. The presentation was thoughtful and succinct, yet it was clear from the comments and questions afterwards that a few chiefs were still searching for the mythical silver bullet. I presented next and got a similar response from a couple of attendees: “What can we do that just works?”

Ah, the elusive silver bullet known to slay all werewolves.

The silver bullet is a police-led initiative that is guaranteed to cut crime in half, doesn’t require resources or incur overtime, won’t infuriate the union, improves police legitimacy, is media friendly, popular with politicians, doesn’t generate complaints or litigation, and can be implemented by Monday. It is also mythical.

The bullet doesn’t exist because no strategy—however well supported by research—is guaranteed to succeed. I know you might have read websites that list what works and what doesn’t, but they all need a Barry Bonds-type asterisk. The asterisk here means ‘it depends.’ There are two reasons for the asterisk: strategy validity and implementation effectiveness.


Strategy validity comprises the effectiveness of the strategy and its external validity. Effectiveness matters because some tactics generally don’t work. The classic examples are D.A.R.E. and Scared Straight. I managed to upset the commandant of a police academy by pointing this out after he had dedicated over a decade of his life as a D.A.R.E. instructor. “If I can save just one child, it’s been worth it,” he said as he stormed off. And he might have been right. Perhaps he was that instructor who managed to succeed. Unfortunately, if we are to be evidence-based, the preponderance of evidence suggests his efforts were likely in vain.


External validity is an indication of the extent to which a project that worked in one place will work in another. For example, the Philadelphia Foot Patrol Experiment found that foot patrols in high crime hot spots reduced violence by 23 percent compared to equivalent comparison foot beats. As a result, in Philadelphia, new recruits out of the academy automatically go onto foot beats for the first part of their service.


But what if you are in a rural area with low population density? As Evan Sorg and I write (borrowing from Pete Moskos), if the mail carrier isn’t walking, it probably doesn’t make sense for the cops to be either. Some strategies have greater applicability to a wider range of police departments. Most places have experimented with variations of Compstat at some point, whereas fewer departments have explored the value of fixed wing aircraft. The external validity of a tactic is therefore a factor to consider in estimating likely success.

Strategy validity, then, is a combination of effectiveness and external validity, but it is of little relevance if your tactic is not implemented properly. That leads to our second factor: implementation effectiveness.


Any chief can google a tactic that another police leader is championing. But will it be implemented in the same way? There is so much variance in operational implementation that it is impossible to predict success. I was a member of the recent National Academies of Sciences consensus panel on proactive policing that rated focused deterrence strategies like Operation Ceasefire as effective; but in Baltimore, poor implementation was blamed for the strategy’s lack of success and ultimate demise.


Our panel also rated hot spots policing as effective. The police chief who ran the intervention evaluated by David Weisburd and colleagues was therefore no doubt disappointed that it had no effect. But with “aggressive order maintenance policing” implemented for only three hours a week, what did the department expect?

Because your department might adopt a strategy that is fundamentally flawed or not appropriate for your environment, or because you might not implement it effectively, nobody can say for certain whether it will succeed. And it is not just about money. In a department with a weak culture, investing in body-worn cameras will have limited value if officers don’t trust management and turn the cameras off.


No silver bullet, but we do have evidence-based policing

While we don’t have a silver bullet, an understanding of evidence-based policing can get you the next best thing: it can maximize your chances of success.


To demonstrate, let’s put some ballpark probabilities on these ideas. Say you choose a strategy that has positive evaluations and would be appropriate for your town: in other words, an effective strategy with high external validity. If we score strategy validity on a scale of 0 to 1, we could give this a success probability of 0.8.

Then let’s say that you are prepared to support the strategy with enough resources, targeted and concentrated in the right places, and for enough time (what is called dosage) to maximize your chances of being successful. Trouble is, we rarely know what counts as enough resources, but for argument’s sake, let’s ballpark this implementation effectiveness at 0.9.

Crime reduction success = Strategy validity × Implementation effectiveness

Your chance of crime reduction success is therefore 0.8 × 0.9 = 0.72: a 72% chance that your strategy will be effective.


But imagine your strategy has a low track record of success. Perhaps it only has a 20% chance of succeeding (strategy validity of 0.2). Even if you implement the hell out of it, you are still unlikely to reduce crime. If your implementation effectiveness is 0.95, you still only have a crime reduction success of 19% (0.2 × 0.95).


Equally, if you pick a highly regarded strategy (let’s give hot spots policing a strategy validity of 0.9), you rapidly lose effect if you don’t put the effort and resources behind it. If your implementation effectiveness is estimated at just 0.5, then your chances of crime reduction success drop to just 45%.
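If you want to play with these numbers yourself, here is a minimal back-of-the-envelope sketch in Python. The scores are the same illustrative guesses used above, not measured values, and the function and scenario names are just mine for this example.

# Back-of-the-envelope model: both scores run from 0 (hopeless) to 1 (perfect).
def crime_reduction_success(strategy_validity, implementation_effectiveness):
    """Rough chance of success = strategy validity multiplied by implementation effectiveness."""
    return strategy_validity * implementation_effectiveness

# Illustrative guesses only, matching the three scenarios above.
scenarios = {
    "solid strategy, well-resourced rollout": (0.8, 0.9),
    "weak strategy, heroic implementation": (0.2, 0.95),
    "strong strategy, half-hearted rollout": (0.9, 0.5),
}

for name, (validity, implementation) in scenarios.items():
    chance = crime_reduction_success(validity, implementation)
    print(f"{name}: {chance:.0%} chance of reducing crime")

Running it prints roughly 72%, 19%, and 45% for the three scenarios, which is the whole point: the product only looks attractive when you score highly on both terms.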


The ‘so what?’

When crime scientists say ‘it depends’, it isn’t because we enjoy sitting on the fence. It’s because we recognize that no tactic has a strategy validity of 100%. We also know that a city might invest so little in a strategy that a good idea fails because of weak implementation. In a recent op-ed, I lamented that my own city invests only $130,000 in focused deterrence (rated effective in evidence-based evaluations) from a violence prevention budget of $48 million.

We are not yet at the stage where we can assign scores between 0 and 1 to strategies or implementations, though forest plots (for example) might be a starting point. However, the basic idea in this blog might help you understand how the effects of strategy validity and implementation effectiveness interact, and why it is so important to score highly on both.

Evidence-based policing will not give you the silver bullet. Nothing can, and you should be suspicious of anyone promising it. But evidence-based policing can evaluate a tactic’s track record of success and indicate when it might be best deployed (strategy validity). It can also describe how the tactic was operationalized successfully so you can maximize your implementation effectiveness.

It’s the closest thing to a silver bullet you will get, absent an actual werewolf problem.
