Using CLOAK and DAGGER to analyze and understand illicit networks

In this guest blog, Prof. Aili Malm explains a new approach to thinking about social network analysis of illicit networks. Follow Dr. Malm on Twitter @ailimalm

Prof. Aili Malm, California State University, Long Beach

Working with police personnel has always generated the most interesting research.

About 10 years ago, I got a call from Dr. Allan Castle, then head of intelligence analysis for the Pacific Region of the Royal Canadian Mounted Police (RCMP). Allan and his analysts wanted a coding strategy to understand the thousands of pages of intelligence files and detailed group narratives they had built for over 100 illicit groups. The strategy had to cover all the relationships so often present in intelligence files – family, friend, criminal, business, group membership, and so forth. But we also had to be pragmatic. Time and energy were limited, and the strategy needed to capture the complexity of the relationships that fuel illicit activity while remaining simple enough to inform and influence police decision-making. The analysts coded more than 120 groups across five different types of relationships, delivering a wealth of data for SNA training sessions as well as some publications (see list at bottom of page).

Since that first foray into SNA to research illicit markets and criminal groups, I’ve been lucky enough to work with many police departments and help apply SNA to their intelligence data. And I keep coming back to the same coding strategy. It provides the necessary level of detail to capture how different types of relationships overlap, and makes use of data already collected during the course of police investigations. Nor is it so complicated that it overwhelms the decision-making environment vital to good intelligence-led policing. Like any good ‘structured analytical technique’, it can, as legendary CIA analyst Richards J. Heuer, Jr has said, “guide the dialogue between analysts with common interests as they share evidence, share alternative perspectives, and discuss the meaning and significance of the evidence” (Heuer, 2009).

The strategy outlined here is based on both empirical research and the grounded expertise of professional intelligence staff. The analysts brought professional experience, which “is accumulated over time through reflection on the outcomes of similar actions taken in similar situations” (Barends et al., 2014). Pooling the collected experience of many analysts with the literature on SNA has created the CLOAK strategy.

The CLOAK analytical approach

CLOAK is a structured analytical approach that involves building, for a defined set of individuals, networks based on five different types of positive relationships between actors in an illicit network/market. These networks can be layered on top of one another to assess multiplex relationships. Multiplexity is simply SNA jargon for differentiating between different types of relationships and seeing how people can be connected in a number of ways.

Co-offending: Co-offending networks are defined as individuals who commit crimes with one another. These ties are often found in co-arrest data, and more generally through intelligence collection.

Legitimate: Many offenders are also involved in legitimate business dealings, such as co-owning real estate. You may need to adjust how you define legitimate by the amount of gray activity in your target illicit market.

Organization: These are defined as formal group ties in organizations such as outlaw motorcycle gangs or MS-13. Participants tend to have specific roles. These will usually be reciprocal ties, since all individuals will be connected to one another through group membership. These will mostly (but not always) be criminal ties.

Acquaintance: These networks are built by connecting acquaintances and friends. For example, neighborhood gangs (especially on the US east coast) often lack the formality of organizations, and instead are based on loose affiliations from school or block ties.

Kinship: Kinship networks are formed by actors tied through biological or family-based relationships. These networks also include romantic relationship ties outside of marriage.
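The five layers above can be overlaid to find multiplex ties. A minimal sketch in plain Python (the actor names and example ties are entirely hypothetical):

```python
# Each CLOAK layer is a set of undirected ties, stored as frozensets
# so that (a, b) and (b, a) count as the same relationship.
layers = {
    "co_offending": {frozenset(p) for p in [("A", "B"), ("B", "C")]},
    "legitimate":   {frozenset(p) for p in [("A", "B")]},
    "organization": {frozenset(p) for p in [("B", "C"), ("C", "D")]},
    "acquaintance": {frozenset(p) for p in [("A", "D")]},
    "kinship":      {frozenset(p) for p in [("B", "C")]},
}

def multiplexity(layers):
    """Count how many layers each tie appears in."""
    counts = {}
    for ties in layers.values():
        for tie in ties:
            counts[tie] = counts.get(tie, 0) + 1
    return counts

# Ties present in two or more layers are the multiplex relationships.
multiplex = {tie for tie, n in multiplexity(layers).items() if n >= 2}
```

Here the A–B tie is multiplex (co-offending plus legitimate business), while the A–D acquaintance tie is not; in real data the layers would come from coded intelligence files rather than hand-entered pairs.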

Using a CLOAK structured approach to analysis requires knowing more than the traditional binary connection between individuals. You need content and context. After all, it’s difficult to differentiate between the five different types of relationships if you do not have the content of phone calls/texts between people. But with this information, you gain much more insight and understanding of different networks. And that is where DAGGER comes in.

The DAGGER application

If every network was the same, there would be little point in analyzing the different components of relationships. But diverse illicit markets stress different connections. Like so many structured analytical techniques (think PESTEL, or ACH), CLOAK doesn’t necessarily cover every possible eventuality, but it covers most of what you will need. The same is true of the various markets covered by DAGGER. The table here shows a first estimate of which ties are most important to various illicit markets: Drugs, Art/antiquities, Guns (small arms), Gangs, Exploitation (and trafficking), and Religious extremism. For example, in drug networks co-offender ties are often weak and transitory, whereas organizational and kinship ties are strong bonds important to the success and strength of the criminal network. The table is intended to guide both data collection and analysis. Researchers and analysts should (1) prioritize data collection for the important ties, and (2) consider a link weighting strategy where important ties are emphasized.
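One way to implement the link weighting idea is to collapse the five layers into a single network, weighting each tie by how important that relationship type is thought to be in the target market. The weights and ties below are illustrative placeholders, not values taken from the table:

```python
# Hypothetical weights for a drug market: organizational and kinship
# ties are emphasized, transitory co-offending ties are down-weighted.
drug_weights = {
    "co_offending": 0.5,
    "legitimate":   1.0,
    "organization": 2.0,
    "acquaintance": 1.0,
    "kinship":      2.0,
}

# A toy set of CLOAK layers, with ties stored as unordered pairs.
layers = {
    "co_offending": {frozenset(("A", "B"))},
    "kinship":      {frozenset(("A", "B"))},
    "organization": {frozenset(("B", "C"))},
}

def collapse(layers, weights):
    """Sum layer weights into a single weighted edge list."""
    edge_weight = {}
    for name, ties in layers.items():
        for tie in ties:
            edge_weight[tie] = edge_weight.get(tie, 0.0) + weights[name]
    return edge_weight

weighted = collapse(layers, drug_weights)
```

The resulting weighted edges can then feed standard centrality or cohesion measures, so that actors held together by the market’s strong tie types stand out.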

The table is a first estimate based on my interpretation of the existing research, and the row at the bottom provides an estimate of the relative confidence we should draw from the literature. Drug networks are relatively well understood, but illicit gun and art networks still need much more research. It is highly likely this table will change as research increases in this area (so watch this space!).

The potential value of focused deterrence as an effective crime prevention technique has highlighted the importance of SNA as integral to crime reduction in complicated environments. Many analysts are now aware of SNA, though it is often applied in a binary fashion that provides little more than rudimentary insight. Accessible and structured approaches such as CLOAK and DAGGER might help spread not just SNA itself, but a meaningful way to apply it, to a broader audience of analysts and researchers.

References

Barends, E., Rousseau, D. M. & Briner, R. B. (2014). Evidence-Based Management: The Basic Principles. Amsterdam: Center for Evidence-Based Management.

Bright, D. A., Greenhill, C., Ritter, A., & Morselli, C. (2015). Networks within networks: Using multiple link types to examine network structure and identify key actors in a drug trafficking operation. Global Crime, 16(3), 219-237.

Diviák, T., Dijkstra, J. K., & Snijders, T. A. (2017). Structure, multiplexity, and centrality in a corruption network: The Czech Rath affair. Trends in Organized Crime, 1-24.

Heuer, R. J. (2009). The evolution of structured analytic techniques. Presentation to the National Academy of Sciences, National Research Council Committee on Behavioral and Social Science Research to Improve Intelligence Analysis for National Security, Washington, DC.

Malm, A., Bichler, G., & Van De Walle, S. (2010). Comparing the ties that bind criminal networks: Is blood thicker than water? Security Journal, 23(1), 52-74.

Malm, A., & Bichler, G. (2011). Networks of collaborating criminals: Assessing the structural vulnerability of drug markets. Journal of Research in Crime and Delinquency, 48(2), 271-297.

Malm, A., & Bichler, G. (2013). Using friends for money: The positional importance of money-launderers in organized crime. Trends in Organized Crime, 16(4), 365-381.

Smith, C. M., & Papachristos, A. V. (2016). Trust thy crooked neighbor: Multiplexity in Chicago organized crime networks. American Sociological Review, 81(4), 644-667.

 

How long division taught me to think about crime

In the intelligence-led policing course I’m currently teaching in El Salvador, we were talking about how to tackle complicated problems. And the problems in El Salvador are indeed challenging and complex. There are no easy solutions. Do we give up and stick with failed responses, or do we learn how to deal with complexity and move forward?

Humans have a tendency to try and brainstorm their way out of a problem, but cognitive psychologists are pretty clear that unstructured thinking is inefficient and error prone. Former CIA analysis chief Morgan Jones (1998) suggests that this kind of intuitive, unstructured thinking can lead to a number of problems, including:

  • Beginning analysis by forming conclusions;
  • Focusing on the solution we intuitively prefer;
  • Settling for the first solution that appears satisfactory;
  • Focusing on the substance and not the process of analysis;
  • Confusing discussing or thinking hard about a problem with actually analyzing it.

To his last point, you can see the problems with just trying to ‘think’ our way out of an issue when you watch a child struggle with a math question. If they can’t remember how to solve the puzzle, they either give up or stare intently at the paper hoping the answer will miraculously appear. For example, we can’t solve long division without a structured approach (or a calculator). I’m old enough to still remember being taught a process for long division. Figuring out how many times the divisor (the number doing the dividing) goes into the dividend (the number being divided) is difficult, but it’s easier when you only divide a part of the dividend at a time. If you don’t know what I’m talking about, Google it. Bottom line is that you take something that is too complicated for most people, and you break it into simpler and more manageable steps.
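The digit-by-digit process is itself a small algorithm, and a sketch makes the point explicit: carry a remainder forward, and only ever divide a small number at each step.

```python
def long_division(dividend, divisor):
    """Digit-by-digit long division, the way it is done on paper."""
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):
        # Bring down the next digit of the dividend.
        partial = remainder * 10 + int(digit)
        # Each step only asks how many times the divisor fits
        # into this small partial dividend.
        quotient_digits.append(partial // divisor)
        remainder = partial % divisor
    quotient = int("".join(str(d) for d in quotient_digits))
    return quotient, remainder

# For example: 100 ÷ 7 gives quotient 14, remainder 2.
```

No single step is harder than dividing a two- or three-digit number, which is exactly the complexity reduction the checklist analogy describes.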

Today we discovered that the way I was taught was slightly different from the way it is taught in El Salvador; however, my Salvadorian police colleague (below) and I still got to the same answer, because the key was a structured, methodical approach.

We can learn how to reduce complexity down to manageable tasks, in the same way that pilots use checklists as a structured approach to manage a complex task such as landing an aircraft in dense fog. We can take the seemingly intractable, and achieve what had not been previously possible.

In my training I use numerous structured approaches like this, including VOLTAGE. VOLTAGE is an extension of a simple analytical tool (VOLT) that has previously been used in some British police services as a framework for structuring knowledge about crime problems. Take a complicated crime problem and break it down into simpler components. The VOLTAGE elements, with some example questions you might ask yourself, are:

Victims: Does crime concentrate among a certain type of victim or target? Are there multiple victims, or is a particular target the subject of repeat victimization? Does the type of target generate particular public concern (such as children)?

Offenders: Is the crime problem created by numerous offenders who are not known to each other? Is it caused by a few repeat offenders and their friends? Are there new offenders in the area (prison releases)?

Locations: Are specific places targeted, or is crime distributed more widely? What is special about the place? Is it a particular location or a particular street (with a troublesome family), or an area (housing project or nighttime economy zone)?

Times: Is the crime problem within normal variation or explainable by annual seasonal patterns? If not, are there specific times when crime is concentrated? Are new patterns evident?

Attractors: Are particular locations or places attracting offenders because of the easy criminal opportunities (attractors), or are places inadvertently creating crime opportunities (generators)? Where are the worst places?

Groups: Are gangs or inter-gang conflicts a factor in the crime spike? Is there involvement of organized crime? Are school children involved either as offenders or victims? Are there disputes between criminal families, or fans of particular sports teams?

Enhancers: Are factors such as drug or alcohol use a factor to consider? Are behavioral (mental) health issues part of the problem?
Instead of just ‘thinking hard’ about a crime problem, it is better to think specifically about what we know about the victims, the offenders, the locations, the times, and so forth. It helps us identify what we know, and what we don’t know. VOLTAGE is discussed in my book on intelligence-led policing, and in my forthcoming book “Reducing Crime: A Companion for Police Leaders” which will be published later in the year.

Reference

Jones, M. D. (1998). The Thinker’s Toolkit: 14 Powerful Techniques for Problem Solving. New York: Random House.

 

It’s time for Compstat to change

If we are to promote more thoughtful and evidence-based policing, then Compstat has to change. The Compstat-type crime management meeting has its origins in Bill Bratton’s need to extract greater accountability from NYPD precinct commanders in mid-1990s New York. It was definitely innovative for policing at the time, and instigated many initiatives that are hugely beneficial to modern policing (such as the growth of crime mapping). And arguably it has been successful in promoting greater reflexivity from middle managers; however, these days the flaws are increasingly apparent.

Over my years of watching Compstat-type meetings in a number of departments, I’ve observed everyone settle into their Compstat role relatively comfortably. Well almost. The mid-level local area commander who has to field questions is often a little uneasy, but these days few careers are destroyed in Compstat. A little preparation, some confidence, and a handful of quick statistics or case details to bullshit through the tough parts will all see a shrewd commander escape unscathed.

In turn, the executives know their role. They stare intently at the map, ask about a crime hot spot or two, perhaps interrogate a little on a case just to check the commander has some specifics on hand, and then volunteer thoughts on a strategy the commander should try—just to demonstrate their experience. It’s an easy role because it doesn’t require any preparation. In turn, the area commander pledges to increase patrols in the neighborhood and everyone commits to reviewing progress next month, safe in the knowledge that little review will actually take place because by then new dots will have appeared on the map to absorb everyone’s attention. It’s a one-trick pony and everyone is comfortable with the trick.

There are some glaring problems with Compstat. The first is that the analysis is weak and often just based on a map of dots or, if the department is adventurous, crime hot spots. Unfortunately, a map of crime hot spots should be the start of an analysis, not the conclusion. It’s great for telling us what is going on, but this sort of map can’t really tell us why. We need more information and intelligence to get to why. And why is vital if we are to implement a successful crime reduction strategy.

We never get beyond this basic map because of the second problem: the frequent push to make an operational decision immediately. When command staff have to magic up a response on the spot, the result is often a superficial operational choice. Nobody wants to appear indecisive, but with crime control it can be disastrous. Too few commanders ever request more time to do more analysis, or time to consider the evidence base for their operational strategies. It’s as if asking to think more about a complex problem would be seen as weak or too ‘clever’. I concede that a rapid response to an emerging crime spike might be valuable (though spikes often regress to the mean, or as Sir Francis Galton called it in 1886, regression towards mediocrity). Many Compstat issues, however, revolve around chronic, long-term problems where a few days’ delay isn’t going to make much difference. We should adopt the attitude that it’s better to have a thoughtfully considered successful strategy next week than a failing one this week.

Because of the pressure to miracle a working strategy out of thin air, area commanders usually default to a limited set of standard approaches, saturation patrol with uniform resources being the one that I see at least 90 percent of the time. And it’s applied to everything, regardless of whether there is any likelihood that it will impact the problem. It is suggested by executives and embraced by local area commanders because it is how we’ve always escaped from Compstat. Few question saturation patrols, there is some evidence it works in the short term, and it’s a non-threatening traditional policing approach that everyone understands. Saturation patrol is like a favorite winter coat, except that we like to wear it all year round.

Third, in the absence of a more thoughtful and evidence-based process, too many decisions and views lack any evidential support and instead are driven by personal views. There is a scene in the movie Moneyball where all the old baseball scouts are giving their thoughts on which players the team should buy, based only on the scouts’ experience, opinion and personal judgment. They ignore the nerd in the corner who has real data and figures … and some insight. They even question if he has to be in the room. In the movie, the data analyst is disparaged, even though he doesn’t bring an opinion or intuition to the table. He brings data analysis, and the data don’t care how long you have been in the business.

Too many Compstat meetings are reminiscent of this scene. The centerpiece of many Compstat meetings is a map of crime that many are viewing for the first time. A room full of people wax lyrical on the crime problem based on their intuitive interpretation of a map of crime on the wall, and then they promote solutions for our beleaguered commander, based too often on opinion and personal judgment and too little on knowledge of the supporting evidence of the tactic’s effectiveness. Because everyone knows they have to come back in a month, the strategies are inevitably short-term in nature and never evaluated. And without being evaluated, they are never discredited, so they become the go-to tactical choice ad infinitum.

So the problems with Compstat are weak analysis, rushed decision-making, and opinion-driven strategies. What might the solutions be?

The U.K.’s National Intelligence Model is a good starting point for consideration. It has a strategic and a tactical cycle. The strategic meeting attendees determine the main strategic aims and goals for the district. At a recent meeting a senior commander told me “We are usually too busy putting out fires to care about who is throwing matches around.” Any process that has some strategic direction to focus the tactical day-to-day management of a district has the capacity to keep at least one eye on the match thrower. A monthly meeting, focused on chronic district problems, can generate two or three strategic priorities.

A more regular tactical meeting is then tasked with implementing these strategic priorities. This might be a weekly meeting that can both deal with the dramas of the day as well as supervise implementation of the goals set at the strategic meeting. It is important that the tactical meeting should spend some time on the implementation of the larger strategic goals. In this way, the strategic goals are not subsumed by day-to-day dramas that often comprise the tyranny of the moment. And the tactical meeting shouldn’t set strategic goals—that is the role of the strategic working group.

I’ve previously written that Compstat has become a game of “whack-a-mole” policing with no long-term value. Dots appear, and we move the troops to the dots to try and quell the problem. Next month new dots appear somewhere else, and we do the whole thing all over again. If we don’t retain a strategic eye on long-term goals, it’s not effective policing. It’s Groundhog Day policing.

Year-to-date comparisons and why we should stop doing them

Year-to-date comparisons are common in both policing and the media. They involve taking the cumulative crime count for the current year up to a certain date and comparing it to the same point in the preceding year. For a Philadelphia example from April of this year, NBC reported that homicides were up 20 percent in 2017 compared to 2016. You can also find these types of comparison in the Compstat meetings of many police departments.

To gauge how reliable these mid-year estimates of doom-and-gloom are, I downloaded nine years (2007-2015) of monthly homicide counts from the Philadelphia Police Department. These are all open data available here. I calculated the overall year change as well as the cumulative change monthly from year to year. In the table below you can see a row of annual totals in grey near the bottom, below which is the target prediction as a percentage of the previous year (white text, blue background). For example, the 332 homicides in 2008 were 14.7% lower than the previous year, expressed in 2007 terms.

Let’s decide that we can tolerate a prediction within plus or minus 5 percent of the eventual difference between this year and the preceding year. That stipulates a fairly generous 10% range, as indicated by the Low and High rows in blue.

Each month you can see the percentage difference between the indicated year-to-date at the end of the month, and the calendar year-to-date (YTD) for the same period in the previous year. So for example, at the end of January 2008 we had 21.9% fewer homicides than at the end of January 2007. By the time we get to December, we obviously have all the homicides for the year, so the December percentage change exactly matches the target percentage difference.
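The calculation behind the table can be sketched from two years of monthly counts. The figures below are made up for illustration, not Philadelphia’s actual counts:

```python
from itertools import accumulate

# Hypothetical monthly homicide counts for two consecutive years.
prev_year = [32, 25, 28, 30, 31, 35, 38, 36, 29, 27, 26, 30]
this_year = [25, 27, 26, 31, 33, 34, 36, 35, 31, 28, 27, 29]

# Cumulative year-to-date totals at the end of each month.
prev_ytd = list(accumulate(prev_year))
this_ytd = list(accumulate(this_year))

# Percentage change of this year's YTD over last year's, per month.
pct_change = [100 * (t - p) / p for t, p in zip(this_ytd, prev_ytd)]
```

The December figure is the true annual change; the early entries in `pct_change` swing on a handful of incidents, which is exactly why January headlines mislead.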

Cells highlighted with a green background have a difference on the previous year that is within our +/- 5 percent tolerance. By the end of each January, we only had one year (2012) with a percentage difference that was within 5 percent of how the city ended the year. The 57% increase in January 2011 was considerably different from the eventual 6% increase over 2010 at the end of December. When Philadelphia Magazine dramatically posted “Philly’s Murder Rate is Skyrocketing Again in 2014” on January 14th of that year, the month did indeed end up nearly 37 percent over 2013. But by year’s end, the city had recorded just one homicide more than the preceding year – a less dramatic increase of 0.4%.

In fact, if we seek out a month where the difference is within our 10% range and where later months remain consistently accurate through to the end of the year, then we have to wait until the months shown with a border. 2009 performed well; however, while 2010 was fairly accurate throughout the summer, the cumulative totals in September and October were more than 5% higher than the previous year, even though the year ended only 0.3% higher.

To use calendar YTD comparisons with any confidence, we have to wait until the end of October before we can be more than 50% confident that the year-to-date is indicative of how we will enter the New Year. And even then we still have to be cautious. There was a chance at the end of November 2010 that we would end the year with fewer homicides, though the eventual count crept into increase territory.

The bottom line is that with crimes such as homicide, we need not necessarily worry about crime panics at the beginning of the year. This isn’t to say we should ever get complacent (of course every homicide is one too many); however, the likely trend will only become clear by the autumn.

Alternatives exist. Moving averages seem to work okay, but another alternative I like is to compare a full 12-month total to the total for the preceding 12 months. So instead of (for example) comparing January-April 2010 to January-April 2009, you could compare the 12-month total May 2009-April 2010 against the May 2008-April 2009 total. I’ve done that in the red graph below. The first available point is December 2008, and as we know from the previous table, the preceding 12 months had outperformed the calendar year 2007 by 14.7%. But then each subsequent month measures not just the calendar YTD but a rolling 12-month total.
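The rolling comparison can be computed from a single monthly series: sum each trailing 12-month window and compare it to the window ending 12 months earlier. A sketch, again with made-up data:

```python
def rolling_change(monthly, window=12):
    """Percent change of each trailing `window`-month total versus
    the total for the window ending `window` months earlier."""
    totals = [sum(monthly[i - window:i]) for i in range(window, len(monthly) + 1)]
    # Compare each window to the one ending a full window-length earlier.
    return [100 * (t - s) / s for s, t in zip(totals, totals[window:])]

# Three years of illustrative counts: flat for two years, then rising.
series = [30] * 24 + [33] * 12
changes = rolling_change(series)
```

With 36 months of data this yields 13 comparison points, one per month from the second December onward, tracing the trend line shown in the red graph.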

The result is a graph that shows the trend changing over time from negative (good) territory to positive (bad for homicides, because it shows an increase). Not only do you get a more realistic comparison that is useful throughout the year, you can also see the changing trend. Anything below the horizontal axis is good news – you are doing well. Above it means that your most recent 12 months (measured at any point) were worse than the preceding 12 months.

You can also use overlapping comparison periods. The graph in blue below compares 24 months of accumulated counts with the 24-month totals for the previous year. For example, the first available point is December 2009. This -11.7% value represents the change in total homicides over the 24 months from January 2008 to December 2009, compared to the 24-month total one year earlier (January 2007 to December 2008). For comparison purposes, I have retained the same vertical scale, but note the change in horizontal axis.

You can see there is more smoothing, but the general trend over time is still visible. Plenty of variations are available, and you might want to experiment with different options for your crime type and crime volume.