Social Science as a Disaster – Northwestern’s Evaluation of Ceasefire (Part II of IV)

This blog is Part II of a multi-part series discussing issues surrounding the re-funding of the anti-violence program Ceasefire. You can check out the previously posted blog titled “The Audit of Ceasefire…..(Part I in a Series)”. You can also find every document mentioned in the series available in PDF format in the FOI Center. Part III will cover the media coverage of the recent push by Ceasefire to get its State of Illinois funding restarted.

One of the greatest problems with social science research is the question of when the general public should access the results and how those results should be interpreted. Nowhere is this problem clearer than in the ongoing issues surrounding the Evaluation of the anti-violence program Ceasefire conducted by the Institute for Policy Research at Northwestern University. This report specifically, and social science generally, is incapable of determining with any reliability that Ceasefire achieved the results the report claims it did. The report made no attempt to control for the impact of external factors on the level of violence. The Chicago Police Department and Ceasefire each claim credit for the crime drops in the areas where Ceasefire worked. Maybe both had an impact, maybe just one did, or maybe the crime reduction (if there is one) is due to a still-undetermined extraneous variable that the academics, the police, and the Ceasefire administrators have all missed. Other agencies have been involved as well, including the US Attorney’s Office and its Project Safe Neighborhood.

The major problem with such mediocre social science research is that it is usually done in conjunction with an agenda and released to the media as if it were gospel. I will address the problems with the reporting on this evaluation and the audit in my next installment. Suffice it to say, Eric Zorn of the Tribune and Alex Kotlowitz of Northwestern, who authored an article in the New York Times Magazine, swallowed the findings of this Evaluation whole without ever questioning the methodology or the results.

The Evaluation:

  • All of this came to a head in summer 2007, when state politics slipped into a standoff between the governor and the General Assembly. Legislators’ requests to fund specific Ceasefire sites were among the many member initiatives listed in a routine “pork barrel” bill; the Governor’s staff systematically axed the program from the final budget. (Executive Summary, Page 9)
  • Most of the remaining criticism in the document focused on run-of-the-mill accounting errors that easily could have been made while trying to manage more than 20 active sites. The audit had been initiated by a longtime critic of Ceasefire, a powerful state senator representing the city’s South Side and a prominent leader of the Illinois Legislative Black Caucus. (Chapter 3, Page 3-23)
  • Many observers wondered whether the audit was unbiased, and certainly the exquisite timing of its release was damaging to Ceasefire: the media focused on the audit as much as on the budget cut. (Chapter 3, Page 3-23)

It seems that, in the midst of a federally funded social science program evaluation, the authors made a political decision to categorize the removal of funds from Ceasefire as a political choice rather than as a result of the horrific audit. Skogan also seems rather unhappy that the State initiated an audit of the program, and he even raises the possibility of a conspiracy against the program by citing the “exquisite timing” of the audit’s release. Even if the initiation of the audit was politically motivated, its suspicions were obviously verified. It is so nice being able to review the findings of such unbiased social science research. This type of work is why citizens have little to no faith left in social scientists.

  • 192 months (16 years) of data on selected sites and matched comparison areas to examine trends in violence. (Executive Summary, Page 16)

How nice it must be to get access to 16 years of violence data from the City. I wonder what one must do to get that type of data. This is a major critique of the clique of academics that do get access to this data: what do they do to get it in the first place, and how do they continue to get access? The Chicago Police Department (CPD) is extremely restrictive about who gets access to the data it creates. It is clear that the CPD will not release data that can eventually be used to critique its own actions; thus, it is the responsibility of all academics to question the legitimacy of any research based on what can only be thought of as coerced access. Is there some sort of overt or covert deal cut, or understanding made? I do not know, but I must question the legitimacy of the access.

  • Another statistical analysis focused on gang homicide. It utilized social network analysis to examine the effect of the introduction of Ceasefire on networks of within-gang and between-gang homicides, and the number of violent gangs active in the area. (Executive Summary, Page 16)

I attended a lecture by Wes Skogan at UIC many months ago. During this lecture he presented an unpublished paper and attributed the crime drop in Chicago to Chicago’s Alternative Policing Strategy (CAPS), increased incarceration, and smarter policing. Strangely enough, Ceasefire was not mentioned in his talk. At that talk Skogan was questioned about the legitimacy of how police label murders as gang related. Skogan responded by saying that the CPD has a strict policy it follows, brushing aside the fact that in Chicago there just might be political influence over how a homicide gets labeled gang related. Any analysis based on the CPD’s gang-related definitions would have to thoroughly explain how the CPD applied this policy and demonstrate the policy’s reliability in determining whether a homicide was truly gang related. Per Skogan’s standard mode of operation, he automatically assumes that the CPD’s methods are legitimate. This assumption tarnishes any statistical research flowing from analysis that uses the CPD’s numbers.
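To make the stakes of labeling reliability concrete, here is a minimal, purely hypothetical sketch in Python. The error rates (25% of true gang cases missed, 10% of non-gang cases mislabeled) are invented for illustration and do not reflect any measured property of the CPD’s process; the point is only that even modest classification error distorts the counts that any downstream network or trend analysis relies on.

```python
import random

random.seed(7)

# Purely hypothetical sketch: 100 homicides, 40 truly gang-related.
# The labeling process misses 25% of true gang cases (75% sensitivity)
# and mislabels 10% of non-gang cases (10% false-positive rate).
# These rates are invented for illustration only.
true_gang = [True] * 40 + [False] * 60

def label(is_gang):
    if is_gang:
        return random.random() < 0.75  # true case sometimes missed
    return random.random() < 0.10      # non-gang case sometimes mislabeled

labeled = [label(g) for g in true_gang]

missed = sum(1 for g, l in zip(true_gang, labeled) if g and not l)
false_pos = sum(1 for g, l in zip(true_gang, labeled) if not g and l)

print("True gang-related homicides:   ", sum(true_gang))
print("Labeled gang-related homicides:", sum(labeled))
print("True cases missed:", missed, "| Non-gang cases mislabeled:", false_pos)
```

Under these invented rates the labeled count is an unreliable stand-in for the true count, and every statistic computed from the labels inherits that error — which is exactly why the policy’s reliability would need to be demonstrated, not assumed.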

  • The program helped push gun homicides down only in Auburn Gresham, but the report discusses the statistical problems associated with analyzing these relatively rare events. (Executive Summary, Page 17)

Even in Chicago, murder statistics have proven to be at the very least controversial, if not outright doctored. Given Ceasefire’s claimed ability to reduce shootings, one would still have to question why homicides fell in only one area due to Ceasefire. I guess in the other areas the shootings they prevented were from people with bad aim?

The major criticism of this study is:

How are the researchers parsing out which effects were caused by Ceasefire and which were caused by some other variable?

The politics surrounding crime and homicide rates are intense in Chicago and Cook County. Every criminal justice agency in Chicago and Cook County has claimed a role in reducing crime since 1990. Depending on which agency is supplying the data, the researcher is obliged to reinforce that agency’s role in reducing crime. Skogan has long been a strong advocate of CAPS’ role in reducing crime throughout the City. CAPS’ role in crime and violence reduction is just as unprovable as Ceasefire’s role in violence reduction. Does this mean that the programs are failures because their success cannot be proven? I would say no, but given the obvious fiscal management problems at Ceasefire, it is even more important to determine the program’s success.

Some of the possible intervening variables that cast doubt on this study:

Chicago Police Specific:

  • Hot-spots policing in the areas covered by Ceasefire, including the Special Operations Section, Deployment Operations Center, Targeted Response Unit, district-level hot spots, etc.
  • New Gang unit approaches to those beats
  • New Tactical unit approaches to those beats
  • Presence of Pod cameras
  • Recent street corner conspiracy busts to remove targeted gang leadership in the area
  • CAPS

Criminal Justice System Specific:

  • Increased levels of incarcerated individuals
  • Special programs by the Cook County State’s Attorney’s Office
  • Project Safe Neighborhood run through the US Attorney’s Office

Society Related:

  • Declines are actually due to the continuing drop in crime nationally since 1990 and are completely unrelated to any intervention
  • Other interventions in the community not associated with Ceasefire
  • Gentrification is ongoing in the city and may have played a role in selected communities
  • Settling of drug markets reduces violence
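The lists above are the confounding problem in miniature: if any of these interventions landed disproportionately in Ceasefire areas, a naive Ceasefire-versus-comparison contrast absorbs their effect. The following simulated sketch (all numbers invented for illustration; Ceasefire is given a true effect of zero) shows how a correlated intervention can masquerade as a program effect, and how holding the confounder fixed reveals it:

```python
import random

random.seed(42)

# Hypothetical simulation: 200 areas. Ceasefire itself has NO effect here;
# "policing" (e.g., a hot-spots deployment) is a confounder that is more
# common in Ceasefire areas AND independently reduces shootings.
areas = []
for _ in range(200):
    ceasefire = random.random() < 0.5
    # The confounder lands in 80% of Ceasefire areas, 20% of comparison areas.
    policing = random.random() < (0.8 if ceasefire else 0.2)
    # The decline in shootings is driven ONLY by the confounder, plus noise.
    decline = (10.0 if policing else 0.0) + random.gauss(0, 2)
    areas.append((ceasefire, policing, decline))

def mean(xs):
    return sum(xs) / len(xs)

def avg_decline(cf, pol=None):
    return mean([d for c, p, d in areas
                 if c == cf and (pol is None or p == pol)])

# Naive contrast: Ceasefire areas vs. comparison areas.
naive = avg_decline(True) - avg_decline(False)

# Stratified contrast: hold the confounder fixed.
within_policed = avg_decline(True, True) - avg_decline(False, True)
within_unpoliced = avg_decline(True, False) - avg_decline(False, False)

print(f"Naive 'Ceasefire effect':      {naive:+.1f} shootings")
print(f"Effect within policed areas:   {within_policed:+.1f}")
print(f"Effect within unpoliced areas: {within_unpoliced:+.1f}")
```

The naive contrast recovers mostly the confounder’s effect, not Ceasefire’s, while the within-stratum contrasts hover near zero. A real evaluation faces this with every variable listed above — and most of them were never measured, let alone held fixed.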

While I cannot decisively prove that any of these possible intervening variables caused the reduced violence in the areas covered by Ceasefire (if violence was in fact reduced), the researchers also cannot prove that they did not. This evaluation is loaded with assumptions that have not come close to being validated by this study or any other. This is why researchers should not release research to the general public before the social science community gets a chance to assess its credibility. The media has taken this story and run with it as if the results of this study came down from the heavens. In reality, this study should be published in an academic journal and never be seen by the media or policy makers, and it certainly should not be used to motivate public policy. One can only ask why this study was released to the media. That question, along with the media coverage of this issue in general, will be the subject of my next installment.