Two gun violence experts will spend the next four years studying D.C.’s violence interruption programs. Their goal is to figure out what these programs are doing right — and where they could improve.
Their effort will be the most in-depth and rigorous evaluation of D.C.’s violence intervention programs to date. It’s funded by Arnold Ventures, a philanthropic foundation that supports criminal justice-focused research.
Arnold Ventures put out a call for applications last year — and Johns Hopkins University professor Daniel Webster and his research partner, University of Maryland professor Joseph Richardson, applied. Webster has been studying Baltimore’s violence intervention programs for about 15 years, but for this project, he decided to switch his focus to the District.
“I thought that … the need and opportunity in the District of Columbia was greater than where I had been working, studying similar programs in Baltimore,” Webster says.
Officials with D.C.’s Office of Gun Violence Prevention say they’re excited about the study, adding that they, too, are reviewing the city’s violence prevention programs from top to bottom to see what can be improved.
In 2018, D.C. launched two violence intervention programs: Cure the Streets, under the D.C. Attorney General’s office, and a separate program under Mayor Muriel Bowser’s Office of Neighborhood Safety and Engagement. The programs contract with community-based organizations to employ workers called violence interrupters in D.C. neighborhoods with some of the highest rates of gun violence.
The goal of those workers is to build trusting relationships with the people involved in violence so that they can stop retaliatory shootings, mediate disputes between warring crews, and connect the people at most risk of being shot with city services.
The D.C. Auditor examined the programs in a 2022 report that found mixed results. A Ph.D. student at American University also recently published a study on Cure the Streets, though some experts (including Webster) have raised concerns about its methodology. But the city has never had researchers formally evaluate its violence prevention programs’ effectiveness — despite spending millions of dollars on them.
The programs have published some data themselves: For example, the website for D.C.’s Office of Gun Violence Prevention recently launched a dashboard that shows crime trends in the neighborhoods where ONSE and Cure the Streets employ violence interrupters. But in the absence of more definitive answers about the effects these programs have had, some have grown skeptical about the investments.
“We have funded violence interrupters and social workers for 30 years,” former Ward 2 Councilmember Jack Evans wrote in a Washington Post op-ed earlier this year. “Unfortunately, it has never worked. Let’s figure out why before spending more money.”
And Webster says there’s some merit to that criticism.
“I think it’s unfortunate that the programs have been around for a while now without any formal, rigorous evaluation to answer the questions of potential skeptics. So I’m somewhat sympathetic to the skeptics saying like, ‘Come on, why don’t we have evidence that this is working?’” Webster says.
But, he adds, it takes a while to establish these programs, train workers, and start to see results — and the same degree of scrutiny is rarely applied to police departments, which have been trying to prevent crime for far longer.
“[Those] skeptics don’t apply the same degree of skepticism to some of the policing approaches that we take to reducing gun violence in the city. I think they should,” Webster says.
Webster and Richardson plan to use both data analysis and interviews for the study. The key metric they’ll focus on is incidents of gun violence: homicides with guns (the vast majority of homicides in the District), as well as assaults involving guns.
They’re also going to interview violence intervention workers, as well as the residents at the center of the city’s gun violence.
“We’re going to ask them [about] their own experiences, and if or how the program has impacted their lives in key ways relevant to their risks for engaging in gun violence,” Webster says. Because, he says, interviews can reveal what numbers sometimes can’t.
There are many ways to measure the success of non-police forms of violence intervention, Richardson says.
“The way that we can measure success is by what communities determine to be metrics for success,” Richardson told WAMU in March. Those can include measuring whether people get repeatedly shot, trying to measure the level of trust violence interrupters build with people, measuring the number of shootings in a neighborhood, or measuring participants’ attendance in therapy. But in measuring these programs’ success, Richardson adds, “we also have to consider that the funding or lack thereof in [community violence intervention] programs has affected the coverage of these programs across the city.”
“Our goal with this project is not simply to give a thumbs up or thumbs down,” Webster adds. It’s important to learn whether the programs “work” or “don’t work,” but “we want to know why. Why is it working?”
Such questions have taken on renewed urgency in recent years, because homicides in the District have reached levels the city hasn’t seen in two decades. While murders remain significantly below their 1990s peak, they increased every year from 2018 through 2021. Homicides fell 10% in 2022, but the city still lost 203 people to homicide that year and saw a steep rise in killings of teens. And so far this year, homicides are up 30% compared to this time last year.
The relentless grief — spread unequally across the city, concentrated most in low-income Black neighborhoods — has residents desperate for solutions.
Richardson and Webster’s study will last four years, but they plan to start sharing results before it concludes.
“Within 18 months,” Webster says, “we’ll show some preliminary findings of what we’re learning.”
Jenny Gathright