Policymakers, we need you! … to pay attention to the evidence.

by Chen Reis

Last week Nicholas Kristof*, the popular NYT columnist, created a storm on Twitter and Facebook with his column “Professors, We Need You!”, which, among other points, decried the irrelevance of much social science research to policy-making. A number of academics have responded on Twitter, Facebook, and in blogs, many pointing out that they and a significant number of their colleagues are actively working to produce policy-relevant research.

Kristof makes some valid points about the obscurity of much social science research and the inaccessibility of its jargon. But he does not mention an important reality: even relevant, good-quality, and well-communicated research often fails to have much impact on public dialogue and policy. Some of the challenges may be inherent to the nature of policy-making itself, but the discrepancy often appears when research findings do not conform to the preconceived notions or agendas of policymakers. When research demonstrates that pre-existing ‘solutions’ are not applicable, it is likely to be ignored as well. This is true both in the US national system and internationally. For example, even though the data suggest that most gender-based violence in humanitarian settings is perpetrated by intimate partners, most of the focus in processes aimed at ending impunity and preventing violence remains on combatant-perpetrated sexual violence.

Even in areas where there is more of an evidence base, it is not clear whether and how the evidence is used. ALNAP, the Active Learning Network for Accountability and Performance in Humanitarian Action, is working to assess the quality and use of the evidence available to the humanitarian sector.

The problem is not only that existing evidence is often ignored, but also that there is little recognition or mention of the need for data on what works, even in key high-level statements and commitments. The lack of evidence about what works speaks not only to the complexity of research in crisis settings but also to the lack of resources available for robust program monitoring and evaluation. When it comes to the prevention of and response to sexual violence in conflict, and to the evaluation of humanitarian programming in general, it is only fairly recently that there has been a move to identify evidence of what works. Humanitarian non-governmental organizations like the International Rescue Committee (IRC) are working with academic institutions to evaluate interventions for sexual violence in humanitarian settings. There are also initiatives to support the generation of evidence for action, such as the Research for Health in Humanitarian Crises (R2HC) initiative of ELRHA.

It will be interesting to see whether this push for evidence-based action is reflected in the UK-hosted Global Summit to End Sexual Violence in Conflict scheduled for this June. I hope that support for building the evidence base, and for using that evidence to inform policy and programming, plays a greater and more integrated part in global efforts to prevent and respond to sexual violence in humanitarian settings.

———————————————————

* Kristof’s own work and actions related to sexual violence have been critiqued as uninformed/naïve and potentially harmful.

2 thoughts on “Policymakers, we need you! … to pay attention to the evidence.”

  1. smartipants

    One of the things I learned about policymakers when I was lobbying the US Government is that few of them have the luxury of time to read our long reports. Long PowerPoint presentations filled with acronyms and unimaginative presentations of data aren’t helping our cause. Can we use new communication advances to improve our relationships with policymakers and help elevate the ground-breaking things out of the onslaught of fundraising and routine reports?

    Another point: how many humanitarian agencies really read their own reports from other projects that highlight interesting work, or use monitoring and evaluation to put lessons learned from failed projects into place? We shouldn’t just blame policymakers for this failure to read the research; rushed and stretched programmers rarely have the luxury either to write up what happened in their projects or to reflect on how things from other countries might work in their context.

    Going to trainings or conferences where you could share and get inspired by new research is often seen as a “perk” or a free vacation in aid agencies, rather than a chance to inject new learning and techniques into programs. Some places I’ve worked don’t want to let hard-working field people go to such events, so it’s often just researchers talking to researchers. How can we change this cultural aspect of the aid world?
