Tag Archives: research evidence

Book Review: The Myth of Research-Based Policy & Practice

I recently reviewed The Myth of Research-Based Policy & Practice for SAGE Methodspace. While this book will be of limited relevance to many PhD students, Chapter 6 provides a useful discussion of quality as it pertains to qualitative research practice that may be helpful to those of you coming to terms with your own epistemological and paradigmatic leanings.

Hammersley, M. (2013). The Myth of Research-Based Policy & Practice. London: SAGE.

First, a confession: I was drawn in by the catchy title. Working broadly in the area of international development, I frequently encounter appeals for research that can inform policy (e.g. situation analyses, monitoring & evaluation, impact assessments). As an early career researcher, I also feel intense pressure from within the university sector to demonstrate either that my research is ripe for commercialization or that it has a social impact (i.e. that it can influence decision makers). So when the title of this book promised to expose research-based policy/practice as a myth, it immediately caught my attention.

I am also quite interested in the politics of evidence. Who determines what counts as evidence? And by what standards? These questions are inherently political, snarled in complex webs of power and influence so pervasive they are easily overlooked. So if we accept (at least initially) the premise that research can (or should?) inform policy/practice, we need to carefully consider the standard of evidence required by policymakers and practitioners, as well as what this then means for our own research practice.

The Myth of Research-Based Policy and Practice has two stated objectives: first, to broadly consider ‘what counts as knowledge’ and then to expose ‘the limits of what counts as knowledge in evidence-based policymaking’ (p. 1). The book goes some way toward achieving both. The Introduction provides a useful overview of the history of evidence-based/informed policy, charting its path from medicine to education and other policy areas encompassed by the social sciences. This historical background is significant in that it clearly illustrates how randomized controlled trials (which provide a particular type of evidence suitable for answering certain types of questions in a medical context) became the gold standard for research-based evidence across a broad spectrum of social policy areas. This ‘positivist conception’ of the social sciences, moreover, has little time for socially grounded or ‘critical’ research that adheres to alternative epistemological and paradigmatic positions. And therein lies the problem. As Hammersley notes, “…a grand conception of research is widely shared among social scientists: it is often assumed that the knowledge they produce can generate conclusions that should replace or correct the practical knowledge of actors, and that this will bring about substantial improvement in the world” (p. 9). But all evidence is not created equal. And, as the author points out, practical knowledge also has a role to play in informed decision making.

With this in mind, I found Chapters 2 through 4 (which address the issues raised above in more detail) particularly interesting and well developed. Living in our own epistemological bubbles, we rarely pause to consider – let alone critically question – the nature of evidence. Hammersley urges us to take these questions seriously, further differentiating between evidence and expertise. While it could be debated whether the author convincingly demonstrates that evidence-based policymaking/practice is a myth, he certainly exposes the limitations of evidence produced by social science research in this context.

The book has been written so that each chapter can stand on its own and be read independently; this is both a strength and a weakness. While there are some advantages to this format (and I know that more publishers are moving in this direction), the book as a whole seems somehow less than the sum of its parts. At Chapter 7, it takes an abrupt turn, shifting focus from the theoretical and philosophical issues that underpin research and ‘evidence’ toward, first, action research as a particular research practice, and then different approaches to reviewing literature. As systematic reviews constitute one element of the ‘gold standard’, I can understand why the topic of literature reviews is relevant; however, I am not convinced that dedicating a full third of the book to them is justified. The lack of a concluding chapter means that the book simply stops, without tying off the various threads of the argument.

In sum, I suspect that this book will appeal to scholars frustrated by growing demands that their work produce particular sorts of outcomes, and to those interested in phronetic social science. Chapter 6, ‘The question of quality in qualitative research’, might also be of interest to PhD students as they discover their own epistemological and paradigmatic leanings. It is a book worth dipping in and out of (particularly the early chapters), which I suspect may have been its aim all along.