
Book Review: The Myth of Research-Based Policy & Practice

I recently reviewed The Myth of Research-Based Policy & Practice for SAGE Methodspace. While this book will be of limited relevance to many PhD students, Chapter 6 provides a useful discussion of quality as it pertains to qualitative research practice that may be helpful to those of you coming to terms with your own epistemological and paradigmatic leanings.

Hammersley, M. (2013). The Myth of Research-Based Policy & Practice. London: SAGE.

First, a confession: I was drawn in by the catchy title. Working broadly in the area of international development, I frequently encounter appeals for research that can inform policy (e.g. situation analyses, monitoring & evaluation, impact assessments). As an early career researcher, I also feel intense pressure from within the university sector to demonstrate either that my research is ripe for commercialization or that it has a social impact (i.e. that it can influence decision makers). So when the title of this book promised to expose research-based policy/practice as a myth, it immediately caught my attention.

I am also quite interested in the politics of evidence. Who determines what counts as evidence? And by what standards? These questions are inherently political, snarled in complex webs of power and influence so pervasive that they are easily overlooked. So if we accept (at least initially) the premise that research can (or should?) inform policy/practice, we need to carefully consider the standard of evidence required by policymakers and practitioners, as well as what this then means for our own research practice.

The Myth of Research-Based Policy and Practice has two stated objectives: first, to broadly consider ‘what counts as knowledge’ and then to expose ‘the limits of what counts as knowledge in evidence-based policymaking’ (p. 1). The book goes some way toward achieving both. The Introduction provides a useful overview of the history of evidence-based/informed policy, charting its path from medicine to education and other policy areas encompassed by the social sciences. This historical background is significant in that it clearly illustrates how randomized controlled trials (which provide a particular type of evidence suitable for answering certain types of questions in a medical context) became the gold standard for research-based evidence across a broad spectrum of social policy areas. This ‘positivist conception’ of the social sciences, moreover, has little time for socially grounded or ‘critical’ research that adheres to alternative epistemological and paradigmatic positions. And therein lies the problem. As Hammersley notes, “…a grand conception of research is widely shared among social scientists: it is often assumed that the knowledge they produce can generate conclusions that should replace or correct the practical knowledge of actors, and that this will bring about substantial improvement in the world” (p. 9). But all evidence is not created equal. And, as the author points out, practical knowledge also has a role to play in informed decision making.

With this in mind, I found Chapters 2 through 4 (which address the issues raised above in more detail) particularly interesting and well developed. Living in our own epistemological bubbles, we rarely pause to consider – let alone critically question – the nature of evidence. Hammersley urges us to take these questions seriously, further differentiating between evidence and expertise. While it could be debated whether or not the author convincingly demonstrates that evidence-based policymaking/practice is a myth, he certainly exposes the limitations of evidence produced by social science research in this context.

The book has been written so that each chapter can stand on its own and be read independently; this is both a strength and a weakness. While there are some advantages to this format (and I know that more publishers are moving in this direction), the book as a whole seems somehow less than the sum of its parts. At Chapter 7, the book takes an abrupt turn, shifting focus from the theoretical and philosophical issues that underpin research and ‘evidence’ toward, first, action research as a particular research practice, and then different approaches to reviewing literature. As systematic reviews constitute one element of the ‘gold standard’, I can understand why the topic of literature reviews is relevant; however, I am not convinced that dedicating a full third of the book to them is justified. The lack of a concluding chapter means that the book comes to an abrupt stop, without tying off the various threads of the argument.

In sum, I suspect that this book will appeal to scholars frustrated by growing demands that their work produce particular sorts of outcomes, and those interested in phronetic social science. Chapter 6, The question of quality in qualitative research, might also be of interest to PhD students as they discover their own epistemological and paradigmatic leanings. It is a book worth dipping in and out of (particularly the early chapters), which I suspect may have been its aim all along.


A PhD is all-consuming. Unlike other jobs that you can leave at the office, a PhD takes up residence in your head (and your living room). It goes home with you in the evening. It hangs around on the weekend. It even accompanies you on holiday (if you’re lucky enough to get one). Of the many PhD students I’ve known over the past 5 years, I have only encountered one who successfully maintained a 9-to-5, 40-hour work week.

So where does that leave the rest of us? Regardless of your work pattern, everyone needs a break sometimes. In fact, studies have shown that working more than 40 hours per week does not increase productivity in the long run. It can be incredibly difficult with 12 articles still to read and a draft chapter due next week, but you need to learn to discipline yourself and the PhD-monster lurking over your shoulder. I’m still learning, but I’m getting better.

One aspect of successful self-discipline is actually getting the work done: setting reasonable expectations and following through on them. Several weeks ago, I set myself the goal of writing uninterrupted (no talking, checking email, searching the internet or fidgeting with my phone) for at least 30 minutes every day. It turns out that even this modest task is easier said than done. So I bought a kitchen timer – a hot pink one that sits on my desk and reminds me of my writing obligation. Once I set the timer, writing becomes my sole occupation until the ticking stops and the bell signals that I’m allowed to stop. I’m not suggesting that you run out and buy your own kitchen timer, just that you think about how you might manage your working time more effectively and in ways that enhance your productivity without requiring that you enslave yourself to your PhD.

My second tip for successful self-discipline is little rewards. Once you’ve accomplished whatever it was you set out to do, allow yourself a treat. Take a short break, go for a walk or, better still, bake some brownies!

Delicious Fudge Brownies

(Adapted from ‘On Delicious Fudge Brownies’ in The Pirates! In an Adventure with Communists, by Gideon Defoe)

  • 200g dark chocolate
  • 210g butter
  • 40g vegetable oil
  • 5 eggs
  • 400g sugar
  • 50g honey
  • ½ tsp vanilla
  • 125g flour
  • 50g cocoa powder
  • ½ tsp sea salt
  • chopped nuts (optional)

Melt chocolate and butter together in a pan over low heat. Remove from heat and stir in remaining ingredients. Pour into a baking pan lined with parchment paper and bake for approximately 30 minutes or until the centre is set. Leave to cool on a wire rack before cutting.

Collaborative Fieldwork

Having the opportunity to conduct fieldwork is, for me at least, one of the best bits of being a researcher. What’s not to love about travelling to exotic (or even not-so-exotic) locations and meeting interesting people? I also enjoy the serendipity and unpredictability of fieldwork. Despite the best-laid plans, there is no knowing what might happen tomorrow or when your research could take an unexpected turn.

Now that I have completed several trips to ‘the field’, I can see that my own approach to fieldwork has changed considerably. Perhaps this is just indicative of a learning curve that all researchers experience. In any case, I thought that I would raise the idea of collaborative fieldwork for those of you who are preparing to set out for ‘the field’ yourselves.

PhD research is, almost by definition, an individual pursuit. Earning your doctorate will ultimately depend on your ability to demonstrate to the examiners that you have made an original contribution to knowledge. In the social sciences and humanities, this is rarely achieved through group work. I suspect that this has something to do with why our fieldwork is, by and large, an individual activity as well. After all, it is at this stage of the research process that you collect the evidence necessary for making your all-important original contribution.

When I conducted my PhD fieldwork in Madagascar, I intuitively adopted the mindset of a solo investigator. Sure there were plenty of other people involved: research assistants, gatekeepers, participants, community leaders, and local acquaintances. But at the end of the day, the research was mine and I called the shots.

Compare that characterization to the fieldwork that I conducted in Papua New Guinea last month. In this instance, I was conducting research with a colleague from PNG and had brought along an Australian student to work as our RA. The research was also embedded in a longstanding relationship that my colleague has with our host community. So the research was not only mine and ours (the three-person research team), but also theirs (the participants). Coming up with an agenda that suited everyone required negotiation and flexibility. Not everything went to plan. But the three-person research team configuration in particular proved incredibly beneficial.

While my colleague and I facilitated the research activities, our RA documented the process. This meant that I could focus all of my energy on the participants and our interaction without worrying about whether the video cameras were working or trying to simultaneously lead a discussion and take detailed field notes. In the evenings, all three of us would get together to go over the day’s activities; often we recorded these conversations as well. Initially, it was surprising how we had each picked up on quite different details. Sometimes this could be attributed to vantage point, at other times to language skills. There is no doubt, however, that three heads were better than one.

This experience has made me wonder how much richer my PhD research might have been had I adopted a similar approach in Madagascar. I did arrive in the country on my own, but I could have easily built up the sort of three-person team described above. In fact, simply fostering a more collaborative relationship with my research assistants, rather than using them primarily as translators, might have made a considerable difference. It would have taken some work early on to train someone in documentation skills or set a precedent for discussing the day’s activities before heading our separate ways, but I now suspect that it would have been well worth the effort in the long run.

So, if you are starting to think about your own fieldwork, I would encourage you to consider how you might adopt a collaborative approach. It is still up to you to do the work, collect the evidence you need and write it up in a compelling way. But in the heady chaos of fieldwork, collaboration has strong advantages over going it alone.

It’s ok to fail

In his book A Short History of Nearly Everything, Bill Bryson tells the harrowing story of Guillaume le Gentil:

Le Gentil set off from France a year ahead of time to observe the transit [of Venus] from India, but various setbacks left him still at sea on the day of the transit – just about the worst place to be since steady measurements were impossible on pitching ships.

Undaunted, le Gentil continued on to India to await the next transit in 1769. With eight years to prepare, he erected a first-rate viewing station, tested and retested his instruments, and had everything in a state of perfect readiness. On the morning of the second transit, June 4, 1769, he awoke to a fine day, but just as Venus began its pass, a cloud slid in front of the Sun and remained there for almost exactly the duration of the transit: three hours, fourteen minutes, and seven seconds.

Stoically, le Gentil packed up his instruments and set off for the nearest port, but en route he contracted dysentery and was laid up for nearly a year. Still weakened, he finally made it onto a ship. It was nearly wrecked in a hurricane off the African coast. When at last he reached home, eleven and a half years after setting off, and having achieved nothing, he discovered that his relatives had had him declared dead in his absence and had enthusiastically plundered his estate.

This is what we’ve chosen as researchers: a life of failure and rejection. In all seriousness, though, research is a process of trial and error almost by definition. It is hard. We do not always succeed. And that’s ok.

If our hypotheses were always spot on, if our procedures always worked exactly as expected – if life was really that predictable – there wouldn’t be much point in conducting research at all. Thankfully for those of us who love research, there are still plenty of things that we don’t know and don’t understand that require investigation. That said, the arduous process of developing new knowledge is replete with surprises and setbacks.

Not knowing any more about le Gentil or his story than what’s written above, I would still question the assertion that he “achieved nothing.” He may not have achieved what he set out to, but that should not by default mean that the entire adventure was without merit. I suspect that the experience of spending eight years in a foreign culture mastering his instruments must have had some unanticipated (and perhaps undocumented) benefits. In my own case, arriving in the field only to find my methods unsuitable was a fortuitous fork in the road. A nightmare at the time, this wholly unexpected scenario presented an opportunity to change tack and experiment with visual methods. Five years later, visual methodology is at the core of my research agenda. It hasn’t been an easy journey, but it has certainly been an interesting one.

In some ways, I have also been incredibly lucky. Although my PhD fieldwork did not go at all according to plan, my essentially made-up method worked well enough that I was able to return home with sufficient data to successfully complete my thesis on schedule. Not everyone is so lucky. And I’m not so lucky all of the time. Sometimes despite doing everything right, our research still goes awry. A cloud passes in front of the sun. What then?

I don’t know when or how it started, but a culture has developed in academia that rewards ‘success!!’ at the expense of knowledge and understanding. We are under enormous pressure to get it right. Some, though not all, of this pressure comes from the need to publish (‘as much as possible!!’). Journals accept papers that present significant (i.e. positive) findings. Professor Keith Laws recently observed that:

This publication bias* is pervasive and systemic, afflicting researchers, reviewers and editors – all of whom seem symbiotically wed to journals pursuing greater impact from ever more glamorous or curious findings.

He goes on to say that the solution is not the creation of special journals that publish negative or null findings. (An idea I’ve personally heard discussed on more than one occasion.) Instead, Laws argues that we need to make room for these “unloved” findings in mainstream journals. True, this depends in part on the cooperation of reviewers and editors. It also depends on us; we supply the content.

About a year ago, I submitted a manuscript to a top methodology journal. The article details three attempts at photographic data collection, two of which were only moderately successful. The third attempt was undertaken in conditions that were far from ideal and was largely unsuccessful as a result. One reviewer picked up on this, questioning why I chose to proceed with the research. I responded truthfully that that’s the nature of my work. The article is now in press.

In a previous post, Monica voiced concern that university metrics encourage the mass-production of ‘plywood’ rather than oak- or mahogany-quality research. The expectation that our research will churn out positive results (within a 2-3 year timeframe) compounds the problem and changes the very nature of the endeavor. Sometimes your procedure won’t go to plan. Sometimes your results won’t be what you expected. Sometimes a cloud passes in front of the sun at exactly the wrong moment. That’s the harsh reality of research. And, it’s ok.

(And if we’re bold, we can even get it published: warts, failings and all.)

*I don’t think that psychology is so different from other social science disciplines.

Making up for Lost Time: Some thoughts on working from home

It has been a while since my last post. The Christmas holidays soon gave way to January; February came and went in a blur; and here we are. Time flies – whether you’re having fun or not. (And particularly when you’re coming up against a deadline, whether for your PhD or a journal publication.)

I also made the decision that I wouldn’t post merely for the sake of posting. There is far too much nonsense populating the internet already without me adding to it. For the past several weeks I haven’t felt like I had anything worthwhile to say.

…And then Marissa Mayer (the CEO of Yahoo) announced that Yahoo employees would no longer be allowed to work from home.

For many PhD students, working from home is not so much a choice as an inevitability. When I was completing my PhD, office space was at a premium. Those with teaching responsibilities had priority, but with three of us sharing a tiny room containing two desks, the only time I set foot in that cubbyhole was during my office hour. I could be wrong, but I suspect that my experience was not all that unusual. I had a friend at a different university who even held her office hour in a coffee shop.

So, given that many of us work from home – for lack of an office or otherwise – how can we make sure that we work from home effectively?

The decision at Yahoo has prompted a fair bit of discussion on this topic. The Guardian, for instance, recently posted 5 “Golden Rules” of working from home. I found two of these particularly relevant to my experience of writing a PhD thesis from home.

The first golden rule is to “make a sacred space.” Even if you aren’t going in to the office every day, it is still a good idea to have a dedicated workspace. Easier said than done, you might be thinking. Few PhD candidates that I’ve known have had a spare room that could be converted into a home office. Odds are, you’re probably living in a shared house or, like me, in a studio apartment. Even in cramped conditions, it is still possible to carve out a little corner for your regular workspace.

The other option is to adopt a coffee shop. I wrote my entire Masters dissertation and the better part of my PhD thesis at Starbucks. I would aim to be there by 9:30am and would often stay until 3 or 4 o’clock in the afternoon. Whenever possible, I would sit in the same spot. Whether at home or in a coffee shop or someplace else, your sacred workspace should be somewhere that you actually like to be. You’re unlikely to be at your most productive in an environment that you hate (like my 3 person, 2 desk cubbyhole of an office). I’ve yet to find a coffee shop in Brisbane where I can work as effectively as I did in York. Now when I work from home, my sacred space is my kitchen table.

The second golden rule is to “go on a digital diet.” For many of us, checking our email (or facebook or twitter) has become a compulsion. The internet can be particularly seductive when you’re meant to be writing a thesis chapter. Pausing to check a reference on the electronic library catalogue is the first step down the slippery slope to online procrastination.

The Guardian rules suggest rationing your time spent online, using an internet blocker if necessary. I’ve found that it is best if I get up and start writing as soon as possible. If I can get from the bedroom to the kitchen table without doing anything other than brushing my teeth and making a cup of coffee, it will be a good writing day. By contrast, if I check my email first thing in the morning (which, coincidentally, generally happens on days that I’m in the office), I almost inevitably end up succumbing to the various other demands on my time and am lucky if I get any writing done at all.

As in all things, moderation is key. And, for me, the key to cutting down on digital distractions is to implement a working-from-home routine that cuts out those distractions altogether for at least four hours. If I need to check an obscure reference or follow up on a particularly relevant journal article, I make a note of it and do all of my checking in the afternoon.

We all work differently – the main thing is to get the work done. As a PhD candidate, you have the luxury of setting your own hours and maintaining your own schedule, but you should still be putting in full days. Establishing some rules of working from home for yourself can make the difference between a drafted thesis chapter and a whole lot of lost time.