
Making the most of your Postdoc

With just days remaining of my Postdoc, I thought it would be worth reflecting on what I’ve learned over the past two years. Let me preface this post by saying that, like every PhD, every Postdoc is different. My experience is just that: a single experience. But I’ve learned a few lessons along the way that, in retrospect, I wish I had known in advance.

Postdoctoral fellowships are shrouded in mystery in much the same way as PhDs. In both cases, I had only a vague notion of what these commitments would involve before I got started. When the opportunities arose, I jumped at them. After all, who doesn’t want a Postdoc, right? I only found out what my Postdoc would really entail once I was already in the midst of it.

Sink or swim…

Tip #1: Find a good mentor. Look far and wide if you have to. Some people get lucky and find a mentor either in their PhD supervisor or their professional supervisor in the department hosting their Postdoc. If that’s you, fantastic. If that’s not you, don’t panic. There are various ways that you can go about finding a mentor. A good way to begin is by reading the profiles of faculty members at your host institution, looking out for people with similar interests to your own or who have built the sort of career you aspire to. I have found that well-established academics nearing the end of their careers and Emeritus Professors are generally more interested in fostering the development of young academics than mid-career colleagues who are (understandably) preoccupied with keeping their own careers on track.

Conferences and training courses also offer opportunities for meeting potential mentors, particularly practitioners working in your field. These people might not be able to coach you on the ins and outs of building an impressive academic profile, but they can provide invaluable advice on broader issues of professional development. A mentor working outside of the university sector can also offer an alternative perspective to counter the advice you receive from within the academy.

Tip #2: Write a book. Seriously. While I was on the lookout for a mentor from day one (and found two), I left the book project far too late. When I started my Postdoc, I vaguely knew that some people turned their PhD theses into books. But, to be absolutely frank, I wasn’t terribly interested in spending another year or more with a 300-page document that I thought I had just finished. I got caught up in the excitement of planning my next research project, put my thesis on a dusty shelf, and concentrated on journal publications. And then, after about 18 months, I learned that it is fairly standard practice to use a Postdoc to write a book. When I paused to assess my surroundings, I realized that approximately half of my Postdoc friends either had books already in print or complete manuscripts under review. Great.

Six months later, I have a book in the works myself, but writing it will carry over into my new, non-academic life. “Why bother?” you might be asking. The answer is simple: in this uncertain job market, I want to keep my options open. I am moving from one contract position to another, and will be looking for my next job in two years’ time. If potential employers expect to see a book on my CV alongside my Postdoc, I want to make sure I’m covered.

Tip #3: Find ways to achieve balance. This is a cutthroat business we’re in. If you think managing a PhD is overwhelming, hold on tight, because you haven’t experienced anything yet. Publication records, citation scores, speaking engagements, teaching assessments, records of professional service – all of these will be quantified to determine your professional worth. Without a strategy for maintaining some semblance of balance, it is easy to be swamped. The best advice I’ve come across for pursuing balance as a junior academic comes from Radhika Nagpal’s article The Awesomest 7-Year Postdoc or: How I Learned to Stop Worrying and Love the Tenure-Track Faculty Life. All seven of her strategies are worth reading, but I found strategies 4 (work fixed hours and in fixed amounts) and 5 (try to be the best whole person you can) particularly insightful. And encouraging.

Good luck!

Book Review: The Myth of Research-Based Policy & Practice

I recently reviewed The Myth of Research-Based Policy & Practice for SAGE Methodspace. While this book will be of limited relevance to many PhD students, Chapter 6 provides a useful discussion of quality as it pertains to qualitative research practice that may be helpful to those of you coming to terms with your own epistemological and paradigmatic leanings.

Hammersley, M. (2013). The Myth of Research-Based Policy & Practice. London: SAGE.

First, a confession: I was drawn in by the catchy title. Working broadly in the area of international development, I frequently encounter appeals for research that can inform policy (e.g. situation analyses, monitoring & evaluation, impact assessments). As an early career researcher, I also feel intense pressure from within the university sector to demonstrate either that my research is ripe for commercialization or that it has a social impact (i.e. that it can influence decision makers). So when the title of this book promised to expose research-based policy/practice as a myth, it immediately caught my attention.

I am also quite interested in the politics of evidence. Who determines what counts as evidence? And by what standards? These questions are inherently political, snarled in complex webs of power and influence so pervasive they are easily overlooked. So if we accept (at least initially) the premise that research can (or should?) inform policy/practice, we need to carefully consider the standard of evidence required by policymakers and practitioners, as well as what this means for our own research practice.

The Myth of Research-Based Policy and Practice has two stated objectives: first, to broadly consider ‘what counts as knowledge’ and then to expose ‘the limits of what counts as knowledge in evidence-based policymaking’ (p. 1). The book goes some way toward achieving both. The Introduction provides a useful overview of the history of evidence-based/informed policy, charting its path from medicine to education and other policy areas encompassed by the social sciences. This historical background is significant in that it clearly illustrates how randomized controlled trials (which provide a particular type of evidence suitable for answering certain types of questions in a medical context) became the gold standard for research-based evidence across a broad spectrum of social policy areas. This ‘positivist conception’ of the social sciences, moreover, has little time for socially grounded or ‘critical’ research that adheres to alternative epistemological and paradigmatic positions. And therein lies the problem. As Hammersley notes, “…a grand conception of research is widely shared among social scientists: it is often assumed that the knowledge they produce can generate conclusions that should replace or correct the practical knowledge of actors, and that this will bring about substantial improvement in the world” (p. 9). But not all evidence is created equal. And, as the author points out, practical knowledge also has a role to play in informed decision making.

With this in mind, I found Chapters 2 through 4 (which address the issues raised above in more detail) particularly interesting and well developed. Living in our own epistemological bubbles, we rarely pause to consider – let alone critically question – the nature of evidence. Hammersley urges us to take these questions seriously, further differentiating between evidence and expertise. While it could be debated whether the author convincingly demonstrates that evidence-based policymaking/practice is a myth, he certainly exposes the limitations of evidence produced by social science research in this context.

The book has been written so that each chapter can stand on its own and be read independently; this is both a strength and a weakness. While there are some advantages to this format (and I know that more publishers are moving in this direction), the book as a whole seems somehow less than the sum of its parts. At Chapter 7, it takes a sharp turn, shifting focus from the theoretical and philosophical issues that underpin research and ‘evidence’ toward, first, action research as a particular research practice, and then different approaches to reviewing literature. As systematic reviews constitute one element of the ‘gold standard’, I can understand why the topic of literature reviews is relevant; however, I am not convinced that dedicating a full third of the book to them is justified. The lack of a concluding chapter means that the book comes to an abrupt stop, without tying off the various threads of the argument.

In sum, I suspect that this book will appeal to scholars frustrated by growing demands that their work produce particular sorts of outcomes, and to those interested in phronetic social science. Chapter 6, ‘The question of quality in qualitative research’, might also be of interest to PhD students as they discover their own epistemological and paradigmatic leanings. It is a book worth dipping in and out of (particularly the early chapters), which I suspect may have been its aim all along.

Self-discipline

A PhD is all consuming. Unlike other jobs that you can leave at the office, a PhD takes up residence in your head (and your living room). It goes home with you in the evening. It hangs around on the weekend. It even accompanies you on holiday (if you’re lucky enough to get one). Of the many PhD students I’ve known over the past five years, I have encountered only one who successfully maintained a 9-to-5, 40-hour work week.

So where does that leave the rest of us? Whatever your work pattern, everyone needs a break sometimes. In fact, studies have shown that working more than 40 hours per week does not increase productivity in the long run. It can be incredibly difficult with 12 articles still to read and a draft chapter due next week, but you need to learn to discipline yourself and the PhD-monster lurking over your shoulder. I’m still learning, but I’m getting better.

One aspect of successful self-discipline is actually getting the work done: setting reasonable expectations and following through on them. Several weeks ago, I set myself the goal of writing uninterrupted (no talking, checking email, searching the internet or fidgeting with my phone) for at least 30 minutes every day. It turns out that even this modest task is easier said than done. So I bought a kitchen timer – a hot pink one that sits on my desk and reminds me of my writing obligation. Once I set the timer, writing becomes my sole occupation until the ticking stops and the bell tells me I’m free. I’m not suggesting that you run out and buy your own kitchen timer, just that you think about how you might manage your working time more effectively, in ways that enhance your productivity without enslaving you to your PhD.

My second tip for successful self-discipline is little rewards. Once you’ve accomplished whatever it was you set out to do, allow yourself a treat. Take a short break, go for a walk or, better still, bake some brownies!

Delicious Fudge Brownies

(Adapted from ‘On Delicious Fudge Brownies’ in The Pirates! In an Adventure with Communists, by Gideon Defoe)

  • 200g dark chocolate
  • 210g butter
  • 40g vegetable oil
  • 5 eggs
  • 400g sugar
  • 50g honey
  • ½ tsp vanilla
  • 125g flour
  • 50g cocoa powder
  • ½ tsp sea salt
  • chopped nuts (optional)

Melt chocolate and butter together in a pan over low heat. Remove from heat and stir in remaining ingredients. Pour into a baking pan lined with parchment paper and bake for approximately 30 minutes or until the centre is set. Leave to cool on a wire rack before cutting.

Collaborative Fieldwork

Having the opportunity to conduct fieldwork is, for me at least, one of the best bits of being a researcher. What’s not to love about travelling to exotic (or even not-so-exotic) locations and meeting interesting people? I also enjoy the serendipity and unpredictability of fieldwork. Despite the best-laid plans, there is no knowing what might happen tomorrow or when your research could take an unexpected turn.

Now that I have completed several trips to ‘the field’, I can see that my own approach to fieldwork has changed considerably. Perhaps this is just indicative of a learning curve that all researchers experience. In any case, I thought that I would raise the idea of collaborative fieldwork for those of you who are preparing to set out for ‘the field’ yourselves.

PhD research is, almost by definition, an individual pursuit. Earning your doctorate will ultimately depend on your ability to demonstrate to the examiners that you have made an original contribution to knowledge. In the social sciences and humanities, this is rarely achieved through group work. I suspect that this has something to do with why our fieldwork is, by and large, an individual activity as well. After all, it is at this stage of the research process that you collect the evidence necessary for making your all-important original contribution.

When I conducted my PhD fieldwork in Madagascar, I intuitively adopted the mindset of a solo investigator. Sure, there were plenty of other people involved: research assistants, gatekeepers, participants, community leaders, and local acquaintances. But at the end of the day, the research was mine and I called the shots.

Compare that characterization to the fieldwork that I conducted in Papua New Guinea last month. In this instance, I was conducting research with a colleague from PNG and had brought along an Australian student to work as our research assistant (RA). The research was also embedded in a longstanding relationship that my colleague has with our host community. So the research was not only mine and ours (the three-person research team), but also theirs (the participants). Coming up with an agenda that suited everyone required negotiation and flexibility. Not everything went to plan. But the three-person team configuration in particular proved incredibly beneficial.

While my colleague and I facilitated the research activities, our RA documented the process. This meant that I could focus all of my energy on the participants and our interaction without worrying about whether the video cameras were working, or trying to simultaneously lead a discussion and take detailed field notes. In the evenings, all three of us would get together to go over the day’s activities; often we recorded these conversations as well. It was surprising how each of us had picked up on quite different details – sometimes attributable to vantage point, at other times to language skills. There is no doubt, however, that three heads were better than one.

This experience has made me wonder how much richer my PhD research might have been had I adopted a similar approach in Madagascar. I did arrive in the country on my own, but I could have easily built up the sort of three-person team described above. In fact, simply fostering a more collaborative relationship with my research assistants, rather than using them primarily as translators, might have made a considerable difference. It would have taken some work early on to train someone in documentation skills or set a precedent for discussing the day’s activities before heading our separate ways, but I now suspect that it would have been well worth the effort in the long run.

So, if you are starting to think about your own fieldwork, I would encourage you to consider how you might adopt a collaborative approach. It is still up to you to do the work, collect the evidence you need and write it up in a compelling way. But in the heady chaos of fieldwork, collaboration has strong advantages over going it alone.

It’s ok to fail

In his book A Short History of Nearly Everything, Bill Bryson tells the harrowing story of Guillaume le Gentil:

Le Gentil set off from France a year ahead of time to observe the transit [of Venus] from India, but various setbacks left him still at sea on the day of the transit – just about the worst place to be since steady measurements were impossible on pitching ships.

Undaunted, le Gentil continued on to India to await the next transit in 1769. With eight years to prepare, he erected a first-rate viewing station, tested and retested his instruments, and had everything in a state of perfect readiness. On the morning of the second transit, June 4, 1769, he awoke to a fine day, but just as Venus began its pass, a cloud slid in front of the Sun and remained there for almost exactly the duration of the transit: three hours, fourteen minutes, and seven seconds.

Stoically, le Gentil packed up his instruments and set off for the nearest port, but en route he contracted dysentery and was laid up for nearly a year. Still weakened, he finally made it onto a ship. It was nearly wrecked in a hurricane off the African coast. When at last he reached home, eleven and a half years after setting off, and having achieved nothing, he discovered that his relatives had had him declared dead in his absence and had enthusiastically plundered his estate.

This is what we’ve chosen as researchers: a life of failure and rejection. In all seriousness, though, research is a process of trial and error almost by definition. It is hard. We do not always succeed. And that’s ok.

If our hypotheses were always spot on, if our procedures always worked exactly as expected – if life were really that predictable – there wouldn’t be much point in conducting research at all. Thankfully for those of us who love research, there are still plenty of things that we don’t know and don’t understand that require investigation. That said, the arduous process of developing new knowledge is replete with surprises and setbacks.

Not knowing any more about le Gentil or his story than what’s written above, I would still question the assertion that he “achieved nothing.” He may not have achieved what he set out to, but that should not by default mean that the entire adventure was without merit. I suspect that the experience of spending eight years in a foreign culture mastering his instruments must have had some unanticipated (and perhaps undocumented) benefits. In my own case, arriving in the field only to find my methods unsuitable was a fortuitous fork in the road. A nightmare at the time, this wholly unexpected scenario presented an opportunity to change tack and experiment with visual methods. Five years later, visual methodology is at the core of my research agenda. It hasn’t been an easy journey, but it has certainly been an interesting one.

In some ways, I have also been incredibly lucky. Although my PhD fieldwork did not go at all according to plan, my essentially made-up method worked well enough that I was able to return home with sufficient data to successfully complete my thesis on schedule. Not everyone is so lucky. And I’m not so lucky all of the time. Sometimes despite doing everything right, our research still goes awry. A cloud passes in front of the sun. What then?

I don’t know when or how it started, but a culture has developed in academia that rewards ‘success!!’ at the expense of knowledge and understanding. We are under enormous pressure to get it right. Some, though not all, of this pressure comes from the need to publish (‘as much as possible!!’). Journals favour papers that present significant (i.e. positive) findings. Professor Keith Laws recently observed:

This publication bias* is pervasive and systemic, afflicting researchers, reviewers and editors – all of whom seem symbiotically wed to journals pursuing greater impact from ever more glamorous or curious findings.

He goes on to say that the solution is not the creation of special journals that publish negative or null findings (an idea I’ve personally heard discussed on more than one occasion). Instead, Laws argues that we need to make room for these “unloved” findings in mainstream journals. True, this depends in part on the cooperation of reviewers and editors. But it also depends on us; we supply the content.

About a year ago, I submitted a manuscript to a top methodology journal. The article details three attempts at photographic data collection, two of which were only moderately successful. The third attempt was undertaken in conditions that were far from ideal and was largely unsuccessful as a result. One reviewer picked up on this, questioning why I chose to proceed with the research. I responded truthfully that this is the nature of my work. The article is now in press.

In a previous post, Monica voiced concern that university metrics encourage the mass-production of ‘plywood’ rather than oak- or mahogany-quality research. The expectation that our research will churn out positive results (within a 2-3 year timeframe) compounds the problem and changes the very nature of the endeavor. Sometimes your procedure won’t go to plan. Sometimes your results won’t be what you expected. Sometimes a cloud passes in front of the sun at exactly the wrong moment. That’s the harsh reality of research. And, it’s ok.

(And if we’re bold, we can even get it published: warts, failings and all.)

*I don’t think that psychology is so different from other social science disciplines.