How Researchers Can Get Their Evidence Used in Policy (Full Transcript)

WHO veteran Gauden Galea explains what policymakers need: timely, synthesized, actionable evidence—plus context, narrative, and clear “so what” implications.

[00:00:01] Speaker 1: Millions of researchers hope their work will influence policy, but very few deeply understand how decisions get made inside ministries of health, ministries across government, big organizations like the World Health Organization, and by prime ministers and leaders around the globe. Today, I'm tremendously excited because we have Dr. Gauden Galea joining us. He's been inside the World Health Organization as a director for decades, and he's going to go deeper to help us understand not only what really moves policy, but what you can do as a researcher to move the needle in your own work and beyond. So today, what we're going to cover is how evidence is actually used in policymaking, the common mistakes researchers make when trying to influence policy, and how those of you who are early career researchers can be more impactful. I know many people watch this channel because they want to bend the arc of history and make a difference in some small, lasting way. And finally, we're going to cover how to design research that really gets the attention of policymakers who have influence. For those of you just joining, I'm Professor David Stuckler, and I lead FastTrack, a research mentorship program. We are committed to open access: taking those insights that are often handed down in the ivory towers from mentor to mentee and making them available to all. So with that, a big welcome. I see several of you joining us again. Hi, Mynari, good to have you. We'll take questions afterwards. But without further ado, I want to bring on Dr. Gauden Galea. Gauden, thank you so much for joining us. You almost need no introduction; you've had many lives, from working inside WHO to even being a policymaker. But let me dive straight in. Gauden, what do you see as the biggest mistake researchers make when trying to influence policy?

[00:01:54] Speaker 2: So what? I think I can give the answer in just those two words. As a peer reviewer, as a policymaker, as a person interested in public health research over time (and I make no apologies that the examples I give come from public health, because that's the area I am most comfortable in, having worked decades in it, three of those in WHO), I always ask myself, whatever paper I am reading: so what? What do I do on Monday morning? Much of my scientific reading happens on Friday afternoon and on Saturday, as I top up from contents pages and papers shared by colleagues over the last week. What happens next? Has the researcher anticipated that question, helped me answer it in some way?

[00:03:00] Speaker 1: Gauden, sorry to cut you off, but I just want to emphasize that the so what component is so important, because I think a lot of researchers think, oh, I produced the paper, it got published in a journal, and it stops there. What you're saying goes much deeper. It even strikes to the heart of the question they ask, the research they did in the first place. I didn't mean to cut you off. I know you have a lot of gems of insight for us today, but please keep going.

[00:03:27] Speaker 2: Yes, I cannot agree more. I was first exposed to that question, rather in a challenging fashion, by Peter Faro, who has sadly passed away, and who was at one point editor-in-chief of the International Journal of Epidemiology. Back in my home country, Malta, he served as head of department for a period of time, and I looked at him as a mentor. His own research really had an impact: he studied congenital iodine deficiency syndrome and the origins of cerebral palsy, major epidemiological questions, and he made a huge difference to untold lives through his work. When I was a young researcher trying to propose my research for the membership, he wouldn't even discuss the methods before I could answer: why was I asking this question? What difference would it make to public health? He would also pose a second so what: you come to a question with a background of your own. You are an anthropologist, you are a physician, you are an economist. So the other so what question is, what does your background contribute to your ability to answer this question? The same tobacco control question answered by a physician, an economist, a social scientist, or an anthropologist would have very different leanings, very different implications for action. If you leave that out, I think that's the biggest mistake. And remember, policymakers work in very busy offices, and let me tell you, I wasn't a minister; ministers are much busier than I ever was. They're not going to be reading papers that are discursive and elegant and neat, maybe, but not really making an impact. Why am I reading this? What should I do?

[00:06:09] Speaker 1: Just to say, you're raising a lot of things I want to come to, especially about making meaning and infusing your research with purpose, which I see researchers sometimes drift from over time. We can come back to that. But in a subtle, nuanced way, what you're saying, if I can take some liberties, is that a lot of researchers are doing work that doesn't have that so what component, work that gets lost in discursive brambles that are just inaccessible to policymakers. Is that in line with what you're saying in a more diplomatic way?

[00:06:52] Speaker 2: Let me exclude one meaning. I am not saying that all research needs to be applied research. There is a lot of space for basic science, for exploring ideas, for describing concepts. If I'm allowed one example: some colleagues have invited me in, and we are looking right now at how the literature defines small states. In a paper just talking about definitions of small states, it's not apparent what the purpose is, what its meaning is. Are we just adding a more rigorous definition to a glossary, or does it have more meaning? Once you start to ask the so what in a question like that, you start to say: ah, definitions are political, they affect lives. How do you apply such a definition if you are working for the Gates Foundation, if you are working for WHO, if you are in the World Bank, if you are living in such a country? Is it small just because it is small, or is there an implication for action? If you are distributing COVID vaccines, should you use a different sliding scale to treat whole populations in small states and make a population-level impact? Or do you just use the same proportional distribution, give them 100 vaccines for a whole population, and make no impact? So definitions of what is small can be presented purely as a review exercise, but they can also be used to inform policy and, even without being ideological, as part of the advocacy on what is correct and right to be done in relation to small countries.

[00:09:16] Speaker 1: Again, that's a great example with small states. One of the things you just pointed out, if I got this right, is that all research is political, whether you want it to be or not; it interacts with a political economy of ideas. This goes back to Foucault: knowledge is power. A lot of researchers sometimes think, well, my goal is to be dispassionate, my goal is just to produce evidence and stop there. And that's well and good. But a lot of what you're saying is: no, that doesn't take place in a vacuum. On one side, yes, you want to use rigorous, robust methods and not have bias. But it's OK to bring that human side, your passion, in to help you ask good questions. And it's a great example of why implications matter: thinking about what the implications are of those definitions you raise, of what you're putting in print. Really, really helpful. I want to step back for a second, though, and take us to the beginning, because I know you started off as a doctor, then shifted out from the clinic and rose through the upper echelons of policymaking. Where did it all start for you, and where did you get your passion for making a difference?

[00:10:36] Speaker 2: We've only got an hour, so I'll give you the short version. I trained as a doctor, as a physician. I loved the profession, but I also saw, at least in my time, its major limitations, and felt that there had to be something with broader scope. I was always drawn to public health. Eventually, after my public health training, I was drawn to international and global health, and hence my move. After 13 years of working at the national level, I moved to the international level and joined WHO, where I spent the rest of my career. Parallel to that was a personal interest in computing, eventually data science, and now digital health and AI. It's been a passion that at times was a hobby, but at times felt much more than that, and it's great that I've reached a stage in life where I can give both of them their rightful attention. I've always used data science in one form or another as a strong underpinning of my public health practice, whether it was investigating a food poisoning outbreak in the mid-1980s or arguing for the preventable and treatable fractions of avoidable non-communicable diseases in Europe. Or, across that whole range, in our collaboration with you, David: the listing of a set of cost-effective interventions that can prevent and control non-communicable diseases within a short period of time, what we termed the quick buys, in what I think is our most recent paper this year.

[00:13:07] Speaker 1: Gauden, yes, I have had the good fortune to join you on, I haven't taken full stock, but at least a dozen papers over the years. For those of you who want to look Gauden up, we'll share his contact info and where you can find him in the description. On LinkedIn, you have the line that you're a Pythonista. That's, I think, relatively unique at such a high level; you might be the only Pythonista who has ever been in the upper echelons of the World Health Organization, much less the UN. It leads me to my next question. As somebody who not only interprets and uses evidence but also produces it, what's your perspective on the kinds of evidence that get taken seriously? Which evidence gets picked up, and which gets ignored?

[00:14:03] Speaker 2: Let me return the compliment, then. The first time I noticed the name David Stuckler, I thought this must be a venerable, older researcher, when I read the paper connecting cardiac deaths with banking crises. The beauty of that paper was that it was not perfect. It was an ecological study that had tried to control for as many variables as possible. It was what it was: looking at a banking crisis one year and cardiac deaths the next. But though not perfect, it was perfectly correct, and it was perfectly timed, at the beginning of the financial crisis of 2008. I use you as an example, with that paper and with your book on austerity, of one attribute of evidence that is taken seriously: it is timely, even if it is not perfect. If it takes you 10 years to make it perfect, the Overton window, the opportunity in politics to make a difference, has passed. Good evidence is also a synthesis, not single studies. Systematic reviews have strong worth; they're accessible even to young researchers and could even be a first product. If you're able to synthesize a broad spectrum of data, you save time for the policymaker, and that is the sort of paper that is then used to develop guidelines. Guideline review committees within an organization like WHO assign great value to well-done systematic reviews and meta-analyses. Good evidence also has some element of cost, impact, and implementation data. In pulling together our paper on the quick buys, we were rigorous, we did not skip; it was a whole series of systematic reviews. But it was interesting how few studies we could surface that had actually been applied in the real world. A policymaker wants to know not just that there may be an impact, but when that impact is going to happen, and this is a question that researchers don't ask. We talk about return on investment, but you need a time horizon. You need to be able to say: I'm investing in such-and-such an intervention now, whatever the field of endeavor (we happen to be talking about public health), so when is the change going to happen? And finally, and maybe we'll talk more about it later, good evidence fits existing policy tools. What do I mean by that? Over the years, I have seen the evolution of research on, say, tobacco control. At the beginning, having established the strong risk of certain endpoints, cancer and cardiovascular disease, with smoking, and having shown the burden, it became OK to put forward research about the misbehavior of industry. Big tobacco became a target, with CEOs standing up in Congress and swearing to the safety of their product. At one point, it was OK simply to criticize corporate power, but today's policymaker has heard all that before. Now they want to know what evidence you have that the policy tools they are used to working with are going to have an impact. For example, we have moved from saying, look how bad the tobacco industry is, to saying: you have these two types of taxes, and this is the level of taxation that has been shown through a natural experiment to have an impact on consumption and on revenue.
So the policymaker needs to see the feasibility within the tools available to them: setting of standards, regulation, and so on.
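To illustrate the time-horizon point above, here is a minimal Python sketch with entirely hypothetical numbers (the cost, benefit, and discount rate are assumptions for illustration, not figures from the conversation). It shows how the same intervention can look unattractive over a three-year horizon but strongly positive over ten, which is why "when does the change happen?" matters as much as "is there a return?":

```python
# Illustrative sketch with hypothetical numbers: why stating a time
# horizon changes the policy case for the same intervention.

def npv(cashflows, discount_rate):
    """Net present value of yearly cashflows, year 0 first."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cashflows))

COST = 10.0            # upfront cost in millions (assumed)
ANNUAL_BENEFIT = 3.0   # yearly savings in millions (assumed)
RATE = 0.03            # 3% annual discount rate (a common convention)

for horizon in (3, 5, 10):
    flows = [-COST] + [ANNUAL_BENEFIT] * horizon
    value = npv(flows, RATE)
    print(f"{horizon:>2}-year horizon: NPV = {value:6.1f}M, "
          f"return per unit invested = {value / COST:5.2f}")
```

With these assumed figures, the three-year horizon shows a negative net present value, while the ten-year horizon shows a strongly positive one; the same evidence, framed without a time horizon, answers neither question a policymaker actually has.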

[00:20:02] Speaker 1: Gauden, that's really, really helpful. One of the things you're bringing out is the importance of real-world evidence that fits with the current policy framing and discussion: if you do want to inject yourself into policy, you need to start not from what might be intellectually curious or interesting, but from the real conversations and debates policymakers are having. And that's where things like systematic reviews and other tools to rapidly synthesize evidence can be very powerful. Let me take a skeptical view, though. Some would say that policymakers aren't really in the business of evidence; they go find whatever researcher or idea backs up what they already want to do. What would you say to that? In that kind of landscape, what can research really do to influence policy?

[00:20:56] Speaker 2: You're right. We had a paper with Tom Gaziano and Srinath Reddy on evidence for chronic disease prevention and control, in which we reviewed the lines of evidence. It's an older paper, but I remember Srinath bringing up a line distinguishing between evidence-based policy and policy-based evidence. And the reality is indeed that politicians often have a style where they want to do something and they ask the civil servants to find the evidence that justifies it. This is where the researcher who thinks their research is context-free is mistaken. As we develop our research, we need to adopt what might be called context engineering, to borrow a term from the field of artificial intelligence. The research paper is just one product. Alongside it is a network of influences, a reputation, timeliness, different styles, a conceptual model behind it, a link with pain points. If your current government has risen to power funded by the tobacco industry, they're likely to be friends with the tobacco industry, and you may need to focus on alcohol for the time being, and then bring tobacco back in when they're out of power. It sounds like a trivial or made-up example, but this is a real one. You also have to think about how you put your research within a narrative that appeals to the policymaker. So: context, narrative, and you really need to know the policy process you are trying to influence. We've just finished the fourth United Nations high-level meeting, and there were a lot of people trying to influence the political declaration by bringing in a lot of research. Those who started their advocacy after the declaration had been drafted stood no chance of their research being heard or considered. So one needs to know: if you're trying to influence a parliament, or the governing body of an intergovernmental agency, you need to know the cadences, the rhythms of their process, and who the decision makers are, and reach those people at the right time with exposure to the narrative and the meaning behind your message. It's not going to happen automatically.

[00:24:49] Speaker 1: Gauden, that's really helpful. Actually, I see a framework starting to emerge here. I'm going to highlight it on the screen, because I plucked out five big points that are maybe so intuitive to you that you gloss over them, but I want to distill them. First, if you do want to make an impact on policy with your research, you need to answer so what from the very beginning. Second, it needs to be timely, within what Gauden referred to as an Overton window, the political windows of opportunity that open and close. If evidence lands at the wrong time, it is going to fall on deaf ears; there are moments in the policy cycle, moments of opportunity, that you have to seize. Third, Gauden also said not all research has to be applied, but if you do want to influence policy, you probably need to produce more real-world evidence, in the applied space, synthesized in a way that policymakers can actually use to answer practical questions about implementation, costs, and benefits in the real world they wish to affect. Fourth, Gauden, you said: know the context and the policy process. It's critically important to know who might be supportive and who might be opposed to your argument. And fifth, understand where your evidence fits in that political economy of ideas, and frame the narrative in line with that. That's already a lot there. Does that fit with your thinking? Have I got those elements right? Excellent. I just want to follow on from that, though, because you said something important about reaching out to the right policymakers. Some of the people following the channel are earlier-stage researchers. Can they really make an impact without a senior title or somebody famous on their team?

[00:26:40] Speaker 2: Yes, is the short answer. If you take these five points that you have so neatly captured: influence does not require rank. It requires usefulness, utility. That's powerful. If you want to make an impact and you're new, entering a field, I would suggest you become an expert in a narrow, neglected area of policy. It can be anything, so selecting your area of expertise is the first and, I think, most important step. What are the pain points of the policymaker? What are the questions they are asking? Often you will be able to find these fairly easily; politicians will have published them. Say we want to know what the priorities are for any EU commissioner, not just the Commissioner for Health. Well, there is a letter written by Ursula von der Leyen that lists what she is looking for from each of them. If you're interested in anything from development aid to sports or education to health, there is a letter that you should read. If you haven't read it, you don't know what the areas of policy are. The Commissioner for Health, for example, now has a high priority on the cardiovascular action plan, and that's no secret; it's well known throughout the public health field in Europe. So try to find the areas where you can bring utility, the questions you can own, so that people will come to you, because you are now known as the person who has an opinion on that. The second is to be able to translate evidence clearly and quickly. If you're in a university, you have to produce peer-reviewed publications, of course. But you must also be able to produce evidence in policy briefs. Training yourself to produce two-page summaries of the policies that relate to your field of expertise is very, very important. You might look at it as grey literature, as inferior to your formal publications, but this is about producing things cleanly and quickly and targeting them to the right people. I said already: work on synthesis, not just primary studies. And you're not always going to be asked in, and it is hard to start, no doubt. But dare I say, and I use this phrase a bit guardedly, quality comes with quantity. I'm not suggesting you lower your standards, but the more people see you, or see your name, around a certain area of policy, the more they start to think of you when they think of that area. I think this is how everyone who has a name got that name. Everyone started from where you are.

[00:31:14] Speaker 1: Gauden, I sometimes need to pause to take a breath to unpack all the nuggets you're sharing with us. One of the things we do with the FastTrack researchers we work with is aim for three-month timelines to get papers out, because we've calibrated that getting four papers out a year on a topic plants a flag and establishes that you're an expert in that area, gets you picked up on the conference circuit, gets you in demand by policymakers. It sounds paradoxical to say quality comes with quantity, but that's exactly what we find, because you start getting center stage and attracting resources, better data, better research questions, the collaborations that come with that quantity. And we're not talking about mass-produced intellectual landfill, where the paper wasn't even worth the tree that was cut down to print it. Really important. As you were speaking, I wanted to pull up Jasmine's comment, because it really speaks to your point about policy briefs. There are sometimes pressures on researchers to produce only the most rigorous, highest-quality piece of work. I remember when I was doing my master's training back at Yale, there was often a thought that the D.C. expert operates at the master's level, while with our PhDs we're shooting for way up here. Sometimes it's better to inject some evidence into an evidence-free, ideological political debate, to at least get some shreds of technical discussion and some tethering to facts and real data. What Jasmine says here is that it's timely: a very contentious debate surrounding education where she lives is under deliberation, and decisions will be made in the next few months. Well, if you submit a paper to an education journal, you're looking at an average turnaround of three months for peer review and then three months for revisions. Maybe your paper gets out in seven or eight months, and now you've missed that timely Overton window, which was point number two, Gauden. So that's definitely a challenge. I want to pull up a few more of these comments, and I do have a few more questions; I'm just going to put these up on the stream. Scrolling back to a few of them: nice to see some of you. Thanks, Famil, good to have you join us on the channel. If you have any questions for Gauden, do let us know. Mynari says: I've been studying AI policy materials from UNESCO and other platforms. I'd be grateful for your suggestions on how I can meaningfully apply these insights to the framework I'm currently developing. Now, if I were going to pretend to be Gauden for a second, I would step back and say: hey, Mynari, going back to the list we set out, can you answer the question Gauden asked? So what? But I don't want to steal your thunder here, Gauden. What would you say to Mynari, who is obviously working with UNESCO materials, a big UN organization?

[00:34:10] Speaker 2: Indeed, there's AI fever hitting all the fields at this stage, and I don't know what papers you're studying or to what end, but clearly you need to ask yourself: what is my position? What is my background? Are you a computer scientist? Are you an expert on governance? You're going to look at these AI policy papers very differently. And you need to ask yourself why you are looking at the AI policy of UNESCO. Are you trying to extract something from it in order to influence policymaking in your country? Or are you trying to contribute to UNESCO's thinking and to global standards? I think it's wonderful that you are doing that. I find myself consuming AI papers with great avidity; it's a big area of interest for me. But at the same time, it needs the application, the so what. Just before I entered this call, for example, I had a wonderful chat with the people running the Centre for Biomedical Cybernetics at the University of Malta, where I am now honorary professor. Here I was with three people from an engineering background who are experts in signal collection and processing and pattern extraction. They're using machine learning and AI to develop human-computer interfaces and brain-controlled wheelchairs, and exploring thermographic imaging of the skin to try to detect malignant lesions early, possibly even earlier than they are visible in normal visible light. That is a very different sort of AI from someone working on decision support systems within the clinic. So I think you need to break down what your interest in AI is and not get lost in the vast array of temptations; we're in the garden of AI Eden. Become expert in a specific area where you can then become the point of reference.

[00:37:24] Speaker 1: I think that's a really nice point, Gauden. AI is very hot right now. A few years ago, anything that had COVID connected to it would publish really well, but you might not necessarily have wanted to be the COVID expert per se. In this case, it's AI. But then, coming back to what you said so helpfully earlier: AI, but so what? We always like to look at: what's the debate you're connecting into? What's the discussion? What's the question? If you want to influence policy, you're almost saying, work back from the questions the policymakers are asking. If you have a question that a policymaker is asking, you immediately have your so what answered, because you've injected yourself right into a debate. You also gave us a helpful tip: these questions the policymakers are asking are pretty easy to find. You can read newspapers, and you can look at official government websites that set out their agendas and priorities. And I typically find that those working in more social science areas, doing more applied research, can have their cake and eat it too: you can do something that's intellectually interesting and still ask a question that speaks to a very live debate. A lot of our true power as researchers comes from the ability to problematize, to ask those good questions. That's where our real freedom is, and I wouldn't give that away so lightly. Another thing I commonly say is that about 95% of the success of your research comes from that initial step of getting the topic right. But OK, I want to take a couple of other questions here, and I've got a few more for you. We've got one here from Ananda about evidence-based advocacy and the research required in the development sector. Ananda, if I'm interpreting what you're trying to say, it's that sometimes the research follows the advocacy: without some kind of advocacy in the first place, the research doesn't even happen. I'm not even sure. But Gauden, do you want to take a crack at this one?

[00:39:40] Speaker 2: I think there is a deeper meaning here, Ananda. By the way, when I look away, it's because the screen with David's face and your comment is on my left side and my camera is on my right side; hence my looking aside, not because I'm not interested. To your subject: evidence-based advocacy. I love the phrase, because I think it describes what I have done for much of my career in public health, whether in WHO or nationally. Early on, the AIDS pandemic had started and I lived in a highly Catholic country. The very mention of the word condom was enough to spark off months of correspondence in the public newspapers, including, at one point, fundamentalist groups asking for my dismissal from my post. You can, and must, do advocacy in public health all the time as you advocate for the right policy, but you cannot base it on ideology. You have to base it on strong evidence that you have collected and put together. Fast forward to the end of my formal career: in June of 2024, we produced a document on the commercial determinants of non-communicable diseases in Europe. The Institute of Economic Affairs called us half-baked Marxists, but there was no evidence in the paper that could be impugned. Once you enter the highly controversial areas of public health and policymaking, you're going to have your opponents, vested interests who are happy with the status quo, whether it's education or economics or public health or some intersection of those. No advocacy is worth its name unless it is evidence-based. Then you can be as creative as you want in building up a narrative and communicating it in the venues, channels, and experiences you want to expose the population to. But all advocacy in public policy should start from evidence. Unfortunately, we are living in a world where, in the field of mis- and disinformation, a lot of fact-free advocacy is happening, and that is worrying. As I retired, I reflected on how the world has changed from the golden age. It's what old people always think, that their youth was a golden age of some sort. But certainly I was privileged to live in a time when the world was more peaceful and there was more respect for science. The young researcher today has to grapple not only with the rigor of their own evidence, but also with the fact that there are opponents whose job it is to cast doubt on your work and on your motivations.

[00:43:48] Speaker 1: Gauden, you've battled hard over the years. Have you faced attacks, even personal attacks, for your research or your evidence-based advocacy?

[00:44:08] Speaker 2: Yes, the examples I gave are exactly that. I think some of the biggest questions in public health for researchers and practitioners now lie in how you are able to intervene in areas where there are massive vested interests that will directly challenge your position. And you had better be strong on the basis for your positions, because your job will be on the line multiple times. So, yes.

[00:45:12] Speaker 1: That's powerful, Gauden, reminding us that if you do want to influence policy, sometimes be careful what you wish for: you're sticking your head above the parapet, and the dogs may come barking, because in policy there are winners and losers, and your evidence is very likely to fall on one side of a debate. And Ananda, responding to your answer, says: I love the answer. Gauden and Ananda, thanks for the question. But, Gauden, "not for the faint of heart" is a bit of what I'm reading, a little more deeply, in the decades of experience you have. Do you think it's harder to get evidence into policymaking now than it was in the golden age you were just talking about? Is it harder to be a researcher now, or has it gotten easier with new AI tools and research networks?

[00:46:15] Speaker 2: You've already given the answer in your question. I envy researchers beginning today. Coming from a small country myself, if I was doing a literature review in my young years, I would travel to London, to the School of Hygiene where I was trained, and spend a couple of days photocopying papers. I would have done as much of the literature review as I could on the Index Medicus, come up with a list, go to the library there, and, instead of enjoying London, spend two or three days collecting the first round, then pick a paper to snowball into other references, until I felt I had really exhausted the resources. Then I'd come back to write my paper and wait for my next time in London to do the same. And I was privileged; there are many people even today for whom the idea of going to London to photocopy papers would be a luxury. But we all now have access to the Internet. There are publicly available indices, and a search strategy can be built with a Boolean string and, in medicine at least, MeSH terms, medical subject headings. You can put together a literature review. So the mechanics of collecting data are much easier, and the deep research modes of some of the AI models can already give you a good outline for what you want to write. You may have a good collaboration with the chat tool, which, of course, must never write the paper; but it would be silly not to use the tools that already exist, ethically. And I know, David, you've had videos on what ethical use of AI means. Using it ethically to expedite the process, the actual production of the research and the evidence is so much easier. In a way, that is going to create greater competition. And because of how much easier it is, I think there are a couple of issues that come up for young researchers. One is that your research is going to be consumed by the people who train the large language models, and so your research becomes the fodder for someone else's answer without your gaining a citation. It's easier to produce, but how do you keep hold of that contribution? How do you link it to your name? I think it's an important skill to start to think in terms of frameworks and catchphrases that connect with you. If in public health we use the words social determinants, you will think of Michael Marmot, despite the fact that many people have worked on it; there is a close connection. If you think about health promotion, the name Ilona Kickbusch comes up at some point. So what are the conceptual models? And, to be a bit proud of it: if you hear the phrase quick buys in the area of non-communicable diseases, then David and I are part of the team with which that idea is identified. So apart from the question and its application, you may want to start to think about what frameworks, what conceptual models, you stand for. These aren't linked to a specific paper; they are connected to your overall program of thinking. And you don't need to know it before you start. It's something that emerges opportunistically as you go from one stage of your research to another.
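To make the search-strategy point concrete, here is a minimal Python sketch that runs a Boolean/MeSH query against PubMed's public NCBI E-utilities endpoint. The query string itself is a hypothetical example chosen to echo the tobacco-taxation discussion earlier, not one given in the conversation:

```python
# A sketch of a Boolean/MeSH search against PubMed via NCBI's public
# E-utilities API; the query below is a hypothetical illustration.
import requests

QUERY = (
    '("Tobacco Products"[MeSH] AND "Taxes"[MeSH]) '
    'AND ("2015"[PDAT] : "2025"[PDAT])'
)

resp = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
    params={"db": "pubmed", "term": QUERY, "retmode": "json", "retmax": 20},
    timeout=30,
)
resp.raise_for_status()
result = resp.json()["esearchresult"]

print(f"Records found: {result['count']}")
print("First PMIDs:", ", ".join(result["idlist"]))
```

A query like this, refined iteratively and logged, is the modern equivalent of those two or three days in the library: the whole first round of a review, reproducible from your desk.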

[00:51:44] Speaker 1: Gauden, I'm really pleased you mentioned that, because it's almost like branding, almost marketing. It ties back to that comment from a moment ago that I want to highlight again: quality comes with quantity. The way I encourage researchers is often one paper at a time; get the first paper out, which is a big hurdle to cross, but then think about a pipeline of papers, a stream of papers, not one paper trying to accomplish everything. That's almost what you're talking about: planting that flag, establishing your name in an area, and then establishing tools that you become synonymous with. That way, you don't get plagiarized by the AIs, because your name is inextricably tied to the conceptual tools you've brought forward. I think that's a very important point for a lot of researchers watching here, maybe even those who already feel good about publishing papers, on how to get to the next level as a researcher. I just wanted to highlight Sumaya, commenting that she loves the literature review story; it's fascinating. And Jasmine as well, saying it helps debunk the myth that everything has to be figured out before you start. That's a really important comment about these friction forces. I see this happen with researchers a lot: oh, I have to have this, I have to have that, and they don't get anywhere. They don't get the car into gear. These are the people who sit and start organizing their desk, or try to read a million papers, before actually producing any evidence. I want to come back and ask you: with your experience, and knowing what you know now (a perspective you couldn't have had at the outset), if you started off again as an early career researcher who wanted to make the world a better place, to shape the world through research, what would you do?

[00:53:52] Speaker 2: OK, I see we have seven minutes, and you've thrown me harder and harder questions as we go along. No, I'm happy to answer; I just need to keep it clipped. You've also told me to try to keep my answers as general as possible, because not everyone is from public health. But let me try. Knowing what I know now, one of the things I would say is: look at your field carefully and ask, where are the institutions that are making a difference, whose work you would like to contribute to or emulate in some way? I'll give you an example. If you're in the European region and you're in health, you would do very well to look at how, not what, but how, the European Observatory on Health Systems and Policies works. For those who don't know the Observatory, you can look it up. It's a longstanding WHO partnership that produces evidence to support health policy in the European region. It bridges academics and policymakers. It brings together monitoring systems, analyzes trends, and communicates findings in ways that are directly useful for policy decisions. Look at how they set their agenda. Their work always starts with a problem or pressure that the European region is facing: today we're dealing with an aging population, with workforce shortages, with rising costs. Then they see what data sets are available to them. They use routine monitoring and comparative research, and they produce the Health Systems and Policy Monitor and the Health Systems in Transition publications to track reforms in real time and look at system responses within countries. I'm deliberately using an example that doesn't come from my own work, so it doesn't look like one is always waving one's own flag; the Observatory is a group that I admire a lot and that does a lot of work. So if you are a beginner and you don't know how to get started, there must be something like the European Observatory in your field. If you're a public health researcher, for example, you can see that they do comparative reviews across countries. Reach out and suggest that you're interested in producing one for your country if it doesn't exist. Or if you have an insight on some aspect of the health system that you would like to add to their body of work, then you have an established institution already there, and you can bring in your resources and your own research. You can also learn from their products. It's not about deep theory; it's about asking policy-relevant questions. They ask questions like: what options have other European countries tried? Every time I talk to a European ministry of health and we discuss something like electronic cigarettes or alcohol labeling, the questions are always: who else is doing it? What did it cost? What were the barriers? How did they overcome them? What enablers mattered? If you are able to answer questions the way they do in the Observatory, I think you will make an impact. Again, I don't know the other fields that might be represented in the audience here. But it's also beautiful that an institution like the Observatory gives early career researchers a possibility of making an impact, even without the senior title we were talking about earlier. You can contribute to curating or managing their evidence platforms. You can contribute to the writing of policy briefs based on research you are familiar with, and so on. I've given a rather extended example so that your audience feels there is an entry point accessible to them, and I'm sure that similar models exist in all fields of human endeavor.

[01:00:04] Speaker 1: Gauden, you've said something really quite powerful, and I want to make sure everybody has caught it, because this is a model that, if you're watching this stream and you're interested in influencing policy, you can apply today. I've been trying to synthesize some of the nuggets you've been throwing fast and furiously at us. Gauden is saying: in your space, find an effective institution that is moving the needle in policy. Don't just look at what they're doing; look at their workflow, their how, everything from the questions they're asking to the kinds of evidence they're producing. And beyond that, see if you can get plugged in. Many of these institutions are resource-strapped, even resource-deprived; just showing up with a laptop and the ability to produce evidence can be a huge contribution, and it will automatically start attuning you to be more effective in policy. I want to make this a more concrete example, Gauden, because I think this model really works across fields. Ananda came with an example here about the social media restriction for children in Australia, and asks: is social media restriction a violation of children's rights? If yes, then why does research advocate keeping restrictions on children's use of social media? Without getting into the weeds of Ananda's very important question: if I'm understanding you correctly and we apply your model, Ananda should look for the important organizations showing up at the UNCRC and at these G25 conversations, figure out their workflow, mimic it, and maybe even get plugged in with them. Have I got that right, Gauden? Is this what you would suggest to Ananda? Absolutely.

[01:01:54] Speaker 2: And on my other screen, I was just looking at Hacker News this morning. Hacker News, not cracker news; it's completely above board and ethical. It's a website that collects news items for the tech community. The Electronic Frontier Foundation, I think, is the one that has just opened a project to monitor age-based restrictions on the internet.

[01:02:44] Speaker 1: Fantastic. So this would be an equivalent of the Observatory example you were just giving, one that Ananda could follow up on. And this raises another point that went unsaid by Gauden: if you have a mentor who's already engaged in these networks, they are often a great way to get plugged in, or at least they can steer you to where that conversation is being had. Gauden, we'll follow up; don't worry about it too much now. And Ananda, we'll share that.

[01:03:09] Speaker 2: I actually have it on my other screen right now: the Electronic Frontier Foundation launches an Age Verification Hub as a resource against misguided laws. It's a press release from December 10th, and there is a link to their resource hub, an exact analog to the Observatory example I was giving you earlier in public health.

[01:03:38] Speaker 1: Fantastic. Listen, Gauden, I want to wrap up here, because we're right at an hour. Firstly, thank you for sharing your time and your insights with us. If people do want to reach out to you, we're happy to make your contact available; I know they can find you on LinkedIn as a Pythonista, the few, the proud. And I know we'll also have a redux session later with members of our research collective, our private mentorship community that helps you go from start to finish, from finding your topic all the way through to publishing in high-impact journals. We'll be excited to have you join us there. Ananda, I know you're part of that community, so we'll do that later. And Gauden, any final thoughts about researchers having a real impact on policy?

[01:04:36] Speaker 2: Be curious, maybe, and keep at it. Just do it. You didn't ask me for a tagline, but let's go back to the beginning: if you're constantly keeping yourself accountable by asking the so what question, you will become not just a creator or generator of evidence, but also an interpreter of evidence. And policymakers need skilled interpreters. You will always have a chance to influence policy if you build up that skill.

[01:05:28] Speaker 1: Just do it, Gauden says; just start, almost like what Martin Luther King said: take the first step, and overcome that initial friction. Thank you for joining us, and we look forward to seeing all of you next Friday at our usual livestream. If there's a topic you'd like to see, or you'd like to submit a question, follow the link in this QR code I'm posting; you can submit a video question and we will do a clinic. We had the special privilege of hosting Gauden today, but we answer all your questions, and that opportunity is available for you next Friday. If you are interested in joining one of our mentorship communities, and you want to publish faster and have an influence on policy, tapping the rich insights of those who have come before, check out the QR code I'm leaving on the screen; let's have a chat and see if you could be a good fit. Gauden, thank you very much. A pleasure as always to see you, and have a great weekend.

AI Insights
Summary
In a conversation hosted by Professor David Stuckler, Dr. Gauden Galea (long-time WHO director) explains why most research fails to influence policy and how researchers—including early-career scholars—can design work that is actually used by decision-makers. His central critique is that papers often fail to answer the policymaker’s core question: “So what—what do I do on Monday morning?” Policy-relevant research begins by clarifying purpose, implications, and how a researcher’s disciplinary background shapes actionable interpretations. Galea stresses that while not all research must be applied, even conceptual work (e.g., defining “small states”) gains power when connected to real-world consequences such as resource allocation or equity.

He outlines what kinds of evidence get traction: work that is timely (fits a political “Overton window”), synthesized (systematic reviews/meta-analyses that save policymakers time), and usable (includes cost, implementation feasibility, and expected time-to-impact). Evidence is more likely to be adopted when it fits existing policy instruments (taxes, regulations, standards) rather than only criticizing problems. The discussion acknowledges “policy-based evidence” dynamics—politicians may seek data to justify predetermined positions—so researchers must practice “context engineering”: understanding stakeholders, incentives, narratives, the decision calendar, and who holds power in a given process.

For early-career researchers, influence does not require rank but utility. Galea advises specializing in narrow, neglected policy pain points; reading policymakers’ published priorities; producing fast, clear policy briefs alongside journal articles; and building visibility through a steady body of work. He notes both opportunities and challenges of the current era: easier access to literature and AI tools can accelerate research, but also increase competition and risk of uncited absorption into AI training data—making it important to develop identifiable frameworks and concepts associated with one’s name. Finally, he encourages researchers to learn from effective policy-bridging institutions (e.g., the European Observatory on Health Systems and Policies) and plug into their workflows and networks. He closes by urging curiosity, persistence, and continual accountability to the “so what” question to become not just producers but skilled interpreters of evidence for policy.
Title
How Research Actually Influences Policy: The “So What” Test
Keywords
policy impact, evidence-informed policymaking, World Health Organization, public health research, so what question, Overton window, systematic reviews, policy briefs, implementation science, cost-effectiveness, context engineering, policy narrative, commercial determinants of health, early-career researchers, knowledge translation, timeliness, stakeholders, advocacy, evidence-based advocacy, AI in research
Key Takeaways
  • Start with “So what?”: define why the question matters and what action follows from the findings.
  • Policy impact requires timeliness; imperfect but timely evidence can matter more than perfect evidence delivered too late.
  • Synthesis (systematic reviews/meta-analyses) is highly valued because it saves policymakers time and feeds guideline development.
  • Include real-world usability: costs, feasibility, implementation barriers/enablers, and time-to-impact.
  • Frame evidence in terms of existing policy tools (taxes, regulations, standards) to make it actionable.
  • Expect “policy-based evidence”; design communication and engagement around context, incentives, and narrative.
  • Know the policy process calendar and decision-makers; engage before drafts are set and windows close.
  • Early-career researchers can influence policy by becoming useful experts in a narrow, neglected pain point.
  • Produce policy briefs and other fast, clear outputs alongside peer-reviewed articles.
  • Build a recognizable program of work and concepts/frameworks linked to your name to maintain visibility in an AI era.
  • Learn from and collaborate with effective bridging institutions (e.g., observatories, hubs) and adopt their workflows.
  • Evidence-based advocacy can attract backlash; rigor and clarity protect credibility under attack.
Sentiments
Positive: The tone is constructive and motivational, emphasizing practical steps for researchers to increase policy impact, while acknowledging real constraints like political incentives, vested interests, and misinformation.