The H-Index: How One Metric is Ruining Academic Careers and Research
Explore the controversial H-index, its unintended consequences, and why it’s dividing the academic community. Is it time for a new evaluation system?
The Toxic Metric Ruining Academia [Researchers' Worst Nightmare]
Added on 09/03/2024

Speaker 1: What if I was to tell you there was one single number that would ruin and dictate people's academic careers? We've done it to ourselves. This is the story of a single metric of academic evaluation that has serious consequences for everyone going forward. Scary. In the good old days of academia, what did we do? We just sat around thinking about stuff, doing things, telling other people. And that was like the job, thinking. But the thing is, apparently we weren't happy with that. We just wanted something else. Academics need to measure things. And in 2005, we created the single worst invention I think a scientist has ever come up with. And that is the H-index. In 2005, this guy, Jorge E. Hirsch, came up with the H-index. It was an index for quantifying a scientist's publication productivity, and it became the basis of other metrics that stem from it. But it's so bad. It's so bad because it stopped academics from doing what they did best and turned research into a nasty game. The H-index is the largest number H such that H articles have at least H citations each. That can sound complicated, but let's look at my H-index. My H-index over here is 11. I don't think that's bad. I don't think it's particularly good. But what it means is that I have 11 articles with 11 citations or more. So if we look at this, I've got 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11. Oh, that's not more than 11. There we are. My H-index is 11. But there are so many issues with this. It's just caused, I think, science and research to become gamified, to become toxic. And I think it can all be traced back to this metric. Let's talk about the consequences of this metric and why it is so bad. Some of these may even surprise you. The research community is divided on the H-index. There's one side that's screaming from the rooftops that this is a terrible idea. And the other side are going, you know what, my H-index is quite large, so I'm not going to say anything because it's working for me at the moment.
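The definition above is mechanical to compute: sort a researcher's per-paper citation counts in descending order and find the largest h where the h-th paper still has at least h citations. A minimal sketch in Python (the citation lists are made-up illustrations, not real data):

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still clears the threshold
        else:
            break
    return h

# One paper cited billions of times still scores h = 1:
print(h_index([12_000_000_000]))  # -> 1
# Thirty papers with ~30 citations each score h = 30:
print(h_index([30] * 30))  # -> 30
```

This also makes the comparison later in the transcript concrete: the metric rewards volume, since a single spectacularly cited paper can never lift h above 1.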
And even the inventor himself, good old Jorge, he's saying that it was never intended to be used outside of physics, where he was an academic. And so here we are. This is from Nature Index, and it says: what's wrong with the H-index, according to its inventor? Severe unintended negative consequences. It was never intended to be used as a single number to dictate someone's career. It's full of issues. So here Jorge proposed it as an objective measure of scientific achievement, but he didn't think it would be used outside theoretical physics. I wasn't even sure whether to publish it or not. Jorge, you shouldn't have published it, mate. It's an absolute nightmare. I graduated in 2007, and to think that there were only two years before that where there was this kind of utopia of academia, and all of a sudden you ruined it. Should have kept your mouth shut, mate. He says that about half the scientific community loves the H-index and half hates it, and that the H-index of a scientist is a great predictor of whether the scientist belongs to the first or second group. Of course you're going to love it if it benefits you. And that's the first issue with this thing: the scientists it benefits aren't going to say anything. But then there are so many more issues, like these. It's so interesting to me that we do this pantomime when we are hiring for a certain academic position. What we should really do is get all of the candidates, line them up in order of H-index, and just start offering them the job on the way down: do you want this job? Do you want this job? All right, this guy wants it. Get him in there. Because that's essentially what we're doing when we start looking for academics to employ. It's all based on this H-index. Even promotion within a university, it's all on H-index. This is just a really cheap and horrible way that we've resorted to to get an overview of someone's entire career.
It's not what it was meant for, but it's what we're using it for. It's terrible. One of the worst things that you can say about the H-index is that it actually deters critical thinking and new ideas. Because what it does is it says: to be a successful scientist, you need to be cited a lot and you need to publish a lot. And what's the path of least resistance to that? Just go along with the crowd. Publish stuff that's popular. Publish stuff you know will get lots of cites. There's no point fighting for three years to get your really niche, out-there idea into a scientific paper if in the same time you can just agree with people and get published. So there's no doubt that this metric is really hampering new ideas entering a research field. I hate saying that because I sound like a bloody conspiracy theorist. I'm not. This is just the nature of the beast. Also, I feel like it limits what people are allowed to research. You want to research hot topics that will easily get published and easily get citations. So if something's trending out there in the scientific world and you want to be part of that success, you of course need to steer your research towards that successful hot topic. It's just not allowing for a good, diversified set of things to happen at once. Everyone's going to be going towards that hot topic. It happened to me during my PhD and postdoc. I was working on organic photovoltaic devices, but then perovskites came onto the scene. The little beacon of light saying, we're the best. The whole field shifted towards perovskites in a heartbeat, because they knew that's where the citations and the notoriety were to be had. It left a lot of researchers behind. People turned towards this new shiny research field, looked at it and said: yes, you will be the cause of my success. Citations are coming my way. Here's a quick issue for you.
It cannot compare academics between different research fields, because each research field has different expectations on numbers of citations and publications. So you can't compare someone in physics with someone in the humanities. It just doesn't work. What's the point of a metric that doesn't even compare across fields? This metric was meant to say: this person, with this many citations, is successful. But all it's really done is say that a successful person is someone who can game the system by producing as many papers as possible. Imagine a researcher that had one paper, but it was cited 12 billion times. Billion times. That person would have an h-index of one. Whereas someone who produced a load of rubbish in the same time and had 30 or so cites on each of 30 papers would have an h-index of 30. Which is the better scientist? Arguably the first one, but the metric doesn't pick that out. So if you can just publish loads and you're around for a long time, your h-index is inevitably going to be bigger. It doesn't say you're a better researcher. It just says: well done, you've managed to publish the most and you've stayed around without getting fired. Yay. Why do we use this? We are a community of critical thinkers. Why are we allowing our entire careers to be dictated by this one single metric that is obviously flawed? Down here: common sense should teach us to beware of simplistic and one-dimensional indicators. This wouldn't pass peer review. This wouldn't even pass an undergraduate seminar. People would jump on it like there's no tomorrow. And it was only recently invented, but apparently we're far enough into it now that people just go: oh, this is just the way it is. Leave it. It's rubbish. All right, if we're not going to go with the h-index, where should we take this? In fact, Spain says it wants to change how it evaluates scientists.
Now, this is a little bit different, because it's not the h-index but the impact factor of journals, which is flawed in its own way. But essentially it tells us there are different ways. We invented it. We can remove it and try something else. In Spain, they're worried that they've got a dictatorship of papers. Couldn't say it better myself. And under the new system that Spain's proposing, it's not just about number of papers and impact factor. It's about publications, patents, reports, studies, technical works, artistic works, exhibitions, archaeological excavations and the creation of bibliographic records. So essentially they're just expanding what success looks like, and I'm really interested to see where this goes. We definitely need to fix the system, but unfortunately we're still so locked in to this idea of having a metric that we're just proposing other metrics. I love this one. This was proposed by Adrian, and he says: p-index, a fair alternative to the h-index. And this is a popularity index; that's what it stands for. So the popularity index is based on the scientist's most cited papers, represented by the biggest number of non-repeating citing authors. Self-citations and duplicated citations of the same publication by the same authors are not taken into account, and the popularity indices are hard to manipulate or artificially increase because the citing authors are considered only once. All right, so it's dealt with one or two things. But one thing I love about this is down here: what do they call it? The pp index. Why? Why are we doing that? Like, I can just imagine a load of these blokes sat around being like: oh, how big's your pp? Oh, you've got a big pp? Do you want to see my pp? I'm working on making my pp bigger. Because obviously, in the male-dominated world of academia, you've got to compare sizes of your pp, haven't you? And if we're talking about other kinds of out-of-the-box metrics, there's also Altmetric.
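As an aside, the p-index counting rules just described (each citing author counted at most once per paper, self-citations dropped) can be sketched. The transcript doesn't give the final aggregation formula, so this hypothetical helper only computes the per-paper unique-citer counts that such an index would be built from; the function name and inputs are illustrative assumptions:

```python
def unique_citer_counts(scientist, papers):
    """For each paper, count distinct citing authors, excluding the
    scientist's own self-citations. Duplicate citations by the same
    author collapse naturally because we count via a set.
    `papers` is a list of lists of citing-author names per paper."""
    counts = []
    for citing_authors in papers:
        unique = set(citing_authors) - {scientist}  # dedupe + drop self-cites
        counts.append(len(unique))
    return sorted(counts, reverse=True)

# Repeat citations ("A" twice) and a self-citation don't inflate the counts:
print(unique_citer_counts("Hirsch", [["A", "A", "B", "Hirsch"], ["C"]]))  # -> [2, 1]
```

The point of the design is visible in the example: an author citing the same paper many times, or an author citing themselves, contributes nothing extra, which is what makes the proposed index harder to game.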
They've been around for a while now, and it talks about your influence tracked, explained, visualized. So it's essentially a single metric that has a look at how impactful you are outside of your academic bubble. What they're essentially looking for is evidence of research influence, and it looks out beyond the usual. It says it can be used to benchmark the influence of your research against your peers, helping you assess and manage your reputation globally. And it's down to this donut. Each one of these colors is a flavor of attention that people have got, and ultimately they look like this. So you can have a look at policy, news, blogs, Twitter, blah blah blah, and that's what you end up with. And I think this is an important step in getting away from traditional academic metrics, but it's not the full story. And to be honest with you, I do not know what is actually going to work in the long run, but I know that it is not the h-index. No matter how much we want to hold on to it, it is just being gamed. I'd love to know in the comments what you think about this, and where we can actually get an idea of someone's actual success as an academic that doesn't rely on a single metric. So there we have it. There's the worst thing I think scientists have ever, ever invented, and for some reason we've embraced it with open arms and we're not willing to let it go. It's mine. It's working for me, so I'm keeping it. And if you want to know more about this horrible state of academia, check out this video where I talk about the disgraceful state of academia. I'll see you over there.
