I'm here at the World Congress on Research Integrity in Hong Kong. Next to me is Miriam Urlings, who has investigated citation culture. Hello. Thanks for being here.

Hello. It's my pleasure.

Could you briefly introduce yourself?

Yes, of course. I'm Miriam Urlings, and I'm a researcher at Maastricht University. As you said just now, I carry out research and have just completed my PhD on citation bias in the biomedical sciences.

And why is research into citation bias so important?

That's a fair question. I think we already know a lot about publication bias, for example the problem that negative studies are hard to publish, which means we have an excess of positive publications. But I think we also need to be aware that knowledge is developed by means of citation, i.e. by referring to other people and to previous work. Far more positive studies are cited, and that also influences the agenda setting that drives research, and of course the general understanding of the discipline. That's why my research is important.

And what did you discover?

We investigated six different biomedical disciplines, and we saw that the degree of citation bias varies between them. On balance, it's true that positive studies are cited more often. We also discovered that the chances of being cited depend strongly on the journal impact factor, the author's authority and self-citation. In other words, selective citation most definitely does occur.

Okay. And do you have any tips for researchers, or for funding agencies? What is your advice for us?

Yes, one of our important findings is that in every field only a few studies are cited a lot, which means a large part of the literature is cited rarely or not at all. And of course that does not mean it isn't good research. So my tip is to focus less on citations and far more on the content of research, for example a study's design and size, and to look for systematic reviews or meta-analyses that provide a more comprehensive view of a field, instead of using citations as an indicator of quality.

Okay, thanks for your explanation. You've made it clear that we need to stop using journal impact factors and H-indices, and that we need to start paying more attention to the content of publications.