28 March 2008

Expert surveys, not numbers, show quality of research

British Journal of Political Science (cover)
Political scientists across countries agree on which journals in their profession are the best

Assessing the quality of research in social science is best done by expert surveys, and not by quantitative methods such as the ‘bibliometric approach’, a working paper by an international team of social scientists suggests.

The findings have significant implications for public and private grant-making bodies, which need accurate information about the quality of a researcher’s work before they make an award. Assessments of research quality also form the basis for academic appointments and policy-makers’ decisions.

The working paper ‘Comparative journal rankings: a survey report’ shows why there is scepticism about the merits of the ranking system adopted in the bibliometric approach, for example in the widely used ISI Web of Science database. The researchers argue that such databases can report the number of citations of an author or topic drawn from a pool of thousands of journals, but the quantity of citations does not always denote quality. Lead author Iain McLean, Professor of Politics at Oxford University, says this ‘noisy’ approach cannot be trusted: a research paper may be cited frequently simply because it is so bad that others keep wanting to rebut it.

In contrast, the authors found a high degree of consensus regarding the quality of scientific journals. In three separate studies, political scientists in three countries, the UK, the US and Canada, rated the quality of a selection of 92 journals.

In the surveys, seven of the US and UK top 10 journals were common to both lists. American and Canadian political scientists both ranked the American Political Science Review (APSR) top; the British sample ranked the British Journal of Political Science top, and APSR second.

Professor McLean concludes that expert surveys are a robust way of ranking journals, and can thereby provide scientists and researchers with a reliable measure of the quality of the authors and research papers that appear in them.

Professor McLean said: ‘These findings matter for the Research Assessment Exercise (RAE) that is in progress now, and for discussions between universities and governments about what form of assessment should replace the RAE in future. They show that the political science profession has a stable collective view on what the top journals in our subject are. We asked everybody who teaches politics at a PhD-awarding university in Britain, Canada and the US how they would rank the journals. From three countries and a wide range of intellectual approaches, they come up with very similar answers.’

The research was carried out in collaboration with American Professors Micheal Giles from Emory University and James Garand from Louisiana State University, and Professor André Blais from the Université de Montréal, Canada.

Professor Giles said: ‘People are justifiably sceptical about using computer-generated bibliometrics to judge the quality of academic output, particularly in a discipline like political science where citation practices may vary across sub-fields. In our view, expert judgments of journal rankings are robust and provide a more prudent approach to grading people, publications, or university departments.’

This research builds on previous surveys of academic political science journals that were initiated by Professor Giles in 1975. The working paper will be presented to members of the Political Studies Association at the PSA conference in Swansea on 2 April.