Expert surveys are a reliable way of measuring the quality of political scientists’ work
27 March 2008
Research by an international team of social scientists suggests expert surveys are more reliable than bibliometric methods for assessing the quality of research in social science. The researchers describe the bibliometric approach, in which quality is largely ranked according to the frequency with which authors’ names and citations appear in political science journals, as a ‘noisy’ signal of quality. By contrast, the authors of the working paper ‘Comparative journal rankings: a survey report’ found that in three separate expert surveys of political scientists in three different countries, respondents showed a high degree of consensus in rating the quality of a selection of 92 journals. Lead author Iain McLean, Professor of Politics at Oxford University, concludes that the expert survey is a robust way of ranking journals, and can therefore give scientists and researchers a reliable measure of the quality of the authors and research papers that appear in them.
Professor Iain McLean said: ‘These findings matter for the Research Assessment Exercise (RAE) that is in progress now, and for discussions between universities and governments about what form of assessment should replace the RAE in future. They show that the political science profession has a stable collective view on what the top journals in our subject are. We asked everybody who teaches politics at a PhD awarding university in Britain, Canada and the US how they would rank the journals. From three countries and a wide range of intellectual approaches, they come up with very similar answers.’
The research was carried out in collaboration with American Professors Micheal Giles of Emory University and James Garand of Louisiana State University, and Professor André Blais of the Université de Montréal, Canada. The researchers compiled a separate list of political science ‘experts’ in each of the three countries. The UK list was compiled from a Political Studies Association (PSA) membership directory of 1,800 names, of whom 432 (24 per cent) accepted the researchers’ invitation to take part in the expert survey – a response rate the researchers describe as ‘good for an expert survey without material incentives’. In Canada, the experts were drawn from a list of the country’s PhD-granting departments; in the United States, they were members of the American Political Science Association (APSA) employed by PhD-granting institutions. The US list numbered 3,486 experts, of whom 1,134 (32.5 per cent) responded. The Canadian survey achieved 196 responses from a list of 607 – 32 per cent.
The survey, administered by the Public Policy Research Laboratory at Louisiana State University, asked respondents to rate the quality of selected political science journals, with the option of adding journals that did not appear in the survey. They were asked which journals they would choose if they were to submit a very strong paper, and which they would read regularly or otherwise rely on for the best research in their area. They were also asked to rate the journals according to the general quality of the articles published in them. The three surveys show a marked consistency in how the experts rated journal quality across all three countries: for instance, seven of the top ten journals on the US and UK lists are common to both. American and Canadian political scientists both ranked the American Political Science Review (APSR) top; the British sample ranked the British Journal of Political Science top, with APSR second.
The paper outlines why there is scepticism about the merits of the ranking system adopted in the bibliometric approach, for example in the widely used ISI Web of Science database. The researchers argue that such databases can report the number of citations of an author or topic drawn from a pool of thousands of journals, but the quantity of citations does not always denote quality. Professor McLean says the ‘noisy’ approach cannot be trusted: a research paper may be cited frequently because it is so bad that others repeatedly seek to rebut it, and authors with common surnames will always appear to be cited more often.
Micheal Giles, Goodrich C. White Professor of Political Science at Emory University, said: ‘People are justifiably sceptical about using computer-generated bibliometrics to judge the quality of academic output, particularly in a discipline like political science where citation practices may vary across sub-fields. In our view, expert judgments of journal rankings are robust and provide a more prudent approach to grading people, publications, or university departments.’
This research builds on previous surveys of academic political science journals, initiated by Professor Giles in 1975. The working paper will be presented to members of the Political Studies Association at the PSA conference in Swansea on 2 April. It has significant implications for public and private grant-making bodies, which need accurate information about the quality of a researcher’s work before making an award. Academic appointments and policy-makers’ decisions are also based on assessments of research quality.
For more information or to arrange an interview with Professor Iain McLean, please contact the University of Oxford Press Office on 01865 280534 or press.office@admin.ox.ac.uk
