
Journal rankings: Can’t live with ‘em, can’t live without ‘em!



  1. Journal rankings: Can’t live with ‘em, can’t live without ‘em! Professor John Mingers Kent Business School, January 2013 j.mingers@kent.ac.uk

  2. Journal rankings • Many different journal rankings exist, each with its own biases and prejudices • They are based on often arbitrary criteria, and can be compiled by peer review or behaviourally (e.g., impact factors) • The original Kent ranking was simply a statistical combination of other rankings - “Objectivity results from a combination of subjectivities” (Ackoff) • The ABS ranking was based on a UWE peer-review ranking that was developed for ABS • Since the 2008 RAE it has become the de facto standard, and yet it is hugely contentious
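The “combination of subjectivities” idea above can be sketched in a few lines: standardise each list’s grades to z-scores, then average per journal. This is only an illustration of the general approach; the list names, journal names, and grades below are invented, not the actual Kent data.

```python
# Hypothetical sketch of combining several subjective journal rankings into
# one score by standardising each list and averaging. All data invented.
from statistics import mean, stdev

rankings = {
    "list_A": {"J1": 4, "J2": 3, "J3": 2, "J4": 1},
    "list_B": {"J1": 3, "J2": 3, "J3": 1, "J4": 2},
    "list_C": {"J1": 4, "J2": 2, "J3": 2, "J4": 1},
}

def zscores(grades):
    # Standardise one list's grades so lists with different scales are comparable.
    mu, sd = mean(grades.values()), stdev(grades.values())
    return {j: (g - mu) / sd for j, g in grades.items()}

standardised = [zscores(g) for g in rankings.values()]
journals = set().union(*(s.keys() for s in standardised))

# Combined score: mean z-score across the lists that grade the journal.
combined = {j: mean(s[j] for s in standardised if j in s) for j in journals}

for j, score in sorted(combined.items(), key=lambda kv: -kv[1]):
    print(j, round(score, 2))
```

Because each list contributes mean-zero z-scores, no single list’s grading scale dominates the combined ordering.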

  3. Advantages of journal rankings • Problems of peer reviewing papers: • Simply too time-consuming • Disagreements between reviewers (cf. journal referees disagreeing) • Bias • Rankings make more open and transparent what would otherwise be very judgmental and open to bias • They provide a common currency against which to discuss and judge research quality • They provide clear guidance and targets for people to aim at • They provide a lot of information for DoRs, ECRs, libraries, etc.

  4. Problems of journal rankings (the ABS list in particular) • History and development:
• 2004: Version 1, Bristol Business School - “Not intended for general circulation”; based on journals submitted to RAE 2001 plus others; grades standardised to the 2008 RAE 1*-4* scale; decisions made by the editors. (In OR/MS there were only 19 journals, but 6 had the top grade; in IS/IT there were 25 journals, of which 7 had the top grade.)
• 2007: Version 1 of the ABS list; based on the Bristol one but with input from subject specialists and use of impact factors; many journals were downgraded. Who were the subject experts? How and why were they chosen? Why were disciplinary bodies, e.g. COPIOR, not included? (In OR there were 40 journals, 5 with the top score, but 3 of these were statistics journals, leaving only 2 American OR journals - Management Science and Operations Research. In IS there were 68 journals but only 4 top ones, all US. In both areas, all the UK/European ones had been demoted, leaving only US ones. The people on the panel were Chris Voss, an Ops Mgt person at LBS, and Bob O’Keefe, more an IS person, who had just returned from the US.)
• 2010: the current version. It has become highly contentious, especially in particular fields such as OR, IS, and Accounting and Finance.

  5. General coverage of management: numbers of journals in the RAE and the ABS list

  6. Submission statistics for the last three RAEs. Adapted from Geary et al (2004), Bence and Oppenheim (2004), RAE (2009a). Note: totals differ slightly between different sources; figures for 2008 are after data cleaning, as described later

  7. Number of publications by output type. Adapted from Geary et al (2004), Bence and Oppenheim (2004), RAE (2009a). Categories with zero entries have been suppressed

  8. Figure 1 Pareto curve for the number of entries per journal in the 2008 RAE

  9. Disciplinary coverage • There are 22 different subject areas in the list, which seems like a lot. It is also very ad hoc:
• “an eclectic mix of categories consisting of: academic disciplines, business functions, industries, sectors, issues or interests as well as more or less residual categories which includes many of the leading business and management journals” (Rowlinson, 2013)
• Much of the list is devoted to reference disciplines rather than to B&M and applied areas – Economics (16%), Psychology (5%), Social Science (7%) – nearly 30% in total. But: Gen Mgt (4%), HR (4%), Marketing (7%), OR (4%), Strategy (2%). (In OR, ABS had 35 journals, but the COPIOR list has 68 and is growing.)
• Unequal proportions of 4*: Psychology (42%), Gen Mgt (23%), Soc Sci (20%), Econ (13%), HR (11%), Marketing (9%), Fin (7%), Ops Mgt (3%), Ethics/Governance (0%), Mgt Ed (0%). (In OR there are 4 journals at 4*, so apparently 11%, but in fact 2 are statistics journals, so in reality 6%.)

  10. Specific disciplinary factors (e.g., why is Business History a 4*?). See Accounting Education, December 2011, for critiques from an Accounting perspective. (The two 4* OR journals are American and specifically exclude Soft OR, which is one of the major British strengths.) • Over-reliance on ISI impact factors – journals not in ISI are ignored or at least lowly graded

  11. 4. Problems of process • Lack of openness about how the list is created or updated • Lack of engagement with disciplinary communities (no members of COPIOR on the committee, despite our offers) • Few changes made despite protests; little attempt to address the criticisms (COPIOR overtures were ignored, so reluctantly we produced our own ranking; this too has been ignored) • Is it now seen as a money-making venture?

  12. Problems with a single dominant list • Journals outside the list are inevitably marginalised • With the REF, journals at 1*, and increasingly at 2*, are devalued • It’s very hard for new journals to get started • The quality levels given in the list tend to be taken as the quality levels of both the journal and then the papers within it – “It’s a 3* paper”, “Jane Bloggs is a 4* researcher” • Individual researchers are disciplined into channelling their work into ABS journals

  13. • It discourages cross-disciplinary or applied work • The particular focus of ABS appears to be on US journals – these tend to be highly theoretical, positivistic, and anti-pluralist. This leads to less practical and engaged work and more arcane theory • Ideas or work that push the boundaries will not get published and hence will not get done • Journal fetishism – “gimme that 4* ‘hit’” • Potentially serious effects on people’s careers and on sections of Schools – e.g., OR at Warwick, which was decimated

  14. Table 9: proportions of journals in particular ranks, comparing ABS with RAE grades. Note: we show the proportions as percentages for ease of comparison, but all chi-square tests were performed on the underlying frequencies

  15. Conclusions from Table 9 • Overall RAE grades were higher than overall ABS grades (cols 1, 4), but this was because of the selectivity of submissions • This can be seen by comparing the ABS submitted with the ABS not submitted (cols 2, 3) • Comparing those journals that are in common, the level of grading is very similar (cols 3, 6) • In the RAE, ABS journals were graded more highly than non-ABS journals (cols 5, 6) • 13% of non-ABS journals were graded 0*
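The methodological point in the Table 9 note – that chi-square tests must be run on the raw frequencies, not the percentage summaries shown for readability – can be illustrated with a small sketch. The counts below are invented for illustration, not the actual RAE/ABS data.

```python
# Hypothetical sketch: Pearson chi-square statistic computed from raw
# frequency counts (percentages carry no sample-size information and would
# give a meaningless statistic). All counts invented for illustration.

# Rows: two journal groups; columns: counts of journals at grades 1*-4*.
observed = [
    [40, 55, 30, 10],  # e.g. journals in the ABS list (hypothetical)
    [25, 20, 10,  2],  # e.g. journals not in the ABS list (hypothetical)
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand = sum(row_totals)

# Pearson chi-square: sum of (O - E)^2 / E over all cells, where
# E = row_total * col_total / grand_total under independence.
chi2 = sum(
    (observed[i][j] - row_totals[i] * col_totals[j] / grand) ** 2
    / (row_totals[i] * col_totals[j] / grand)
    for i in range(len(observed))
    for j in range(len(col_totals))
)

dof = (len(observed) - 1) * (len(col_totals) - 1)
print(f"chi2 = {chi2:.2f} on {dof} degrees of freedom")
```

Dividing every cell by the grand total (i.e., testing on proportions) would shrink the statistic toward zero, which is why the underlying frequencies are required.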

  16. Figure 3 Scattergram showing association between GPA and proportion of an institution’s submitted journals that are in ABS

  17. There are at least 3 possible explanations of this association:
• “RAE bias”: a higher % of ABS journals leads directly to better RAE grades
• “Better depts are more mainstream”: higher quality of department leads both to a higher % of ABS journals and to better RAE grades
• “Greater selectivity”: higher quality of department leads to greater selectivity, producing both a higher % of ABS journals and better RAE grades

  18. Problems with the whole RAE regime • Current measurement regimes are hugely distorting to research: • Narrow focus on types of outputs – i.e., “4*” English-language journal articles • Narrow focus on types of measurements • Narrow focus on types of impact • The RAE/REF has had a huge negative effect on the overall contribution of research in the UK – lack of innovation and of opening up new areas; lack of major projects (books); lack of engaged research trying to deal with the real problems of our society and environment • Concentration on peer review and rejection of bibliometrics (the current REF Panel) leads to maintenance of the status quo – “the Golden Triangle” • Should we stop now and develop a system that aims to evaluate quality in a variety of forms, in a variety of media, through a variety of measures, with the ultimate goal of answering significant questions?

  19. N. Adler and A.-W. Harzing, 2009, “When Knowledge Wins: Transcending the Sense and Nonsense of Academic Rankings”, Academy of Management Learning and Education, 8, 1, pp. 72-95
• S. Hussain, 2011, “Food for Thought on the ABS Academic Journal Quality Guide”, Accounting Education, 20, pp. 545-559
• J. Mingers and L. Leydesdorff, 2013, “Identifying Research Fields within Business and Management: A Journal Cross-Citation Analysis”, available from http://arxiv.org/abs/1212.6773
• J. Mingers and H. Willmott, 2012, “Taylorizing Business School Research: On the ‘One Best Way’ Performative Effects of Journal Ranking Lists”, Human Relations, DOI: 10.1177/0018726712467048
• J. Mingers, K. Watson and M. P. Scaparra, 2012, “Estimating Business and Management Journal Quality from the 2008 Research Assessment Exercise in the UK”, Information Processing and Management, 48, 6, pp. 1078-1093, http://dx.doi.org/10.1016/j.ipm.2012.01.008
• J. Mingers, F. Macri and D. Petrovici, 2011, “Using the h-index to Measure the Quality of Journals in the Field of Business and Management”, Information Processing and Management, 48, 2, pp. 234-241, http://dx.doi.org/10.1016/j.ipm.2011.03.009
• J. Mingers, 2009, “Measuring the Research Contribution of Management Academics using the Hirsch-Index”, Journal of the Operational Research Society, 60, 8, pp. 1143-1153, DOI: 10.1057/jors.2008.94
• J. Mingers and A.-W. Harzing, 2007, “Ranking Journals in Business and Management: A Statistical Analysis of the Harzing Database”, European Journal of Information Systems, 16, 4, pp. 303-316, http://www.palgrave-journals.com/ejis/journal/v16/n4/pdf/3000696a.pdf
• J. Mingers and Q. Burrell, 2006, “Modelling Citation Behavior in Management Science Journals”, Information Processing and Management, 42, 6, pp. 1451-1464
• H. Morris, C. Harvey, A. Kelly and M. Rowlinson, 2011, “Food for Thought? A Rejoinder on Peer Review and the RAE 2008 Evidence”, Accounting Education, 20, pp. 561-573
• D. Tourish, 2011, “Leading Questions: Journal Rankings, Academic Freedom, and Performativity: What is or Should be the Future of Leadership?”, Leadership, 7, pp. 367-381
• COPIOR journal list: http://www.copior.ac.uk/Journallist.aspx
