
Trends in Research Assessment and Higher Education, and their impact on TS


Presentation Transcript


  1. Trends in Research Assessment and Higher Education, and their impact on TS Gyde Hansen Copenhagen Business School (CBS) 2008

  2. Criteria: Assessment of departments • International board memberships • Expert referee jobs • Invitations from institutions abroad • Organization of international conferences • Editorships of international journals • Reviewing international papers and abstracts • Having invited guests from other universities

  3. Criteria: World Class Research Environments (WCRE) • Publication: highly ranked journals/publishers • Enhancement of international cooperation and of CBS’ reputation • International competition • Recruitment of doctoral students • Cooperation with business partners • Attraction of external funding

  4. European Quality Assurance: mission statements/goals • Bologna Process: European cooperation on quality – comparable criteria and methodologies – mobility of students and teachers • EUA: European University Association: a coherent system of education and research at the European level, e.g. doctoral programmes • ENQA: European Association for Quality Assurance in Higher Education: cooperation on quality, sharing experience, methods and standards – improvement – effective systems – accreditation • EQAR: European Quality Assurance Register for Higher Education: publicly accessible register – quality – student mobility – transparency and trust

  5. Interrelated CHANGES influencing TS • Globalization – Internationalization – Market forces – Branding (Tradition?) • Cooperation – Competition: international ranking of departments, institutions and journals • Cooperation – Competition: institutional? TS research? • Management (in detail) – Control of HE (Academic freedom? Creativity?) • Quality assurance – Evaluation • External funding – Money (Knowledge? Bildung? Languages? etc.)

  6. Globalization – Internationalization – Ranking – Branding – Management – Control: four issues for TS 1. Cooperation – Competition at departments: social climate? 2. Cooperation – Competition internationally: doctoral schools, summer schools? 3. Ranking of universities: who assesses? how? what consequences? 4. Ranking of journals and publishing houses: dominance of English?

  7. 1. Cooperation – Competition: Departments Questions/Problems • Expected outcome of the evaluations? • What is the impact of the ranking of colleagues on the social climate at the department? • Division of labour? • Top researchers often work on applications or funded research projects – not much time left for teaching and administration

  8. 1.1 Expected outcome of the evaluations of departments (CBS 1999–2008) • Increased attention to research, dynamic environments • Understanding the necessity (and legitimacy) of discussing one’s own and colleagues’ research • Action plans, perhaps reorganization of departments • Quality development Did they get it?

  9. 1.2 Social climate at split departments? • Cooperation? • Discussing each other’s research? • Competition? Hierarchies? • Bitterness and isolation? • Lack of influence and lack of democracy?

  10. 2. International doctoral schools Questions/Problems • How do we prevent – in our small field – that all research / assessment / evaluation ends up being done in the same way? • It seems to be the same international trainers at the doctoral schools/summer schools • What will happen to variety?

  11. 3. Ranking of institutions and departments (see some of the criteria on slides 2 and 3 above) • In the future, part of the basic funding will be allocated according to “research quality” – Danish universities are expected to compete • The criteria are “performance goals” • Research indicators are taken mostly from bibliometrics / the ranking of publications

  12. 3.1 Quality indicators: their reliability, validity, comparability? Indicators and their weighting differ widely • Exchange students (number of international students) • Visiting professors (number of …) • Student–teacher ratio • at some universities this is irrelevant • Attracting external funding • at some places this is irrelevant • Graduates’ employment rate? • …

  13. 3.2 Ranking of universities with regard to TS Questions/Problems • What are high standards in our field? • What are the selection indicators? Paradigms? • How are the indicators weighted? • How are validity, reliability and objectivity guaranteed? • How can “political” interests be prevented?

  14. 4. Ranking of Journals/Publishers • The same criteria in all fields of research • 2 levels – in order not to make it too complicated • 20% of all journals or publications in the world are expected to be ranked at the highest level. (The average number of articles in about 19,000 journals will be counted.)

  15. 4.1 Criteria Few criteria: consensus among researchers in the field that the journal 1. is absolutely “leading” 2. publishes the most important articles (in the research field) 3. from researchers from different countries Supported by a Norwegian ranking list: dbh.nsd.uib.no/kanaler/ and by ISI data; Ulrich’s; Digital Article Database Service (DADS) … NB: Two September issues on this website by Daniel Gile

  16. 4.2 Ranking of journals and publishers, and the consequences Questions/Problems • What does ranking mean for TS? • What impact does the ranking have on “not top-level” journals? • Will it be only a few top researchers, reviewers or colleagues on the editorial boards who decide on the ranking? • What does this do to variety (again) – to innovation? Will we write what publishers/editors/reviewers like or are interested in?

  17. 4.3 Ranking of journals and publishers, and the consequences – money! Questions/Problems • “Citedness” – is that an indicator of quality? • see Gile’s September 2008 issues • What will happen to research written in languages other than English? • Not many French, Portuguese, German or Spanish papers are cited • What can be done in order not to forget what is written in languages other than English? • If bibliometrics counts so much towards the future budget of TS departments, this may be an important question

  18. Who assesses? How? What indicators? What paradigms? Will a few people do the league tables for TS? How will the – hopefully responsible – colleagues handle the gap between quantitative ranking and quality? • Effects on academic freedom, on science and on TS? • National research careers will depend on citations… • Much power goes to the top 20% of journals/publishers • Friends will cite and promote friends…? • Articles will be shorter…? • Not the papers’ content but the league tables will be in focus…? • Reading is not necessary – just a look at the ranking…?

  19. The usefulness of Quality Assurance – Evaluation – Ranking – Control? * Pigs don’t get fat from being weighed (“Grise bliver ikke fede af at blive vejet”) Does this also hold for research? For departments, institutions and journals? For TS?
