
How can research and artistic performance of university staff members be measured ICT-based?


Presentation Transcript


  1. How can research and artistic performance of university staff members be measured ICT-based? Harald Lothaller University of Music and Performing Arts Graz (Austria) CRIS 2010 - Current Research Information Systems, Aalborg University, Denmark, June 2nd - 5th 2010

  2. Background • Design process • Technical implementation • Accompanying measures • Conclusions (development following DIRKS) Overview

  3. University of Music and Performing Arts Graz • One out of 22 public universities in Austria • One out of 6 public universities of arts in Austria • One out of 4 public universities in the city of Graz Background

  4. University of Music and Performing Arts Graz • About 420 staff members in teaching, arts, and research • About 150 staff members in administration and services • Nearly 2300 students from 59 countries • Fields of studies: Church music, Composition and music theory, Conducting, Electrical engineering/Sound engineering, Instrumental studies (classical music), Jazz, Music education, Musicology, Performing art, Stage design, and Voice as well as Doctoral study programs and Non-degree programs Background

  5. Austrian Universities act 2002 • §13: …the Intellectual Capital Report (ICR) “shall, as a minimum, present in itemised form (1) the sphere of action, social goals and self-imposed objectives and strategies; (2) its intellectual capital, broken down into human, structural and relationship capital; (3) the performance processes set out in the performance agreement, including their outputs and impacts”. • § 14: “The universities shall develop their own quality management systems in order to assure quality and the attainment of their performance objectives.” Background

  6. Communication of universities • Middaugh, 2001, p.1: “Colleges and universities have done a horrible job of communicating to both internal and external groups what faculty do and how well they do it.” • Middaugh, 2001, p.54: …information about the outcomes of research or public service activities “are integral to a full understanding of what faculty do”. Background

  7. Staff members’ view • The main activities are teaching, research, and the advancement and appreciation of arts • Additional administrative duties are out of favor and often disregarded • The work, its quality, and its outcomes can hardly be measured, and this holds for quantitative approaches in particular (cf. Middaugh, 2001, p.80ff.) • Institutional rules are sometimes taken as conflicting with “[…] the freedom of scientific and artistic activities […]” (Austrian Universities act 2002, §2) Background

  8. Two main questions • How can the research and artistic performance of university staff members be measured? • How can the staff be made to participate in an online tool designed to measure their performance? Background

  9. Our approach “a structured and rigorous approach designed to ensure that records and information management is firmly based in the business needs of the organization” (National Archives of Australia, 2001a, p.5) • Preliminary investigation • Analysis of business activity • Identification of recordkeeping requirements • Assessment of existing systems • Identification of strategies for recordkeeping • Design of a recordkeeping system • Implementation of a recordkeeping system • Post-implementation review DIRKS methodology

  10. Preliminary investigation …it was essential to gain an overview of the role of the “organization, its structure, the business, regulatory and sociopolitical environments in which it operates, and major factors affecting its recordkeeping practices” (National Archives of Australia, 2001, A-4). DIRKS – step A

  11. Analysis of business activities The Austrian Universities act 2002 explicitly identifies eleven tasks that universities have to fulfill within their sphere of action: “1. advancement of sciences (research and teaching), and the advancement, appreciation and teaching of the arts […] 11. provision of public information on the performance of the tasks of the universities” (§3) DIRKS – step B

  12. Identification of recordkeeping requirements • Clear focus on the measurement of research and artistic performance, but not teaching or financial aspects… • Two main aims: • The new tool should represent the wide range of activities of the university members. • The new tool should fulfill all requirements from the Austrian Universities act 2002 with regard to quality management and the ICR in particular. DIRKS – step C

  13. Assessment of existing systems • Paper & pencil approach for some years • Did not fully comply with the new requirements from the ICR • Was often criticized by staff members as too narrow • Contained a categorization system that gave us a starting point for designing the new tool DIRKS – step D

  14. Identification of strategies for recordkeeping “The purpose of Step E is to determine the most appropriate policies, practices, standards, tools and other tactics that your organisation should adopt to remedy weaknesses identified in Step D and ensure that they meet recordkeeping requirements identified in Step C” (National Archives of Australia, 2001, E-4). DIRKS – step E

  15. Design of a recordkeeping system Four steps: Step 1: revised system • Step 2: external discussions • Step 3: internal discussions • Step 4: list of attributes (project owner: ICR and management requirements in mind) DIRKS – step F

  16. Design of a recordkeeping system At present, we use eight groups of categories: • Artistic activities (11 categories) • Research-related activities (6 categories) • Pedagogic activities (6 categories) • Publications (17 categories) • Projects (3 categories) • Functions and activities (16 categories) • Awards (5 categories) • Administration and service (8 categories) DIRKS – step F
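The slide only names the eight groups and their category counts. Purely as an illustrative sketch (not part of the original presentation), this grouping could be written down in code as follows; the group names and counts come from the slide above, everything else is an assumption.

```python
# Hypothetical representation of the eight category groups used in the
# tool. Group names and per-group category counts are taken from the
# slide; the individual category labels are not given in the source.
CATEGORY_GROUPS = {
    "Artistic activities": 11,
    "Research-related activities": 6,
    "Pedagogic activities": 6,
    "Publications": 17,
    "Projects": 3,
    "Functions and activities": 16,
    "Awards": 5,
    "Administration and service": 8,
}

# 8 groups with 72 categories in total under this scheme.
total = sum(CATEGORY_GROUPS.values())
print(f"{len(CATEGORY_GROUPS)} groups, {total} categories")
```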

  17. Implementation of a recordkeeping system • Campus management system: CAMPUSonline • Accessible from campus and elsewhere (http://online.kug.ac.at) • General information and internal area (with log-in) • New feature „Leistungen“ (“achievements”) in 2007 • Administration area for flexible configuration • Transfer of the categorization system into CAMPUSonline • Configuration on 4 levels DIRKS – step G

  18. Implementation of a recordkeeping system [Screenshots: configuration of the categorization system in CAMPUSonline on levels 2, 3, and 4] DIRKS – step G
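The configuration screens themselves are not reproduced here. Purely as a hypothetical illustration of what a four-level category hierarchy might look like as a data structure (the slides do not describe CAMPUSonline's actual configuration model), consider the following sketch with made-up labels:

```python
from dataclasses import dataclass, field

@dataclass
class CategoryNode:
    """One node in a hypothetical four-level category hierarchy."""
    name: str
    level: int
    children: list = field(default_factory=list)

    def add(self, name: str) -> "CategoryNode":
        """Attach and return a child node one level below this one."""
        child = CategoryNode(name, self.level + 1)
        self.children.append(child)
        return child

# Illustrative labels only; the real categories live in CAMPUSonline.
group = CategoryNode("Publications", level=1)     # level 1: group
category = group.add("Journal article")          # level 2: category
subtype = category.add("Peer-reviewed")          # level 3: sub-type
subtype.add("Year of publication")               # level 4: attribute/detail
```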

  19. Implementation of a recordkeeping system • Transfer into CAMPUSonline was done all at once • Subsequent testing phase: • Test takers from different scientific and artistic fields • Test takers with a wide range of computer-handling skills • Adaptations and reductions were necessary • with regard to complex data-entry options • with regard to the visibility of icons and the order of details • Testing several alternatives gradually • Theoretical importance versus practical situation! DIRKS – step G

  20. Implementation of a recordkeeping system • Stähler, 2002, p.290f.: Innovation does not necessarily succeed • Even if there is a benefit for the customers or users • Acquiring knowledge is linked to time (a restricted factor) • Acceptance of innovation is high with • High benefits (→ highlighted in communication) • Low time demands for acquiring knowledge (→ support) DIRKS – step G

  21. Implementation of a recordkeeping system • Roll-out to all staff members in January 2008 • Mainly three kinds of accompanying support: • Detailed manual of more than 60 pages • Handed out in trainings and available for download • Small-group trainings • Up to 8 persons • From the same department • Rather similar computer-handling skills • Personal assistance on demand by email or telephone DIRKS – step G

  22. Implementation of a recordkeeping system • Communication is of high importance! • Following the given structures • Users as multipliers in addition • OK, but could have been better… • Internal report after 6 months (and then every 6 months) • Highlighting two aspects in communication: • Requirements from the Universities act • Personal advantages (e.g., online self-presentation) DIRKS – step G

  23. Post-implementation review • Presentation of persons, departments, or the university to the respective community and to the public • Intellectual Capital Report • Requirements are met well • Better data in quality and quantity • Analyses are much easier • Internal reports every 6 months • Evaluations of single persons or departments • Facilitates self-reports • Presentations are more traceable for peer reviewers etc. • Provides quality information in addition to quantity DIRKS – step H
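The slides do not describe how the semi-annual internal reports are generated. As a rough sketch under assumed field names (not the tool's actual data model), such a report could aggregate entries per department and category group along these lines:

```python
from collections import Counter
from datetime import date

# Assumed shape of a performance entry; the real CAMPUSonline records
# and field names may differ.
entries = [
    {"department": "Jazz", "group": "Artistic activities", "date": date(2009, 3, 12)},
    {"department": "Musicology", "group": "Publications", "date": date(2009, 5, 2)},
]

def semi_annual_report(entries, year, first_half=True):
    """Count entries per (department, category group) for one half-year."""
    months = range(1, 7) if first_half else range(7, 13)
    return Counter(
        (e["department"], e["group"])
        for e in entries
        if e["date"].year == year and e["date"].month in months
    )

for (dept, group), n in semi_annual_report(entries, 2009).items():
    print(f"{dept:12s} {group:26s} {n}")
```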

  24. Post-implementation review • End of 2009: > 10,000 data entries • Participation rate: 2/3 of the academic staff • Enhancements are ongoing: • Adjustments due to user feedback • Expecting changes due to new governmental rules for the ICR • Quality assurance • Export function • FAQs via video/podcast DIRKS – step H
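An export function is listed among the planned enhancements, but the slides give no details. The following CSV export is therefore only a sketch under assumed column names, shown together with a check of the participation figure reported above:

```python
import csv

def export_entries(entries, path="performance_entries.csv"):
    """Write performance entries to a CSV file for further analysis.
    The column set is an assumption, not the tool's actual export format."""
    fieldnames = ["staff_member", "department", "group", "category", "year"]
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        for entry in entries:
            writer.writerow({key: entry.get(key, "") for key in fieldnames})

# Participation as reported on the slide: roughly 2/3 of about 420
# academic staff members had entered data by the end of 2009.
participants = 280          # illustrative figure consistent with 2/3 of 420
academic_staff = 420
print(f"Participation rate: {participants / academic_staff:.0%}")
```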

  25. Post-implementation review • Cooperation with the University for Art and Industrial Design Linz (Austria) • They use CAMPUSonline too • Adjusting our system to their specific needs (i.e., their scientific and artistic fields in particular) • Transferring the system • Adjusting or building the respective queries and analyses • Successfully completed in June 2009 DIRKS – step H

  26. Successful implementation • Following DIRKS • Comply with quality standards • Comply with practical requirements • Tool provides fundamental data for performance statistics Conclusions

  27. Lessons learnt To summarize our experiences, we have learned that • high effort for involving a wide range of persons pays off twice afterwards, • sometimes research and management interests in data collection have to be pared down in favor of usability in practice, and • extensive instructions, personal assistance, and highlighting beneficial features can make even skeptics or unskilled computer users participate in ICT-based performance recording. Conclusions

  28. Harald Lothaller Quality Management & Reporting University of Music and Performing Arts Graz (Austria) Email: harald.lothaller@kug.ac.at Homepage: www.kug.ac.at Contact
