
SISOB conceptual model



Presentation Transcript


  1. SISOB conceptual model Richard Walker May 30, 2011

  2. Overview • Goals • General approach • Entities • Operationalizing model • D2.2 Table of Contents

  3. Goals • Common vocabulary and approach • Homogeneous approach to case studies

  4. General approach • Inspired by computer science • Abstract classes for networks, actors and outcomes • Each class has attributes • Real actors in case studies are instances of abstract classes • Measurements give values to attributes (a minimal sketch follows)
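To make the class-based approach concrete, here is a minimal Python sketch. It is an illustration only, not part of the slides or of any SISOB tool: the class names, attributes, and the placeholder measurement are all assumptions. Abstract classes carry attributes, a case-study actor is an instance, and a measurement assigns a value to an attribute.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Actor:
    """Abstract class for actors; attributes are filled in by measurements."""
    name: str
    productivity: Optional[float] = None  # given a value by a measurement

@dataclass
class Network:
    """Abstract class for networks of actors."""
    actors: List[Actor] = field(default_factory=list)
    density: Optional[float] = None  # given a value by a measurement

# A real actor in a case study is an instance of the abstract class ...
author = Actor(name="A. Author")
net = Network(actors=[author])

# ... and a measurement gives a value to an attribute.
net.density = 0.0  # placeholder: a real measurement would compute this
```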

  5. Entities
  • Production: actors, networks, context
  • Distribution: actors, networks, context
  • Consumption: actors, networks, context
  • Outcomes: scientific, economic, social
  (One possible encoding of this taxonomy is sketched below.)
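The following sketch shows one way the entity taxonomy could be encoded; the structure mirrors the slide, but every type and field name is an assumption for illustration.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Stage:
    """Production, distribution and consumption each have the same three parts."""
    actors: List[str] = field(default_factory=list)
    networks: List[str] = field(default_factory=list)
    context: Dict[str, str] = field(default_factory=dict)

@dataclass
class Outcomes:
    scientific: Dict[str, float] = field(default_factory=dict)
    economic: Dict[str, float] = field(default_factory=dict)
    social: Dict[str, float] = field(default_factory=dict)

@dataclass
class ConceptualModel:
    production: Stage = field(default_factory=Stage)
    distribution: Stage = field(default_factory=Stage)
    consumption: Stage = field(default_factory=Stage)
    outcomes: Outcomes = field(default_factory=Outcomes)
```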

  6. Operationalizing the model
  • What are the questions I want to ask?
  • What are the operational entities relevant to my model?
    • Example: Who are the production actors? What are the production networks?
  • What are my data sources?
    • Do I already have access to the data I need?
    • Do I need crawling / data from other partners?
  • How do I characterize my entities using my data sources?
    • Example: What measurements do I use to characterize networks? (A sketch using common network measurements follows this list.)
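As a sketch of the "characterize networks" step, the snippet below computes a few common network indicators with the networkx library. The slides do not name a toolkit, so the choice of networkx, and the placeholder edges, are assumptions.

```python
import networkx as nx

# Placeholder edges standing in for, e.g., co-authorship links.
G = nx.Graph()
G.add_edges_from([("a1", "a2"), ("a2", "a3"), ("a1", "a3"), ("a3", "a4")])

# A few common network indicators (cf. Appendix A in the D2.2 outline).
measurements = {
    "density": nx.density(G),
    "average_clustering": nx.average_clustering(G),
    "degree_centrality": nx.degree_centrality(G),
    "betweenness_centrality": nx.betweenness_centrality(G),
}
for name, value in measurements.items():
    print(name, value)
```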

  7. D2.2 Table of Contents
  • 1. Objectives and structure of this document
  • 2. The impact of science on society (Frontiers with contributions from all partners)
  • 3. The SISOB conceptual model
    • 3.1 Goals of the model (Frontiers with input from all partners)
    • 3.2 Overview
    • 3.3 Entities in the model (Frontiers, UDE and ELTE for measurements and UM for tools, all partners)
  • 4. Operationalizing the model – Researcher Mobility (Unito)
    • 4.1 Background
    • 4.2 Goals and hypotheses of the case study
    • 4.3 The model – an overview
    • 4.4 Model entities
  • 5. Operationalizing the model – Knowledge Sharing (UDE)
    • 5.1 Background
    • 5.2 Goals and hypotheses of the case study
    • 5.3 The model – an overview
    • 5.4 Model entities
  • 6. Operationalizing the model – Literature Review (Frontiers)
    • 6.1 Background
    • 6.2 Goals and hypotheses of the case study
    • 6.3 The model – an overview
    • 6.4 Model entities
  • Appendix 1: Requirements on SISOB tools
    • Summary of required measurements
    • Summary of required measurement tools
  • References
  • Appendix A: Common Network Indicators

  8. Peer review – data requirements Richard Walker

  9. Overview • Scientific questions • Operationalization of conceptual model • Sample hypotheses • Data sources required

  10. Scientific questions
  • How does peer review affect the impact of science?
  • “Traditional” issues
    • Cognitive biases
    • Traditional cronyism
    • “Cognitive cronyism”
  • “Social” issues
    • How do relationships among reviewers affect the review process?
    • How do relationships among reviewers and authors affect the review process?
  • “New” issue
    • How do new models of review affect the review process?

  11. Operationalizing the conceptual model

  12. Sample hypotheses
  • Traditional
    • Papers with a woman as first author are more likely to be accepted if the review committee includes a woman (a minimal test sketch follows this list)
  • Social
    • Papers are more likely to be accepted if authors are “close” to reviewers in the author-reviewer network (“cognitive cronyism”)
  • Different techniques of reviewing
    • Open reviewing (Frontiers) is less affected by bias x than traditional reviewing
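One minimal way to test the first hypothesis, assuming per-paper records of acceptance outcome and committee composition, is a Fisher exact test on a 2x2 contingency table. The counts below are placeholders for illustration, not project data, and the slides do not prescribe this test.

```python
from scipy.stats import fisher_exact

# 2x2 contingency table: rows = committee includes a woman / does not,
# columns = accepted / rejected. All counts are placeholders.
table = [[30, 70],
         [20, 80]]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```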

  13. Data sets required (1/2)
  • Frontiers papers
    • Attributes of papers, authors, reviewers
    • Source: Frontiers
    • Use: reconstruct author and reviewer networks (a reconstruction sketch follows this list)
  • All papers by Frontiers authors and reviewers (last 10 years)
    • Source: crawling
    • Use: enhance author and reviewer networks
  • All citations of papers in the Frontiers data set
    • Source: crawling
    • Use: outcome measurement
  • Productivity of authors and reviewers
    • Measurement: number of papers, citations, outside references
    • Source: crawling
    • Use: outcome measurement
  • Non-academic citations of papers in the Frontiers data set
    • Source: crawling
    • Use: outcome measurement
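A sketch of reconstructing an author-reviewer network from paper records follows. The record format is an assumption, not the Frontiers schema, and the records themselves are placeholders.

```python
import itertools
import networkx as nx

# Placeholder records; the real schema would come from the Frontiers data set.
papers = [
    {"authors": ["A", "B"], "reviewers": ["R1", "R2"]},
    {"authors": ["B", "C"], "reviewers": ["R1"]},
]

G = nx.Graph()
for paper in papers:
    # Co-authorship edges among authors of the same paper
    for a1, a2 in itertools.combinations(paper["authors"], 2):
        G.add_edge(a1, a2, kind="coauthor")
    # Author-reviewer edges
    for author, reviewer in itertools.product(paper["authors"], paper["reviewers"]):
        G.add_edge(author, reviewer, kind="reviewed")

print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "edges")
```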

  14. Data sets required (2/2)
  • Conference papers (UMA)
    • Attributes of papers, authors, reviewers
    • Source: ??
    • Use: reconstruct author and reviewer networks
  • All papers by authors and reviewers (last 10 years)
    • Source: crawling
    • Use: enhance author and reviewer networks
  • All citations of papers in the data set
    • Source: crawling
    • Use: outcome measurement
  • Productivity of authors and reviewers
    • Measurement: number of papers, citations, outside references
    • Source: crawling
    • Use: outcome measurement
  • Non-academic citations of papers in the data set
    • Source: crawling
    • Use: outcome measurement

  15. Open issues
  • Availability of data sources
  • Network indicators
  • BIG ISSUE 1 – how do we make this useful for policy makers?
  • BIG ISSUE 2 – how do we look at this from a community perspective?
