Comparative Investigation of Collaboratories: Cross-Cutting Themes



Presentation Transcript


  1. Comparative Investigation of Collaboratories: Cross-Cutting Themes June 20, 2003 University of Michigan Ann Arbor

  2. Reminder: Where We’ve Been • UM group – 15 years of experience with distributed collaboration • SOC project • ~40 Collaboratories at a Glance (C@G) • 10 in-depth studies • Sept. 02: SPARC/UARC, CFAR, Bugscope, EMSL • June 03: NEESgrid, InterMed, GriPhyN, iVDGL, AfCS, BIRN • “The Literature” • Your input

  3. What We Learned Here • Review Cross-cutting Themes • Modify • Refine • Eliminate • Add • Framework for generalizations • What leads to success, failure? • Source of design prescriptions • How to do the next one?

  4. Cross-cutting themes: From Prior Work • Collaboration readiness • Collaboration vs. competition in science • Bottom-up vs. top-down origins • Technology readiness • Experience with collaboration tools • Infrastructure readiness • Both technical and social • Common ground • Extent of shared knowledge; critical in interdisciplinary work • Coupling of work • The interdependencies among individuals

  5. Collaboration Readiness • Can a collaboratory be mandated by an external agency (e.g., funding source)? • NEESgrid – collaboratory capability as a condition of funding • High risk – details in presentation & discussion • History of collaboration • High energy physics vs. earthquake engineering • Science driven • AfCS • BIRN

  6. Common Ground • NEESgrid • Differences in terminology between CS & EE communities • InterMed • Importance of establishing shared vocabulary • Boundary objects, pidgins • GriPhyN, iVDGL • Too much common ground? • Boundary objects as key concept [G. Bowker] • BIRN • Attention to metadata, ontology

  7. Cross-cutting Themes from SOC Analyses • What is success? • Detailed discussion in June 2001 workshop • What are the incentives for participation? • Survey study in progress • What kinds of collaboratories are there? • Taxonomy – presented later • How do collaboratories evolve? • Some ideas based on our taxonomy – presented later

  8. What is Success? • Use of the collaboratory tools • Software technology • Direct effects on the science • Science careers • Effects on learning, science education • Inspiration for other collaboratories • Learning about collaboratories in general • Effects on funding, public perception

  9. Measures of Success • GriPhyN, iVDGL • Persist beyond ITR funding • Spending less time on tools, more on science • BIRN • Cover story in Nature • Lots of publications • Multiple audiences • Beyond the scientists • Students, government, industry, general public • Collaboratory → NSF STC

  10. Incentives • AfCS • Alliance with Nature • BIRN • Guidance re publications • LHC • Shift in time scale of experiments • Implications for careers

  11. Evolution • Ecology of collaborations • Movement from limited to full collaboration • Data – wisdom hierarchy [G. Furnas] • Movement up and down over time and space • Relates to social vs. technical processes • Where did the field come from, where is it going? • Historical context as critical • Multi-tasking of individuals (G. Mark) • Time scale issues • AfCS – bioinformatics earlier? • BIRN – savings across successive BIRNs

  12. [Diagram: the data–wisdom hierarchy (Data → Information → Knowledge → Wisdom), running from the world through relationships to practice and expertise, with collaboratory types positioned along it: Shared Instruments, Community Data Systems, Distributed Research Centers]

  13. Cross-cutting Themes from SOC Analyses • Do collaborations have an ideal size? • Collaboratories allow for larger ones • How do they scale? • What are various organizational models for how to structure collaboratories? • How does the control and flow of resources affect collaboratory success? • The money flow; the relation to the sponsor(s) • How much flexibility should be designed in? • What kinds of early commitments? • How much flexibility will funders allow?

  14. Ideal size • ATLAS • Collaboration of 2000 • But very organized • Beyond ATLAS? • Manhattan • Apollo • How many working groups can be supported? • Organizational science as source of clues • What does technology enable? • How to scale, drawing on the literature on teams (G. Mark)

  15. Flexibility • Retrenchment, redefining of goals • G. Bowker – may be key to success • Funding models • AfCS – enough flexibility? (A. Prakash) • Adapting to new developments • InterMed – 1995 shift to focus on guidelines • AfCS – 2003 changing cells

  16. Cross-cutting Themes from SOC Analyses • How important are data issues in collaboratories? • Data seems to be a central component of all collaboratories • For what kind of work do you need real-time vs. asynchronous interactions? • How important is security? • What’s the mix of tailor-made vs. off-the-shelf tools?

  17. Data Issues • Metadata • Provenance • Persistence, archiving • Rationale for transformations • NEESgrid, GriPhyN, iVDGL, AfCS, BIRN • Details of size, usage – different software needs? • What level of processing? Different disciplines may vary [D. Sonnenwald] • Data sharing across jurisdictional boundaries – BIRN • IRB – data from humans • International
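To make the metadata and provenance bullets on the slide above concrete, here is a minimal sketch, in Python, of one way a shared dataset record could carry its provenance chain and the rationale for each transformation. The class and field names are hypothetical illustrations, not drawn from NEESgrid, GriPhyN, iVDGL, AfCS, or BIRN.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List

@dataclass
class TransformationStep:
    """One processing step applied to a dataset, with its rationale recorded."""
    tool: str               # software used, e.g. an analysis pipeline (hypothetical)
    parameters: Dict[str, str]  # parameters needed to reproduce the step
    rationale: str          # why the transformation was applied
    performed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class DatasetRecord:
    """A shared dataset with enough metadata to interpret, archive, and trace it."""
    identifier: str                      # persistent ID for archiving
    origin: str                          # instrument or site that produced the raw data
    processing_level: str                # e.g. "raw", "calibrated", "derived"
    provenance: List[TransformationStep] = field(default_factory=list)

    def derive(self, tool: str, parameters: Dict[str, str], rationale: str) -> "DatasetRecord":
        """Return a record for a derived product, extending the provenance chain."""
        step = TransformationStep(tool=tool, parameters=parameters, rationale=rationale)
        return DatasetRecord(
            identifier=f"{self.identifier}/derived",
            origin=self.origin,
            processing_level="derived",
            provenance=self.provenance + [step],
        )

# Example: a raw instrument dataset and one documented transformation.
raw = DatasetRecord(identifier="site-A/run-42", origin="shake-table site A",
                    processing_level="raw")
calibrated = raw.derive(tool="calibration-pipeline", parameters={"version": "1.2"},
                        rationale="remove known sensor drift before sharing")
print(len(calibrated.provenance))  # -> 1
```

The point of the sketch is only that each derived product keeps the full list of steps that produced it, so provenance, the rationale for transformations, and the processing level travel with the data when it crosses group or jurisdictional boundaries.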

  18. Cross-cutting Themes from SOC Analyses • How crucial are platform issues? • What is the emerging role of middleware? • What is the role of emerging infrastructure such as the Grid? • How does one move from early prototypes to production versions of collaboratories? • Why isn’t there more reuse of collaboratory tools? • To what extent are the issues specific to a science domain, and to what extent are they general?

  19. Moving to Production Versions • Tensions between CS and domain users • NEESgrid – “innovation vs. extrapolation” • GriPhyN & iVDGL • Moving beyond initial demo stages • Slow adoption • InterMed • Sustaining the investment • NEESgrid – NEES consortium infrastructure set up in advance • GriPhyN, iVDGL – seeking a sustaining support process • BIRN • Incentives • “build hardware” [J. Leigh] • Diffusion of Innovation literature

  20. Domain specificity • The unusual character of HEP • Long history – since Manhattan • Scale – LHC • Common knowledge, self-esteem, etc.

  21. New Issues • Human subjects issues • IRBs across jurisdictional boundaries • Need for new approach? • Management • NEESgrid – management lags implementation • InterMed – need for tight management • GriPhyN & iVDGL – hiring project managers • AfCS – charismatic management • BIRN – governance manual; adding steering committee • Vision • Whose vision? • “Acephalous” projects (G. Bowker) • Leadership issues – charisma

  22. New Issues • What kind of technology? • Specific applications vs. APIs • Generic collaboration tools vs. collaboration within specialized tools (S. Poltrock) • Economics of the Grid (M. Cohen) • Standards as a unifying process • Politics of standards setting • BIRN in a box • “If you build it, they will come” • Highly flawed model • NEESgrid • InterMed • GriPhyN, iVDGL • Tied to incentives • Expectation management

  23. New Issues • Intellectual property • Who negotiates? • What are the arrangements? • Evaluation • Who does it? • Within the project – formative • Outside the project – summative • What is it? • Cross sectional • Longitudinal • Over what time period? • Lag effects, long term indirect effects • Be sophisticated • “science” talk vs. “informal” talk (G. Bowker)

  24. Biggest issues – my candidates • What is success? • Evolution – ecology • Transition to production versions, sustaining the vision • Data issues • How to manage collaboratories?

  25. Some SOC Issues • Are we asking the right questions? • Are we doing the right kinds of analyses? • Measures • Control groups • Are our representations useful? • Resource diagrams • Mix of science and engineering
