
Report on NREDS at SIGCOMM 2003




Presentation Transcript


  1. Report on NREDS at SIGCOMM 2003 Karen Sollins ICNP 2003 November 7, 2003

  2. Details • NREDS = Network Research: Exploration of Dimensions and Scope • Date: August 25, 2003 • Location: SIGCOMM 2003, Karlsruhe, Germany • Funding: None - intentionally • Report will be in CCR • More details will be at http://www.acm.org/sigcomm

  3. Objective • Take up from Looking Over the Fence: A Neighbor’s View of Network Research • Start discussions about (for example) • the nature of, and choices about, what to do in research • how to value it • the impact of those valuations on the research topics people choose • Do this outside the agenda of any particular agency or other funding source • Include people from a broad range of kinds of “network researchers” • Begin the search for what to do beyond this

  4. Organization • Committee: Mark Allman (ICIR, prev. BBN Technologies), Balaji Prabhakar (Stanford), Stefan Savage (UCSD), Karen Sollins (MIT, chair) • Student scribes: Steve Bauer (MIT), Mayank Sharma (Stanford), Renata Teixeira (UCSD)

  5. Participation • Solicited position papers: accepted 6 (short), plus invited some of the submitters whose papers were not accepted • Supplemented with other invitations to expand the participation - tried to include people who might otherwise not attend SIGCOMM - partially successful • By invitation only - limited to about 30 people

  6. Structure of the discussions • Not presentation of position papers • Each session had a brief speaker and a briefer respondent (together less than 1/2 hr, leaving over an hour of discussion) • Four major sessions and a conclusion • Do we have a shared meaning of “network research”? • Where is the science in network research? • Where is the research beyond the current tipping point? • How do we value and evaluate research? How does/should our field evolve? • Where do we go from here?

  7. Meaning of “network research” • Clear disagreement about actual topics - everyone has their own favorites • Drivers for definition: • Effective impact on industry • Curiosity • Education • Definition of underlying axioms

  8. Science of “network research” • No science in “fitting” - curves, graph theory, or whatever - note there is no possibility of failure; one can always fit a curve to data • Little science used in protocol design: not sure of its value • TCP • BGP • The challenge is understanding complex systems • There is science in applying control, coding, and information theory - early stages, both in terms of protocol design and more architectural concerns (multilayer design) • Discussed the importance (role and impact) of measurement, data cleansing, and archiving data - repeatable studies, time studies, etc. • Idea of “pockets” of science in the field

  9. The Tipping Point • The point at which the economic choice not to change outweighs the economic choice to change - can be graphed with an inflection point • What does it mean to influence the inflection point? • Can we move ourselves to an alternative curve with a different inflection point? • Questions of in which dimensions, and when, innovation happens - e.g. process vs. product innovation, and other externalities that have impact • Exploration of a model of the evolution of the network • How do we evolve our model of evolution?

  10. Valuing and evaluating research • Depends on the nature of the organization: academic vs. non-profit lab vs. gov’t lab vs. industrial lab • Funding models - not only amounts, but also the question of being stuck on a treadmill of incremental projects in order to keep the flow going • Nature of the people (senior leadership and jr. PhDs, faculty as small entrepreneurs, etc.) • Kinds of support from the organization • Degree of impact of mission on research • Issues of recognition and motivation • Peer acceptance (conferences, publications, etc.) • Organizational acceptance (promotions) • Funding (how much, from whom)

  11. Random collection of further ideas • More workshops • Small is good for conversation • Large is good for more inclusion • Longer is good for working through issues rather than just raising them • Suggestion: multiple several-day workshops on the same topic in parallel • Identification of “fundamental” questions of underlying theory (a la mathematics) • Democratization of valuation by eliminating anonymous and perhaps limited reviewing - try signed reviewing open to anyone who wants to offer an opinion • Encourage both broader participation and more churn on program committees - do a certain amount of tracking across committees to spread the load more broadly • Make commitments to cross-disciplinary, high-risk, disruptive ideas
