Testbed/System Integration Session Highlights


Presentation Transcript


1. Testbed/System Integration Session Highlights Marc Light and John Burger December 17th, 2001

2. Proposed Discussion Themes
   • User-oriented testbed issues
     • what analysts, what domains, what task, what data, etc.
   • User studies
     • what metrics, what UI, how often, etc.
   • System integration
     • what platforms, degree of integration, what glue, etc.

3. Actual Discussion Highlights
   • Institutional constraints on technology insertion
     • Frozen Infrastructure Problem
     • The Internet and other inter-agency nets may provide a workaround
   • User studies
     • Usability vs. usefulness
     • Getting user buy-in
     • User acceptance
   • System Integration Proper
     • End-to-end system integration
     • Component integration
     • Catalyst as glue

4. Points of Agreement: Institutional constraints on technology insertion
   • Analysts want to test systems on data they know
     • May not be able to index data directly
     • May need to interface to existing search engine (a sketch follows this slide)
   • Open sources are used by some analysts
     • There may be open sources analogous to closed ones…
   • Possible nets of interest: Intelink/JWICS, IC-TestNet
   • Internet-accessible site could provide first step for testbed involvement
     • E.g., password-protected www.aquaint.mitre.org
   • Everything should run out of a web browser
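The "interface to existing search engine" point can be made concrete with a small sketch. This is only an illustration under assumed names: the endpoint URL, query parameters, and JSON response shape are placeholders, not any real agency or AQUAINT interface.

```python
# Hypothetical adapter: forward an analyst's question to an existing,
# already-trusted search engine instead of re-indexing the data ourselves.
# EXISTING_ENGINE, the query parameters, and the JSON shape are placeholders
# for whatever the installed engine actually exposes.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

EXISTING_ENGINE = "https://search.example.org/query"  # placeholder URL


def search_existing_engine(question: str, max_hits: int = 20) -> list:
    """Forward the question to the in-place engine and return its hit list."""
    url = f"{EXISTING_ENGINE}?{urlencode({'q': question, 'n': max_hits})}"
    with urlopen(url) as resp:  # a real deployment would add auth and error handling
        return json.loads(resp.read().decode("utf-8")).get("hits", [])
```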

5. Points of Agreement: User studies
   • Important that compared systems have similar usability
     • Must be able to factor the interface out of user-study results
       • Except when the interface is the focus of evaluation
   • Users will over time develop ways of using the system
     • Need to factor out user learning effects
   • Most analysts are not early adopters
   • Some interest in Wizard of Oz experiments

6. Points of Agreement: System Integration Proper
   • Distributed system (client-server) is important
   • Response time issues (some systems/components are slow)
     • Email interface could alleviate this
   • Input to end-to-end systems will not be just the question (see the sketch after this slide)
     • There will be other input needed
   • There will be many end-to-end systems evaluated independently
     • Integrating multiple end-to-end systems will not be straightforward
   • Session context issues
   • Cautious optimism about Catalyst as integration glue
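To make the "not just the question" point concrete, below is a minimal sketch of what a browser-based client's request to an end-to-end QA server might carry: session context, the analyst's identity, source restrictions, and an email address to notify when a slow component finishes. The QARequest type and all field names are hypothetical and are not drawn from Catalyst or any actual system.

```python
# Hypothetical request payload for a client-server QA setup: the question
# travels together with session context, analyst identity, source limits,
# and an email fallback for slow components. All names here are assumptions.
from __future__ import annotations

import json
import uuid
from dataclasses import asdict, dataclass, field


@dataclass
class QARequest:
    question: str
    analyst_id: str                  # who is asking
    session_id: str                  # ties the question to earlier ones in the session
    prior_questions: list[str] = field(default_factory=list)
    allowed_sources: list[str] = field(default_factory=list)  # e.g., open sources only
    notify_email: str | None = None  # fallback when a component is too slow to wait for


def to_payload(request: QARequest) -> str:
    """Serialize the request as the JSON body the client would POST to the server."""
    body = asdict(request)
    body["request_id"] = str(uuid.uuid4())
    return json.dumps(body)


if __name__ == "__main__":
    print(to_payload(QARequest(
        question="Which organizations fund the program?",
        analyst_id="analyst-42",
        session_id="session-7",
        prior_questions=["Who runs the program?"],
        allowed_sources=["open-web"],
        notify_email="analyst42@example.org",
    )))
```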

7. Action Items
   • Stay on top of emerging intel community standards & constraints
     • E.g., JIVA
   • Stay on top of TREC-style evaluation developments, as the user testbed should cohere with them
   • Investigate nets and testing lab opportunities
     • Intelink/JWICS, IC-TestNet, SIPRNET, etc.
     • Secret Test Integration Center in DARPA building
     • Test Integration Center (unclassified)
   • Contractor development guidelines are needed
     • E.g., clients can't write to local disk
     • MITRE will collect these
   • Set up Testbed mailing list
   • Leverage user-study expertise in the community
     • E.g., Jean Scholtz
   • Initial release of public baseline QA system
   • Greater education about Catalyst
