
Software Surveyor: Dynamically Deducing Componentware Configurations, Year 1 Plan

David Wells and Paul Pazandak, Object Services and Consulting, Inc. (OBJS).



  1. Software Surveyor: Dynamically Deducing Componentware Configurations, Year 1 Plan. David Wells, Paul Pazandak. Object Services and Consulting, Inc. (OBJS). The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Defense Advanced Research Projects Agency or the U.S. Government.

  2. Would you drive without information? Why use software without information? “This worked yesterday, but not today. Why?” “Do I have the right versions of everything?” “What direction am I going?” “What’s happening on each connection? When?” “I didn’t know there was construction here.” “Where are the hot spots?” “What will be affected if I reconfigure?” “I evolved the system. Did the changes work?” “I wish I’d known I was overheating.” “My map doesn’t show all the streets.” “Where is my data coming from / going to?” “Are there any unexpected components?” Adrift Without a Compass

  3. Software Surveyor Products An extensible gauge toolkit to transparently gather, aggregate, analyze, disseminate, and visualize changing configuration & usage information across multiple technologies and levels of abstraction. • ConfigMapper - monitors software installation, dynamic binding, and component invocation events to construct a time-varying graph of application configuration. [December 2000] • ConfigComparator - compares configuration graphs created by ConfigMapper to identify differences. [March 2001] • ConfigChecker - compares configuration graphs created by ConfigMapper against design specifications to determine how/whether required components & connections were reified. [September 2001] • ActivityMapper - augments configuration graphs with activity details on each connection. [December 2001] • ActivityAnalyzer - identifies trends, quiescent periods, etc. in the activity graph produced by ActivityMapper. [March 2002]
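The time-varying configuration graph that ConfigMapper builds can be sketched as a set of component-to-component edges annotated with the interval during which each binding was observed to be live. This is an illustrative sketch only; the class and method names (ConfigGraph, TimedEdge, snapshotAt) are assumptions for the example, not the actual tool's API.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a time-varying configuration graph: each edge (a binding
// or invocation path between components) carries the interval during
// which it was observed to be live.
public class ConfigGraph {
    static final long OPEN = Long.MAX_VALUE; // edge not yet unbound

    static class TimedEdge {
        final String from, to;
        final long boundAt;
        long unboundAt = OPEN;
        TimedEdge(String from, String to, long boundAt) {
            this.from = from; this.to = to; this.boundAt = boundAt;
        }
    }

    private final List<TimedEdge> edges = new ArrayList<>();

    // Record a dynamic-binding event reported by a probe.
    public TimedEdge bind(String from, String to, long t) {
        TimedEdge e = new TimedEdge(from, to, t);
        edges.add(e);
        return e;
    }

    // Snapshot: the edges live at time t.
    public List<TimedEdge> snapshotAt(long t) {
        List<TimedEdge> live = new ArrayList<>();
        for (TimedEdge e : edges)
            if (e.boundAt <= t && t < e.unboundAt) live.add(e);
        return live;
    }

    public static void main(String[] args) {
        ConfigGraph g = new ConfigGraph();
        TimedEdge e1 = g.bind("Client", "OrbStub", 10);
        g.bind("OrbStub", "ServerImpl", 12);
        e1.unboundAt = 20; // client rebinds elsewhere at t=20
        System.out.println(g.snapshotAt(15).size()); // prints 2
        System.out.println(g.snapshotAt(25).size()); // prints 1
    }
}
```

A graph stored this way lets ConfigComparator diff two points in time, and ActivityMapper attach per-connection activity to the same edges.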

  4. Ancillary Enabling Products Infrastructure tools developed to support Software Surveyor and shared with other DASADA projects. • JBCI - support tool for inserting probes and probe stubs into Java bytecode. Both GUI and programmatic interfaces are to be provided. [September 2000 - v0.1] • XML2Java - support tool for translating XML-encoded objects into Java classes. Intended as part of the common Event Dissemination Infrastructure. [November 2000] (the existing version needs to be upgraded to use XML Schema instead of DTDs)
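The XML-to-Java mapping XML2Java automates can be illustrated with the JDK's own DOM parser. The &lt;event&gt; format and the InvocationEvent class below are invented for this sketch; XML2Java generates such classes from a schema rather than hand-writing the mapping as done here.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Illustration of mapping one XML-encoded event into a plain Java
// object, using only the JDK's DOM parser.
public class XmlEventDemo {
    static class InvocationEvent {
        String source, target;
        long timestamp;
    }

    static InvocationEvent parse(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder()
            .parse(new ByteArrayInputStream(
                xml.getBytes(StandardCharsets.UTF_8)));
        Element root = doc.getDocumentElement();
        InvocationEvent ev = new InvocationEvent();
        ev.source = root.getAttribute("source");
        ev.target = root.getAttribute("target");
        ev.timestamp = Long.parseLong(root.getAttribute("time"));
        return ev;
    }

    public static void main(String[] args) throws Exception {
        InvocationEvent ev =
            parse("<event source=\"Client\" target=\"Server\" time=\"42\"/>");
        System.out.println(ev.source + " -> " + ev.target + " @ " + ev.timestamp);
    }
}
```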

  5. Testbed Products Applications or tools that will form part of collaborative demonstrations (see IntelliGauge TIE report). • Smart Data Channels - Web-based information distribution mechanism that uses HTTP, XML, and content & temporal algebras to move information from Web servers to clients. Typical of next-generation information distribution tools and compatible with the GeoWorlds demo application to be used in the IntelliGauge TIE. Exposes a number of dynamic binding issues related to automatic type creation, process migration, and configuration rearrangement (to optimize performance) that cannot be explored in GeoWorlds alone. [October 2000]

  6. Requirements for Use • All gauges and infrastructure tools are implemented in Java, so they will run in any environment that supports Java. • Because Y1 will include probes only for Java bytecode, only those portions of applications implemented in Java will be able to be profiled; additional probes based on Instrumented Connectors and/or CORBA interception will relax this restriction in Y2. • Gauges that intend to consume Software Surveyor outputs must be able to consume events specified in the DASADA Event Schema (XML-encoded) and configurations encoded in Acme.

  7. Consumers of Y1 Products • System Administrators will use ConfigMapper, ConfigComparator, and ConfigChecker to diagnose GeoWorlds installation and reconfiguration problems. • TBASSCO (USC/ISI) will use ConfigComparator to help end user Intelligence Analysts to diagnose the sources of suspicious query results. • TBASSCO (USC/ISI) will use ConfigComparator and ConfigChecker to help end user Intelligence Analysts identify inconsistencies in query construction. • BBN and Columbia/WPI will use ConfigMapper to determine where activity monitoring and QoS probes should be installed. • The Event Infrastructure might use XML2Java to map XML-encoded events to Java-encoded events. • All Gauge Developers might use JBCI to place probes and probe stubs into applications. • Columbia and possibly USC/ISI will use the Smart Data Channels application in their demo.

  8. What We Require • Probes for various technologies (we already have probes for Java and in the future can use Instrumented Connectors for DLL-packaged components). • Group agreement on event definitions in XML Schema. • FleXML & associated management tools for event stream encoding. • Common Event Distribution Infrastructure (see IntelliGauge TIE) with events encoded in XML Schema and event posets encoded in FleXML will be available. The current group plan is to use Siena.

  9. Measures of Success Software Surveyor probes, gauges, and infrastructure tools can be evaluated at several (increasingly meaningful) levels: • Software Quality • Probe & Gauge Coverage • Gauge Precision • Analysis Capability • Task-Specific Evaluation • Scenario-Based Evaluation

  10. Software Quality The quality of Software Surveyor probes, gauges, and ancillary tools can be evaluated through use by outside groups: • Supporting software will be externally used by Columbia, WPI, BBN, and USC/ISI. • Gauges will be demonstrated in the context of the GeoWorlds demo in May 2001. • Gauges will be applied to typical bugs reported on the GeoWorlds Bug Reporting List.

  11. Probe & Gauge Coverage Probes and gauges can be evaluated by how well they perform their intended task. • How completely and accurately can the gauges map an application’s changing configuration? • a function of the ability to place probes at component boundaries (which is in turn dependent on the ability to probe in various technologies, collect the required information at these points, and deal with security restrictions that might preclude detailed reporting). • in Y1, we will only capture information within the Java runtime; additional probing of DLLs and CORBA will be done in future years. • Given that a complete configuration graph may be impossible to construct, how well can the gauges identify and address uncertainty in the graph? • Is the level of completeness and accuracy that can be achieved for a configuration graph useful to an administrator or user?

  12. Gauge Precision The amount of detail that a gauge can provide is an important measure of its potential usefulness, since without knowing how and why a configuration choice was made, it is difficult to determine whether the choice is desirable or how to fix it. • Between components within processes (fine grain - narrow scope) & between processes (coarse grain - wider scope). • The process by which the connection was made • identity of the entities that created the connection (linker, HTTP, CORBA ORB, Trader, manual, ...) • arguments used in creating the connection • source for the arguments (function call, file, …) • how were “open point” arguments resolved? (i.e., to what values) • is the connection static or dynamic? • when was the connection made & modified? • Whether & how the connection has been used.
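The per-connection detail enumerated above amounts to a provenance record per edge of the configuration graph. A minimal sketch of such a record follows; the field names are assumptions for illustration, not the gauge's actual schema.

```java
// Sketch of the per-connection provenance a precise gauge would keep.
// Field names are illustrative only.
public class ConnectionRecord {
    public enum Kind { STATIC, DYNAMIC }

    public final String from, to;       // endpoints of the connection
    public final String createdBy;      // linker, HTTP, CORBA ORB, Trader, manual, ...
    public final String[] arguments;    // arguments used in creating the connection
    public final String argumentSource; // function call, file, ...
    public final Kind kind;             // static or dynamic binding
    public final long createdAt;        // when the connection was made
    public long useCount = 0;           // whether & how it has been used

    public ConnectionRecord(String from, String to, String createdBy,
                            String[] arguments, String argumentSource,
                            Kind kind, long createdAt) {
        this.from = from; this.to = to; this.createdBy = createdBy;
        this.arguments = arguments; this.argumentSource = argumentSource;
        this.kind = kind; this.createdAt = createdAt;
    }

    public void recordUse() { useCount++; }

    public static void main(String[] args) {
        ConnectionRecord r = new ConnectionRecord(
            "Client", "ServerImpl", "CORBA ORB",
            new String[]{"iiop://host:1050/Server"}, "file",
            Kind.DYNAMIC, 100);
        r.recordUse();
        System.out.println(r.createdBy + " bound " + r.from + " to "
            + r.to + ", used " + r.useCount + " time(s)");
    }
}
```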

  13. Analysis Capability Software Surveyor will provide analysis tools to compare configuration graphs and to match reified configurations to design specifications. • Is it possible to match graphs so that corresponding components fill the same roles in both graphs? I.e., can matching be done preserving component roles as well as graph topology? • Is the matching accurate? • Can matching be performed when portions of graphs are unknown? • How fast is the matching as a function of graph size? Is it fast enough to be useful?
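The kind of output such a comparison produces can be sketched with a set difference over connections, as below. The class name ConfigDiff and the "A->B" edge encoding are assumptions for this example; real role-preserving matching is much harder (a constrained graph-isomorphism problem), and this only shows the shape of a ConfigChecker-style answer.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import java.util.TreeSet;

// Sketch: given a designed configuration and a reified one, each as a
// set of "from->to" connections, report the designed connections that
// were never reified.
public class ConfigDiff {
    public static Set<String> missingFrom(Set<String> expected,
                                          Set<String> actual) {
        Set<String> missing = new TreeSet<>(expected); // sorted for stable output
        missing.removeAll(actual);
        return missing;
    }

    public static void main(String[] args) {
        Set<String> design = new HashSet<>(Arrays.asList(
            "Client->Cache", "Cache->Server", "Client->Logger"));
        Set<String> reified = new HashSet<>(Arrays.asList(
            "Client->Cache", "Cache->Server"));
        // ConfigChecker-style question: which designed connections
        // were never reified?
        System.out.println(missingFrom(design, reified)); // prints [Client->Logger]
    }
}
```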

  14. Task-Specific Evaluation Software Surveyor gauges can be evaluated based on how the information they provide facilitates certain specific software maintenance and debugging tasks: • Improved diagnostics & debugging for multi-technology distributed software. Goal = 75% reduction in time to identify configurations and activity patterns. • Increased ability to evolve distributed software. Goal = provide 75% of the detailed configuration & usage status information needed by evolution planners. • Low development & runtime overhead. Goal = automatic or GUI-enabled insertion & 1% runtime penalty. • Reduced component footprint. Goal = 10-90% reduction in size of component footprints by identifying unused libraries or portions thereof (applicable only when such excess footprint exists).

  15. Scenario-Based Evaluation Software Surveyor success will be measured by how well it, in combination with other DASADA gauges, can improve the lifecycle behavior of a complex, distributed application. The GeoWorlds intelligence-analysis application is already in use at PACOM, and improvements to its lifecycle behavior can be measured against historical data. Specifically: • How efficiently GeoWorlds can be installed in different environments and its services deployed. • How easily complex information management tasks can be scripted with assured semantic and syntactic interoperability. • How reliably the scripts can be executed while maintaining desired quality. • How dynamically the scripts can be evolved based on resource availability and requirement changes. • How efficiently new services can be added to GeoWorlds while maintaining compatibility.
