
Lesson Distribution Gap

David W. Aha

Rosina Weber

Héctor Muñoz-Avila

Leonard A. Breslow

Kalyan Moy Gupta

Navy Center for Applied Research in Artificial Intelligence, Naval Research Laboratory

Booth # 214



Outline

  • Introduction

    • Contributions

    • Context:

      • lessons learned systems, process, organizations

  • Lesson distribution gap

  • How to bridge this gap? Monitored Distribution

  • Example

  • Evaluation, Results

  • Next Steps

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Contributions

  • Describe the lessons learned process

  • Identify a gap in lesson distribution

  • Propose Monitored Distribution

  • Test the hypothesis in an evaluation

    • Hypothesis: Monitored Distribution can improve plan quality

    • Evaluation tool: a plan evaluator

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Knowledge management context

  • Three types of KM initiatives

    • knowledge repositories

    • knowledge access and transfer

    • knowledge environment

      From Davenport & Prusak (1998), Working Knowledge

  • Types of knowledge repositories

    • industry oriented (alert systems, best practices)

    • organization oriented (lessons learned systems)

    • for example, … (see next slide)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Slide5 l.jpg

government

Construction Industry Inst.

Honeywell

GM

Hewllet Packard

Bechtel Jacobs Company

Lockheed Martin E. Sys, Inc

DynMcDermott Petroleum Co.

Xerox

IBM

BestBuy

Siemens

int’l

US

European Space Agency

Italian (Alenia)

French (CNES)

Japanese (NASDA)

United Nations

Air Force

Army

Coast Guard

Joint Forces

Marine Corps

Navy

int’l

US

Department of Energy: SELLS

NASA (Ames, Goddard)

Canadian Army Lessons Learned Centre

non-government

non-military

military


Lessons learned systems

Lessons learned systems are repositories of a knowledge artifact called lessons learned.

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Lessons learned definition…

…also called organizational lessons, lessons, or lessons identified

Definition:

“A lesson learned consists of knowledge or understanding gained by experience. The experience may be positive, as in a successful test or mission, or negative, as in a mishap or failure. A lesson must be significant in that it has a real or assumed impact on operations; valid in that it is factually and technically correct; and applicable in that it identifies a specific design, process, or decision that reduces or eliminates the potential for failures and mishaps, or reinforces a positive result.” (Secchi et al., 1999)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Lessons learned process

(Figure: the process depicted as a cycle of Retrieve, Reuse, Revise, and Retain.)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Lessons learned representation

  • indexing elements (case problem)

    • applicable task

    • preconditions

  • reuse elements (case solution)

    • lesson suggestion

    • rationale

(A data-structure sketch of this representation follows below.)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA
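To make the representation concrete, here is a minimal sketch of a lesson-learned case as a data structure. The field names follow the slide; the class itself and its name are illustrative assumptions, not part of the authors' system.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LessonLearned:
    """Illustrative lesson-learned case, mirroring the slide's structure."""
    # Indexing elements (the case "problem"): used to decide when the lesson applies.
    applicable_task: str                                     # task the lesson is relevant to
    preconditions: List[str] = field(default_factory=list)   # conditions that must hold
    # Reuse elements (the case "solution"): what is presented to the user.
    suggestion: str = ""                                     # the lesson's recommendation
    rationale: str = ""                                      # why it matters (origin, outcome)
```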





Lessons learned example

applicable task

Installing custom stereo speakers.

preconditions

The car is the Porsche Boxster.

lesson suggestion

Make sure you distinguish the wires leading to the speakers from the wires leading to the side airbag.

rationale

Someone cut the wrong wire because the two look alike, and the airbag went off with explosive force. This meant spending several thousand dollars to replace the airbag, in addition to being a potential safety hazard.

From the article “Learning from Mistakes” about Best Buy in Knowledge Management magazine, April 2001.

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA
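Using the illustrative LessonLearned sketch from above, the Best Buy lesson could be encoded roughly as follows (the field values paraphrase the slide):

```python
speaker_lesson = LessonLearned(
    applicable_task="Installing custom stereo speakers",
    preconditions=["The car is a Porsche Boxster"],
    suggestion=("Distinguish the wires leading to the speakers from the wires "
                "leading to the side airbag"),
    rationale=("The two sets of wires look alike; cutting the wrong one set off the "
               "airbag, costing several thousand dollars and creating a safety hazard"),
)
```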


Lessons learned process

(Figure: the process diagram, revisited.)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Lesson distribution methods

  • Broadcasting: bulletins, doctrine

  • Passive: standalone repository

  • Active casting: list servers, information gathering tools

(The figure arranges these methods along a pull vs. push axis.)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Problems with lesson distribution methods

  • Distribution is divorced from targeted organizational processes.

  • Users may not know of or be reminded about the repository, as they need to access a standalone tool to search for lessons.

  • Users may not be convinced of the potential utility of lessons.

  • Users may not have the time and skills to retrieve and interpret textual lessons.

  • Users may not be able to apply lessons successfully.

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Here is the gap

(Figure: the repository of lessons learned stands apart from the organization's members and their organizational processes.)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


How to bridge this gap?

(Figure: the same diagram, asking how to connect the repository of lessons learned to the organization's members and their organizational processes.)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA




Monitored distribution

(Figure: monitored distribution bridges the gap, linking the repository of lessons learned to the organization's members within their organizational processes.)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Monitored distribution

The lesson repository sits in the same context as the targeted processes.

(Figure: the repository of lessons learned embedded in the organizational processes used by the organization's members.)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Problems & Solutions

  • Problem: Distribution is divorced from targeted organizational processes.
    Solution: Lessons are distributed to users in the context of the organizational processes.

  • Problem: Users need to access a standalone tool to search for lessons.
    Solution: Users don't need to access a standalone tool.

Distributing in the same context alone does not suffice!


Problems & Solutions

  • Problem: Users may not have the time and skills to retrieve relevant lessons.
    Solution: No significant additional time or skills are required.

  • Problem: Users may not be convinced of the potential utility of lessons.
    Solution: Users can assess the potential utility of lessons easily.

  • Problem: Users may not be able to apply lessons successfully.
    Solution: Whenever possible, an 'apply' button allows the lesson to be automatically executed.

  • Problem: Intrusive methods may cause more problems than solutions.
    Solution: Distribution is tightly integrated with the targeted processes.


Monitored Distribution Characteristics

  • Distribution is tightly integrated with the targeted processes, so that lessons are distributed when and where they are needed (see the retrieval sketch after this list).

    • Represent lessons as cases (knowledge modeling).

    • Lessons are indexed by their applicability.

  • Additional benefits:

    • The case representation facilitates interpretation.

    • Users assess potential utility with the lesson rationale.

    • Whenever possible, an 'apply' button allows the lesson to be automatically executed.
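The retrieval sketch below is a rough illustration of the monitoring idea, not HICAP's actual API: it assumes the host tool exposes the user's current task and known facts, and it returns the lessons whose indexing elements match.

```python
def monitor_task(current_task, known_facts, case_base):
    """Return the lessons applicable to the task the user is currently refining.

    A lesson fires when its applicable task matches the current task and all of its
    preconditions hold among the known facts (plain string matching here; a real
    system would use richer similarity-based retrieval).
    """
    matches = []
    for lesson in case_base:
        if lesson.applicable_task.lower() != current_task.lower():
            continue
        if all(cond in known_facts for cond in lesson.preconditions):
            matches.append(lesson)
    return matches

# Hypothetical usage: call this whenever the user selects or expands a task, so that
# lessons are pushed in context instead of waiting in a standalone repository.
# for lesson in monitor_task(task_being_refined, facts, lessons):
#     notify(lesson.suggestion, lesson.rationale)   # optionally with an 'apply' action
```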


Noncombatant Evacuation Operations (NEO)

Military operations that evacuate noncombatants whose lives are in danger and bring them to a safe haven.

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


(Figure: a NEO scenario map showing the NEO site, an assembly point, an intermediate staging base, the campaign headquarters, and the safe haven.)



Example in HICAP

  • HICAP is a plan authoring tool suite

  • Users interact with HICAP by refining an HTN (hierarchical task network) through decompositions

  • http://www.aic.nrl.navy.mil/hicap

  • Muñoz-Avila et al., 1999

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA
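For readers unfamiliar with HTN refinement, the sketch below shows the general idea of decomposing tasks into subtasks; the method table and task names are invented for illustration and do not reflect HICAP's internals.

```python
# Illustrative HTN decomposition: each compound task maps to an ordered list of
# subtasks (a single method per task, for brevity).
METHODS = {
    "evacuate noncombatants": ["secure NEO site", "transport evacuees", "deliver to safe haven"],
    "transport evacuees": ["select transport mode", "move to intermediate staging base"],
}

def decompose(task):
    """Recursively refine a task; tasks without a method are primitive plan steps."""
    if task not in METHODS:
        return [task]
    steps = []
    for subtask in METHODS[task]:
        steps.extend(decompose(subtask))
    return steps

print(decompose("evacuate noncombatants"))
# ['secure NEO site', 'select transport mode', 'move to intermediate staging base',
#  'deliver to safe haven']
```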


(Figure: HICAP screenshot showing the NEO site and the safe haven.)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA



Selecting the Suggested Case…

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA



Expanding yields…

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


And the user is notified of a lesson

RATIONALE:

TYPE: advice

Clandestine SOF should not be used alone.

WHY: The enemy might be able to infer that SOF are involved, exposing them.

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA



After applying the lesson

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA



Evaluation

  • Hypothesis

    • Using lessons will improve plan quality

  • Methodology

    • Simulated HICAP users generated NEO plans with and without lessons

    • A plan evaluator simulated the execution of each plan and measured:

      • Plan total duration

      • Plan duration before medical assistance

      • Casualties: evacuees, friendly forces, enemies

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Plan evaluator

  • non-deterministic (100 plans, each simulated 10 times; see the sketch after this slide)

  • 30 variables, 12 of them random

    • e.g., weather, airports

  • plan length: 18 steps

    • e.g., transportation mode, supplies, team

  • size of the planning space: 3,000,000

  • 13 actual lessons

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA
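A minimal sketch of the non-deterministic evaluation protocol: each plan is simulated several times with randomized variables, and the reported metrics are averaged. The simulate_once function is a stand-in for the actual plan evaluator, with placeholder numbers.

```python
import random
from statistics import mean

def simulate_once(plan, rng):
    """Stand-in for the plan evaluator: execute one randomized run and return metrics."""
    # A real evaluator would step through the 18-step plan with random weather,
    # airport availability, etc.; the values here are placeholders.
    return {"total_duration_h": rng.uniform(20, 45),
            "evacuee_casualties": rng.randint(0, 20)}

def evaluate(plans, runs_per_plan=10, seed=0):
    """Average each metric over all plans and repeated runs (e.g., 100 plans x 10 runs)."""
    rng = random.Random(seed)
    runs = [simulate_once(p, rng) for p in plans for _ in range(runs_per_plan)]
    return {metric: mean(run[metric] for run in runs) for metric in runs[0]}
```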


Plan implementation

  • Plans in which evacuees are transported by land have an increased chance of being attacked by enemies.

  • When an attack happens, it increases the number of casualties among evacuees and friendly forces, in proportion to the number of evacuees (sketched below).

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA
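Read as pseudocode, the rule above amounts to a stochastic step like the one below inside each simulated run; the probabilities and proportionality constants are placeholders, since the slide states only the qualitative relationship.

```python
import random

def apply_transport_risk(transport_mode, num_evacuees, rng):
    """Illustrative casualty rule: land transport raises the chance of an enemy attack,
    and an attack causes casualties in proportion to the number of evacuees."""
    attack_probability = 0.30 if transport_mode == "land" else 0.05   # placeholder values
    if rng.random() < attack_probability:
        evacuee_casualties = round(0.10 * num_evacuees)    # proportional to evacuees
        friendly_casualties = round(0.02 * num_evacuees)
        return evacuee_casualties, friendly_casualties
    return 0, 0
```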


Results

*Values are averages over the simulation runs.

  • NEO plan total duration*: 39h50 without lessons vs. 32h48 with lessons (18% reduction)

  • Duration until medical assistance*: 29h37 without lessons vs. 24h13 with lessons (18% reduction)

  • Casualties among evacuees: 11.48 without lessons vs. 8.69 with lessons (24% reduction)

  • Casualties among friendly forces: 9.41 without lessons vs. 6.57 with lessons (30% reduction)

  • Casualties among enemies: 3.08 without lessons vs. 3.14 with lessons (-2% reduction)

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA
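The reduction column is consistent with computing (value without lessons minus value with lessons) divided by the value without lessons; that formula is an assumption, but it reproduces the reported percentages:

```python
def reduction(no_lessons, with_lessons):
    return 100 * (no_lessons - with_lessons) / no_lessons

print(round(reduction(39 + 50/60, 32 + 48/60)))   # 18  (total duration, hours)
print(round(reduction(29 + 37/60, 24 + 13/60)))   # 18  (duration until medical assistance)
print(round(reduction(11.48, 8.69)))              # 24  (casualties among evacuees)
print(round(reduction(9.41, 6.57)))               # 30  (casualties among friendly forces)
print(round(reduction(3.08, 3.14)))               # -2  (casualties among enemies)
```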



Next Steps

  • Collection tool

  • Verify methods, reasoning

  • Integration of informal groups' and users' individual features

  • Evaluation with human subjects (instead of simulated users in HICAP), letting them decide whether to apply lessons

  • Extend MD to other decision support systems and other knowledge artifacts

  • Investigate distributing experiential knowledge together with training knowledge

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA



David W. Aha

Navy Center for Applied Research in Artificial Intelligence, Naval Research Laboratory, Washington, DC

Rosina Weber

Department of Computer Science, University of Wyoming

Fall 2001 at Drexel University, PA

Héctor Muñoz-Avila

Department of Computer Science, University of Maryland

Fall 2001 at Lehigh University, PA

Leonard A. Breslow

Navy Center for Applied Research in Artificial Intelligence, Naval Research Laboratory,Washington, DC

Kalyan Moy Gupta

ITT Industries, AES Division, Alexandria, Virginia

Questions?




CBR Cycle and Knowledge Processes

(Figure: the CBR cycle of Aamodt & Plaza (1994), extended with a distribute step.)


Discussion

  • An intrusive method requires good precision

  • Knowledge representation is costly, and so are lives!

  • What is the worth of 35,000 unused lessons?

  • Knowledge representation can also support validation

  • Good news: collect lessons directly into a case representation.

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Plan evaluator: example lesson

  • Applicable task: Assign security element.

  • Conditions for applicability: There are hundreds or more evacuees, enough to justify a security effort.

  • Lesson suggestion: Recommend that EOD* personnel be utilized in the security element.

  • Rationale: Success. EOD Two, DET Ten personnel were employed in a force protection role and assisted USS Nassau security teams in identifying and investigating suspect items brought aboard by evacuees.

  • *Explosive Ordnance Disposal

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Clarification

  • How is monitored distribution (MD) different from Clippie?

    • In MD, the case base's task is assessing applicability

    • MD distributes experiential knowledge collected from users in roles similar to the potential reuser's

    • Clippie is activated by a single word

    • Clippie distributes general instructions/information

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA


Additional lesson

  • Conditions for applicability: There are representatives of different branches assigned to participate.

  • Lesson suggestion: Assign representatives of all forces to planning.

  • Rationale: Lack of representatives prevents good communication, causing delays and miscommunication.

Rosina Weber IJCAI01 8 Aug 2001 Seattle, WA

