Inconsistency Tolerance in SNePS

Stuart C. Shapiro

Department of Computer Science and Engineering,

and Center for Cognitive Science

University at Buffalo, The State University of New York

201 Bell Hall, Buffalo, NY 14260-2000

[email protected]

http://www.cse.buffalo.edu/~shapiro/

Acknowledgements
  • João Martins
  • Frances L. Johnson
  • Bharat Bhushan
  • The SNePS Research Group
  • NSF, Instituto Nacional de Investigação Cientifica, Rome Air Development Center, AFOSR, U.S. Army CECOM

Outline
  • Introduction
  • Some Rules of Inference
  • ~I and Belief Revision
  • Credibility Ordering and Automatic BR
  • Reasoning in Different Contexts
  • Default Reasoning by Preferential Ordering
  • Summary

SNePS
  • A logic- and network-based knowledge representation, reasoning, and acting system [Shapiro & Group ’02]

This talk will ignore network and acting aspects.

Logic
  • Based on R, the logic of relevant implication

[Anderson & Belnap ’75; Martins & Shapiro ’88; Shapiro ’92]

Supported wffs

P {set of hypotheses from which P has been derived}

  • Origin tag: hyp (hypothesis) or der (derived).
  • Origin set: the set of hypotheses from which P has been derived.

The origin set tracks relevance and the ATMS assumptions.
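
As a rough illustration only (this is not SNePS code; the class and helper names below are invented for this sketch), a supported wff can be modeled in Python as a formula paired with an origin tag and an origin set:

from dataclasses import dataclass

@dataclass(frozen=True)
class SupportedWff:
    formula: str            # e.g. "whale(Willy)"
    tag: str                # "hyp" for a hypothesis, "der" for a derived wff
    origin_set: frozenset   # the hypotheses from which the formula was derived

def hypothesis(formula):
    # A hypothesis is its own support: its origin set is the singleton {formula}.
    return SupportedWff(formula, "hyp", frozenset({formula}))

willy = hypothesis("free(Willy) and whale(Willy)")
print(willy.origin_set)     # frozenset({'free(Willy) and whale(Willy)'})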

Outline
  • Introduction
  • Some Rules of Inference
  • ~I and Belief Revision
  • Credibility Ordering and Automatic BR
  • Reasoning in Different Contexts
  • Default Reasoning by Preferential Ordering
  • Summary

Rules of Inference: Hypothesis

Hyp: P {}

: whale(Willy) and free(Willy).
wff3: free(Willy) and whale(Willy) {}

Rules of Inference: &E

&E: From A and B {}

infer A {} or B {}

wff3: free(Willy) and whale(Willy) {}

: free(Willy)?
wff2: free(Willy) {}

Rules of Inference: andorE

The os is the union of os's of parents

wff3: free(Willy) and whale(Willy) {}

wff6: all(x)(andor(0,1){manatee(x), dolphin(x), whale(x)}) {}

: dolphin(Willy)?
wff9: ~dolphin(Willy) {}

(andor(0,1) means that at most 1 of the listed propositions holds.)

Rules of Inference: =>E

The origin set is the union of os's of parents.

Since wff10: all(x)(whale(x) => mammal(x)) {}

and wff1: whale(Willy){}

I infer wff11: mammal(Willy) {}
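
A minimal sketch of this origin-set bookkeeping for elimination rules such as &E, andorE, and =>E, continuing the illustrative Python model above (SupportedWff and hypothesis are the invented helpers from that sketch, not SNePS functions): the conclusion's origin set is the union of the parents' origin sets.

def eliminate(conclusion, *parents):
    # &E, andorE, =>E: the conclusion inherits the union of the
    # origin sets of all parent wffs used in the inference step.
    os = frozenset().union(*(p.origin_set for p in parents))
    return SupportedWff(conclusion, "der", os)

rule = hypothesis("all(x)(whale(x) => mammal(x))")
fact = hypothesis("whale(Willy)")
conclusion = eliminate("mammal(Willy)", rule, fact)
# conclusion.origin_set now holds both hypotheses, mirroring wff11 above.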

Rules of Inference: =>I

The origin set is the difference of the os's of the parents.

wff12: all(x)(orca(x) => whale(x)) {}

: orca(Keiko) => mammal(Keiko)?

Let me assume that wff13: orca(Keiko) {}

Since wff12: all(x)(orca(x) => whale(x)) {}
and wff13: orca(Keiko) {}

I infer whale(Keiko) {}

Rules of Inference: =>I (cont’d)

The origin set is the difference of the os's of the parents.

Since wff10: all(x)(whale(x) => mammal(x)) {}
and wff16: whale(Keiko) {}
I infer mammal(Keiko) {}

Since wff14: mammal(Keiko) {} was derived assuming
wff13: orca(Keiko) {}
I infer
wff15: orca(Keiko) => mammal(Keiko) {}
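
Continuing the same illustrative sketch (invented Python helpers, not SNePS), =>I discharges the assumed antecedent, so "difference of the os's of the parents" amounts to removing the assumption from the consequent's origin set:

def introduce_implication(antecedent, consequent):
    # =>I: the implication's origin set is the consequent's origin set
    # with the assumed antecedent removed (the assumption is discharged).
    os = consequent.origin_set - antecedent.origin_set
    return SupportedWff(antecedent.formula + " => " + consequent.formula, "der", os)

assumed = hypothesis("orca(Keiko)")
# Suppose mammal(Keiko) was derived from the assumption plus two rules:
mammal_keiko = SupportedWff(
    "mammal(Keiko)", "der",
    assumed.origin_set | {"all(x)(orca(x) => whale(x))",
                          "all(x)(whale(x) => mammal(x))"})
impl = introduce_implication(assumed, mammal_keiko)
# impl.origin_set contains only the two rules, as in wff15 above.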

Outline
  • Introduction
  • Some Rules of Inference
  • ~I and Belief Revision
  • Credibility Ordering and Automatic BR
  • Reasoning in Different Contexts
  • Default Reasoning by Preferential Ordering
  • Summary

~I and Belief Revision
  • ~I is triggered when a contradiction is derived.
  • The proposition to be negated must be one of the hypotheses underlying the contradiction.
  • Its origin set is the rest of those hypotheses (sketched below).
  • SNeBR [Martins & Shapiro ’88] engages the user in choosing the culprit.
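
A hedged sketch of that bookkeeping, continuing the illustrative Python model above (not SNePS itself): the hypotheses underlying the contradiction, minus the chosen culprit, become the origin set of the culprit's negation.

def negation_introduction(p, not_p, culprit):
    # ~I: P and ~P together expose an inconsistent hypothesis set;
    # negating the chosen culprit is supported by the remaining hypotheses.
    inconsistent = p.origin_set | not_p.origin_set
    assert culprit in inconsistent, "the culprit must underlie the contradiction"
    return SupportedWff("~(" + culprit + ")", "der", inconsistent - {culprit})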

Adding Inconsistent Hypotheses

wff19: all(x)(whale(x) => fish(x)){}

wff20: all(x)(andor(0,1){mammal(x), fish(x)}) {}

wff21: all(x)(fish(x) <=> has(x,scales)) {}

Finding the Contradiction

: has(Willy, scales)?

Since wff19: all(x)(whale(x) => fish(x)) {}
and wff1: whale(Willy) {}
I infer fish(Willy) {}

Since wff21: all(x)(fish(x) <=> has(x,scales)) {}
and wff23: fish(Willy) {}
I infer has(Willy,scales) {}

Since wff20: all(x)(andor(0,1){mammal(x), fish(x)}) {}
and wff11: mammal(Willy) {}
I infer it is not the case that wff23: fish(Willy)

Manual Belief Revision

A contradiction was detected within context default-defaultct.
The contradiction involves the newly derived proposition:
wff24: ~fish(Willy) {}
and the previously existing proposition:
wff23: fish(Willy) {}
You have the following options:
1. [c]ontinue anyway, knowing that a contradiction is derivable;
2. [r]e-start the exact same run in a different context which is not inconsistent;
3. [d]rop the run altogether.
(please type c, r or d)
=><= r

BR Advice

In order to make the context consistent you must delete at least one hypothesis from the set listed below.

This set of hypotheses is known to be inconsistent:

1 : wff20: all(x)(andor(0,1){mammal(x),fish(x)}) {} (1 dependent proposition: (wff24))

2 : wff19: all(x)(whale(x) => fish(x)) {} (2 dependent propositions: (wff23 wff22))

3 : wff10: all(x)(whale(x) => mammal(x)) {} (3 dependent propositions: (wff24 wff15 wff11))

4 : wff3: free(Willy) and whale(Willy) {} (8 dependent propositions: (wff24 wff23 wff22 wff11 wff9 wff5 wff2 wff1))

User deletes #2: wff19.

Willy has no Scales

Since wff21: all(x)(fish(x) <=> has(x,scales)) {}
and it is not the case that wff23: fish(Willy) {}
I infer it is not the case that wff22: has(Willy,scales) {}

wff26: ~has(Willy,scales) {}

Final KB: hyps & positive ders

: list-asserted-wffs

wff3: free(Willy) and whale(Willy) {}

wff6: all(x)(andor(0,1){manatee(x),dolphin(x),whale(x)}) {}

wff10: all(x)(whale(x) => mammal(x)) {}

wff12: all(x)(orca(x) => whale(x)) {}

wff20: all(x)(andor(0,1){mammal(x),fish(x)}) {}

wff21: all(x)(fish(x) <=> has(x,scales)) {}

wff1: whale(Willy) {}

wff2: free(Willy) {}

wff11: mammal(Willy) {}

wff15: orca(Keiko) => mammal(Keiko) {}

Final KB: hyps & negative ders

: list-asserted-wffs

wff3: free(Willy) and whale(Willy) {}

wff6: all(x)(andor(0,1){manatee(x),dolphin(x),whale(x)}) {}

wff10: all(x)(whale(x) => mammal(x)) {}

wff12: all(x)(orca(x) => whale(x)) {}

wff20: all(x)(andor(0,1){mammal(x),fish(x)}) {}

wff21: all(x)(fish(x) <=> has(x,scales)) {}

wff9: ~dolphin(Willy) {}

wff24: ~fish(Willy) {}

wff25: ~(all(x)(whale(x) => fish(x))) {}

wff26: ~has(Willy,scales) {}

Summary
  • Logic is paraconsistent: P and ~P may both be held, each with its own origin set, without every other wff becoming derivable.

  • When a contradiction is explicitly found, the user is engaged in its resolution.

Outline
  • Introduction
  • Some Rules of Inference
  • ~I and Belief Revision
  • Credibility Ordering and Automatic BR
  • Reasoning in Different Contexts
  • Default Reasoning by Preferential Ordering
  • Summary

Credibility Ordering and Automatic Belief Revision*
  • Hypotheses may be given sources.
  • Sources may be given relative credibility.
  • Hypotheses inherit relative credibility from sources.
  • Hypotheses may be given relative credibility directly. (Not shown.)
  • SNeBR may use relative credibility to choose a culprit by itself. [Shapiro & Johnson ’00]

*Not yet in released version.
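
A rough sketch of one possible automatic policy, under the assumption that every hypothesis has been assigned a numeric credibility inherited from its source (all names here are invented for illustration and are not the SNeBR interface): pick the least credible member of the inconsistent set as the culprit.

def choose_culprit(inconsistent_hyps, credibility):
    # inconsistent_hyps: formulas known to be jointly inconsistent
    # credibility: maps each formula to a number; higher means more credible
    return min(inconsistent_hyps, key=lambda h: credibility.get(h, 0))

hyps = {"all(x)(whale(x) => fish(x))",       # from Melville
        "all(x)(whale(x) => mammal(x))",     # from Darwin
        "free(Willy) and whale(Willy)"}
cred = {"all(x)(whale(x) => fish(x))": 1,
        "all(x)(whale(x) => mammal(x))": 2,
        "free(Willy) and whale(Willy)": 2}
print(choose_culprit(hyps, cred))   # the Melville rule, as in the run below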

Contradictory Sources

wff1: all(x)(andor(0,1){mammal(x),fish(x)}) {}

wff2: all(x)(fish(x) <=> has(x,scales)) {}

wff3: all(x)(orca(x) => whale(x)) {}

: Source(Melville, all(x)(whale(x) => fish(x)).).
wff5: Source(Melville,all(x)(whale(x) => fish(x))) {}

: Source(Darwin, all(x)(whale(x) => mammal(x)).).
wff7: Source(Darwin,all(x)(whale(x) => mammal(x))) {}

: Sgreater(Darwin, Melville).
wff8: Sgreater(Darwin,Melville) {}

wff11: free(Willy) and whale(Willy) {}

Note: Source & Sgreater props are regular object-language props.

Finding the Contradiction

: has(Willy, scales)?
Since wff4: all(x)(whale(x) => fish(x)) {}
and wff9: whale(Willy) {}
I infer fish(Willy) {}

Since wff2: all(x)(fish(x) <=> has(x,scales)) {}
and wff14: fish(Willy) {}
I infer has(Willy,scales)

Since wff6: all(x)(whale(x) => mammal(x)) {}
and wff9: whale(Willy) {}
I infer mammal(Willy)

Since wff1: all(x)(andor(0,1){mammal(x),fish(x)}) {}
and wff15: mammal(Willy) {}
I infer it is not the case that wff14: fish(Willy) {}

Automatic BR

A contradiction was detected within context default-defaultct.

The contradiction involves the newly derived proposition: wff17: ~fish(Willy) {}

and the previously existing proposition: wff14: fish(Willy) {}

The least believed hypothesis: (wff4)

The most common hypothesis: (nil)

The hypothesis supporting the fewest wffs: (wff1)

I removed the following belief: wff4: all(x)(whale(x) => fish(x)) {}

I no longer believe the following 2 propositions:
wff14: fish(Willy) {}
wff13: has(Willy,scales) {}
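
What the last two lines report can be sketched as a simple retraction pass over the believed wffs, continuing the illustrative Python model (not the SNePS implementation): removing a hypothesis also removes every derived wff whose origin set contains it.

def retract(culprit, believed_wffs):
    # Dropping a hypothesis drops every wff whose origin set contains it;
    # wffs supported entirely by other hypotheses survive untouched.
    return [w for w in believed_wffs if culprit not in w.origin_set]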

Summary
  • User may select automatic BR.
  • Relative credibility is used.
  • User is informed of lost beliefs.

Outline
  • Introduction
  • Some Rules of Inference
  • ~I and Belief Revision
  • Credibility Ordering and Automatic BR
  • Reasoning in Different Contexts
  • Default Reasoning by Preferential Ordering
  • Summary

Reasoning in Different Contexts
  • A context is a set of hypotheses and all propositions derived from them.
  • Reasoning is performed within a context.
  • A conclusion is available in every context that is a superset of its origin set. [Martins & Shapiro ’83]
  • Contradictions across contexts are not noticed.
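
The availability condition can be written as a one-line test, continuing the illustrative Python model (assumed helper names, not SNePS): a conclusion is visible in a context exactly when the context's hypotheses include the conclusion's whole origin set.

def available_in(wff, context_hypotheses):
    # A conclusion holds in every context whose hypothesis set is a
    # superset of the conclusion's origin set.
    return wff.origin_set <= frozenset(context_hypotheses)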

Darwin Context

: set-context Darwin ()

: set-default-context Darwin

wff1: all(x)(andor(0,1){mammal(x),fish(x)}) {}

wff2: all(x)(fish(x) <=> has(x,scales)) {}

wff3: all(x)(orca(x) => whale(x)) {}

wff4: all(x)(whale(x) => mammal(x)) {}

wff7: free(Willy) and whale(Willy) {}

Melville Context

: describe-context
((assertions (wff8 wff7 wff4 wff3 wff2 wff1)) (restriction nil) (named (science)))

: set-context Melville (wff8 wff7 wff3 wff2 wff1)
((assertions (wff8 wff7 wff3 wff2 wff1)) (restriction nil) (named (melville)))

: set-default-context Melville
((assertions (wff8 wff7 wff3 wff2 wff1)) (restriction nil) (named (melville)))

: all(x)(whale(x) => fish(x)).
wff9: all(x)(whale(x) => fish(x)) {}

Melville: Willy has scales

: has(Willy, scales)?
Since wff9: all(x)(whale(x) => fish(x)) {}
and wff5: whale(Willy) {}
I infer fish(Willy) {}

Since wff2: all(x)(fish(x) <=> has(x,scales)) {}
and wff11: fish(Willy) {}
I infer has(Willy,scales) {}

wff10: has(Willy,scales) {}

Darwin: No scales

: set-default-context Darwin

: has(Willy, scales)?
Since wff4: all(x)(whale(x) => mammal(x)) {}
and wff5: whale(Willy) {}
I infer mammal(Willy)

Since wff1: all(x)(andor(0,1){mammal(x),fish(x)}) {}
and wff12: mammal(Willy) {}
I infer it is not the case that wff11: fish(Willy)

Since wff2: all(x)(fish(x) <=> has(x,scales)) {}
and it is not the case that wff11: fish(Willy) {}
I infer it is not the case that wff10: has(Willy,scales)

wff15: ~has(Willy,scales) {}

Summary
  • Contradictory information may be isolated in different contexts.
  • Reasoning is performed in a single context.
  • Results are available in other contexts.

Outline
  • Introduction
  • Some Rules of Inference
  • ~I and Belief Revision
  • Credibility Ordering and Automatic BR
  • Reasoning in Different Contexts
  • Default Reasoning by Preferential Ordering
  • Summary

Default Reasoning by Preferential Ordering
  • No special syntax for default rules.
  • If P and ~P are derived
    • but the argument for one is undercut by an argument for the other
    • don’t believe the undercut conclusion.
  • Unlike BR, believe the hypotheses, but not a conclusion.

[Grosof ’97, Bhushan ’03]

Preclusion Rules in SNePS*
  • P undercuts ~P if
    • Precludes(P, ~P) or
    • Every origin set of ~P has some hyp h such that there is some hyp q in an origin set of P such that Precludes(q, h).
  • Precludes(P, Q) is a proposition like any other.

*Not yet in released version.
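
The quantifier structure of the second bullet can be spelled out directly. This sketch assumes that precludes(q, h) looks up the asserted Precludes propositions and that each wff carries the collection of its origin sets, one per derivation (all names are invented for illustration, not SNePS):

def undercuts(p_origin_sets, not_p_origin_sets, precludes):
    # P undercuts ~P when every origin set of ~P contains some hypothesis h
    # that is precluded by some hypothesis q in some origin set of P.
    # (The direct case, Precludes(P, ~P), would be checked separately.)
    return all(
        any(precludes(q, h)
            for h in os_np
            for os_p in p_origin_sets
            for q in os_p)
        for os_np in not_p_origin_sets
    )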

Animal Modes of Mobility

wff1: all(x)(orca(x) => whale(x))

wff2: all(x)(whale(x) => mammal(x))

wff3: all(x)(deer(x) => mammal(x))

wff4: all(x)(tuna(x) => fish(x))

wff5: all(x)(canary(x) => bird(x))

wff6: all(x)(penguin(x) => bird(x))

wff7: all(x)(andor(0,1){swims(x),flies(x),runs(x)})

wff8: all(x)(mammal(x) => runs(x))

wff9: all(x)(fish(x) => swims(x))

wff10: all(x)(bird(x) => flies(x))

wff11: all(x)(whale(x) => swims(x))

wff12: all(x)(penguin(x) => swims(x))

Using Preclusion for Exceptions

wff13: Precludes(all(x)(whale(x) => swims(x)),

all(x)(mammal(x) => runs(x)))

wff14: Precludes(all(x)(penguin(x) => swims(x)),

all(x)(bird(x) => flies(x)))

wff15: orca(Willy)

wff16: tuna(Charlie)

wff17: deer(Bambi)

wff18: canary(Tweety)

wff19: penguin(Opus)

Who Swims? (Contradictory Conclusions)

: swims(?x)?

I infer swims(Opus)

I infer swims(Charlie)

I infer swims(Willy)

I infer flies(Tweety)

I infer it is not the case that swims(Tweety)

I infer flies(Opus)

I infer it is not the case that wff20: swims(Opus)

I infer runs(Willy)

I infer it is not the case that wff24: swims(Willy)

I infer runs(Bambi)

I infer it is not the case that swims(Bambi)

Using Preclusion to Arbitrate Contradictions (1)

Since wff13: Precludes(all(x)(whale(x) => swims(x)),

all(x)(mammal(x) => runs(x)))

and wff11: all(x)(whale(x) => swims(x)) {}

holds within the BS defined by context default-defaultct

Therefore wff34: ~swims(Willy)

containing in its support

wff8: all(x)(mammal(x) => runs(x))

is precluded by wff24: swims(Willy)

that contains in its support

wff11: all(x)(whale(x) => swims(x))

Using Preclusion to Arbitrate Contradictions (2)

Since wff14: Precludes(all(x)(penguin(x) => swims(x)),

all(x)(bird(x) => flies(x)))

and wff12: all(x)(penguin(x) => swims(x))

holds within the BS defined by context default-defaultct

Therefore wff31: ~swims(Opus)

containing in its support

wff10: all(x)(bird(x) => flies(x))

is precluded by wff20: swims(Opus)

that contains in its support

wff12: all(x)(penguin(x) => swims(x))

The Swimmers and Non-Swimmers

wff38: ~swims(Bambi) {}

wff28: ~swims(Tweety) {}

wff24: swims(Willy) {}

wff22: swims(Charlie) {}

wff20: swims(Opus) {}

Two-Level Preclusion

wff1: all(x)(robin(x) => bird(x))

wff2: all(x)(kiwi(x) => bird(x))

wff3: all(x)(bird(x) => flies(x))

wff4: all(x)(bird(x) => (~flies(x)))

wff5: all(x)(robin(x) => flies(x))

wff6: all(x)(kiwi(x) => (~flies(x)))

Example from Delgrande & Schaub ‘00

Preferences

wff7: Precludes(all(x)(robin(x) => flies(x)),

all(x)(bird(x) => (~flies(x))))

wff8: Precludes(all(x)(kiwi(x) => (~flies(x))),

all(x)(bird(x) => flies(x)))

wff12: (~location(New Zealand))

=> Precludes(all(x)(bird(x) => flies(x)),

all(x)(bird(x) => (~flies(x))))

wff14: location(New Zealand)

=> Precludes(all(x)(bird(x) => (~flies(x))),

all(x)(bird(x) => flies(x)))

wff10: ~location(New Zealand)

wff15: Precludes(location(New Zealand),

~location(New Zealand))

Who flies?

wff16: robin(Robin)

wff17: kiwi(Kenneth)

wff18: bird(Betty)

: flies(?x)?

Outside New Zealand

wff24: ~flies(Kenneth) {}

wff21: flies(Robin) {}

wff19: flies(Betty) {}

Inside New Zealand

: location("New Zealand").

wff9: location(New Zealand)

: flies(?x)?

wff24: ~flies(Kenneth) {}

wff21: flies(Robin) {}

wff20: ~flies(Betty) {}

Summary
  • Contradictions may be handled by DR instead of by BR.
  • Hypotheses retained; conclusion removed.
  • DR uses preferential ordering among contradictory conclusions or among supporting hypotheses.
  • Precludes forms an object-language proposition that may be reasoned with or reasoned about.

Outline
  • Introduction
  • Some Rules of Inference
  • ~I and Belief Revision
  • Credibility Ordering and Automatic BR
  • Reasoning in Different Contexts
  • Default Reasoning by Preferential Ordering
  • Summary

Summary: Inconsistency Tolerance in SNePS
  • Inconsistency across contexts is harmless.
  • Inconsistency about an unrelated topic is harmless.
  • Explicit contradiction may be resolved by user.
  • Explicit contradiction may be resolved by system using relative credibility of propositions or sources.
  • Explicit contradiction may be resolved by system using preferential ordering of conclusions or hypotheses.

For more information

http://www.cse.buffalo.edu/sneps/

References I

A. R. Anderson and N. D. Belnap, Jr. (1975) Entailment, Volume I (Princeton: Princeton University Press).

B. Bhushan (2003) Preferential Ordering of Beliefs for Default Reasoning, M.S. Thesis, Department of Computer Science and Engineering, State University of New York at Buffalo, Buffalo, NY.

J. P. Delgrande and T. Schaub (2000) The role of default logic in knowledge representation. In J. Minker, ed. Logic-Based Artificial Intelligence (Boston: Kluwer Academic Publishers) 107-126.

B. N. Grosof (1997) Courteous Logic Programs: Prioritized Conflict Handling for Rules, IBM Research Report RC 20836, revised.

References II

J. P. Martins and S. C. Shapiro (1983) Reasoning in multiple belief spaces, Proc. Eighth IJCAI (Los Altos, CA: Morgan Kaufmann) 370-373.

J. P. Martins and S. C. Shapiro (1988) A model for belief revision, Artificial Intelligence 35, 25-79.

S. C. Shapiro (1992) Relevance logic in computer science. In A. R. Anderson, N. D. Belnap, Jr., M. Dunn, et al., Entailment, Volume II (Princeton: Princeton University Press) 553-563.

S. C. Shapiro and The SNePS Implementation Group (2002) SNePS 2.6 User's Manual, Department of Computer Science and Engineering, University at Buffalo, The State University of New York, Buffalo, NY.

S. C. Shapiro and F. L. Johnson (2000) Automatic belief revision in SNePS. In C. Baral & M. Truszczyński, eds., Proc. 8th International Workshop on Non-Monotonic Reasoning.
