The Loose Speak Project
Presentation Transcript
The Loose Speak Project
James Fan
Knowledge-Based Systems Research Group
University of Texas at Austin
May 8, 2002


Outline

  • Loose speak overview

  • Two types of LS:

    • Complex nominal interpretation

    • Metonymy

  • Future work on LS


Loose Speak

  • Loose speak (LS): the phenomenon by which human listeners are able to correctly interpret a speaker's imprecise utterance.

  • Relevance:

    • LS is common in human communication, but rarely supported in human-computer interaction.

    • The lack of LS forces SMEs to talk to the KB in precise terms, which makes KA tedious and error-prone and contributes to the brittleness of the KB.

Here's a KA example without LS ...







Loose Speak Project

  • Task: given an imprecise expression, correctly form an interpretation consistent with the existing KB.

  • Goal:

    • To show that (semi-)automatic interpretation of imprecise terms is possible.

    • It can accelerate KA.

Here's how the previous example works with LS ...



KA With LS

After clicking on the “See CMAP” link …



Outline

  • Loose speak overview

  • Two types of LS:

    • Complex nominal interpretation

    • Metonymy

  • Future work on LS


Complex Nominal Interpretations

  • A complex nominal is an expression in which a head noun is preceded by a modifying element [Levi '78]; the semantic relation between the two is implicit.

    • Marble statue = statue made of marble

    • Animal cell = cell that is the basic structural unit of an animal

    • Metal detection = detecting metal.

  • Complex nominal interpretation task: given a complex nominal, return the semantic relation between the head noun and its modifying element.


Related Work

  • A set of rules that includes most of the semantic relations in complex nominals [Levi '78].

  • Hand coded rules [Leonard '84].

  • Statistically learned rules [Lauer '95].

  • Learned rules under the user's guidance [Barker '98].


Our Approach

  • Given a complex nominal made of two concepts H & M,

    • Search the KB up to a certain depth; return any relation between H and any of M's super/subclasses, or vice versa.

    • If no relation is returned, select from a set of templates based on domain/range match.
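A minimal sketch of this two-step strategy, using toy stand-ins: the taxonomy, relation table, and concept names below are illustrative, not the actual KM knowledge base, and the template fallback is stubbed out.

```python
# Toy stand-ins for the KM knowledge base (illustrative assumptions).
TAXONOMY = {"Animal": "Organism", "Cell": "Entity"}   # child -> superclass
RELATIONS = {("Cell", "Organism"): "is-basic-structural-unit-of"}

def superclasses(concept, max_depth=3):
    """Yield the concept and its superclasses up to max_depth."""
    depth = 0
    while concept is not None and depth <= max_depth:
        yield concept
        concept = TAXONOMY.get(concept)
        depth += 1

def interpret(head, modifier):
    """Step 1: search the KB for a relation between the head and any
    superclass of the modifier (subclasses omitted for brevity).
    Step 2 (the template fallback) is stubbed out here."""
    for m in superclasses(modifier):
        slot = RELATIONS.get((head, m))
        if slot is not None:
            return (head, slot, modifier)
    return None  # would select from the 32 templates by domain/range match

# "animal cell": H = Cell, M = Animal -> relation found via Organism
result = interpret("Cell", "Animal")
```

The super/subclass walk is what lets a relation stated for Organism apply to Animal.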

Let's see what the templates are ...


Templates

  • Templates: a set of 32 relations that includes most of the common semantic relations occurring in complex nominals.

  • Example:

    • (a H with (element-type ((a M))))

    • (a H with (is-part-of ((a M))))

    • ...

  • Zero, one, or multiple relations may be returned.
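The domain/range filtering behind template selection can be sketched as follows; the template list and class-membership table are hypothetical miniatures of the real 32-template set.

```python
# Hypothetical miniature of the 32-template set: each template is a
# relation plus the classes its head (H) and modifier (M) must belong to.
TEMPLATES = [
    ("element-type", "Aggregate", "Entity"),
    ("is-part-of", "Entity", "Entity"),
]

# Toy class memberships: a concept belongs to itself and its superclasses.
ISA = {
    "Energy": {"Energy", "Entity"},
    "Bond": {"Bond", "Entity"},
}

def select_templates(head, modifier):
    """Keep every template whose domain covers H and whose range covers M.
    Zero, one, or multiple templates may survive."""
    return [rel for rel, dom, rng in TEMPLATES
            if dom in ISA[head] and rng in ISA[modifier]]
```

For "bond energy" (H = Energy, M = Bond), the element-type template is rejected because Energy is not an Aggregate, mirroring the mismatch in Example 3.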

Let's step through a few examples ...



M = Animal & H = Cell

Example 1

  • Given a complex nominal made of two concepts H & M,

    • Search the KB up to a certain depth; return any relation between H and any of M's super/subclasses, or vice versa.

    • If no relation is returned, select from a set of templates based on domain/range match.

  • Do a breadth-first KB search; the following is found in the KB:

  • (every Cell has

    • (is-basic-structural-unit-of ((a Organism))))

  • Return:

  • (a Cell with

    • (is-basic-structural-unit-of ((a Animal))))



M = Cell & H = Locomotion

Example 2

  • Given a complex nominal made of two concepts H & M,

    • Search the KB up to a certain depth; return any relation between H and any of M's super/subclasses, or vice versa.

    • If no relation is returned, select from a set of templates based on domain/range match.

  • Do a breadth-first KB search; the following is found in the KB:

  • (every Locomotion has

    • (object ((a Tangible-Entity))))

  • Return:

  • (a Locomotion with

    • (object ((a Cell))))



M = Bond & H = Energy.

Example 3

  • Given a complex nominal made of two concepts H & M,

    • Search the KB up to a certain depth; return any relation between H and any of M's super/subclasses, or vice versa.

    • If no relation is returned, select from a set of templates based on domain/range match.

  • Do a breadth-first KB search; nothing is found in the KB.

  • Select from the templates:

  • (a Create with (raw-material ((a Bond))) (result ((a Energy)))) -- match.

  • (a Create with (result ((a Bond))) (agent ((a Energy)))) -- match.

  • (a Energy with (element-type ((a Bond)))) -- mismatch.

    ... ...


Performance Measurements

  • Precision = C / A, where C = number of instances in which a correct answer is returned, A = number of instances in which an answer is returned.

  • Recall = C / T, where C = number of instances in which a correct answer is returned, T = the total number of test instances.

  • Avg. ans. length = L / A, where L = total lengths of all the answers returned, A = number of instances in which an answer is returned.
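The three measurements can be sketched directly from these definitions; the input encoding (one answered/correct flag pair per test instance) is an assumption for illustration.

```python
def measurements(results, answer_lengths):
    """results: one (answered, correct) pair per test instance.
    answer_lengths: length of each answer that was actually returned."""
    T = len(results)                                           # total instances
    A = sum(1 for answered, _ in results if answered)          # answered
    C = sum(1 for answered, ok in results if answered and ok)  # answered correctly
    L = sum(answer_lengths)                                    # total answer length
    return {"precision": C / A, "recall": C / T, "avg_ans_len": L / A}

# 4 instances: 3 answered (lengths 1, 9, 2), 2 of them correctly.
stats = measurements([(True, True), (True, False), (False, False), (True, True)],
                     [1, 9, 2])
```

Note that precision divides by A (answers given) while recall divides by T (all instances), so leaving an instance unanswered hurts recall but not precision.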


Evaluation

  • Tested on 2 sets of data from Alberts [Alberts et al.] with a total of 184 test examples.

    • Our approach has precision and recall values similar to those of the templates method.

    • Our approach has much shorter average answer length.

    • The distribution of answer lengths is bimodal: 65% of answers have 1 or 2 choices; 19% have 9 or 10 choices.


Evaluation (Continued)

  • Our approach is compared to a templates-based method because the templates resemble the hand-coded rules approach.

  • Mistakes from data set 2 are caused by

    • invalid data entries (e.g. phosphate residue -> phosphate substance translation)

    • incomplete KB (e.g. topic slot missing from KB).


Future Work for Complex Nominal Interpretation

  • Gather more data for further evaluation.

  • Integrate the KB search with the templates.


KB Search And Templates Integration

  • KB search is bounded by a certain depth.

  • The selections from the templates can direct deeper searches.

  • Example:

    • Cell Ribosome.

    • KB search: found nothing.

    • Templates:

      • (a Ribosome with (is-part-of ((a Cell))))

      • (a Ribosome with (material ((a Cell))))

      • … …

    • Deeper search reveals:

      • (a Ribosome with (element-type-of ((a Aggregate with (is-part-of ((a Cytoplasm with (is-part-of ((a Cell))))))))))
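The proposed integration amounts to a three-step control flow: bounded search, template matching, then a deeper search directed by the slots the surviving templates name. A sketch, where all function names, depths, and stand-in data are illustrative assumptions, not the project's actual interfaces:

```python
def interpret_integrated(head, modifier, search, select_templates):
    """search(h, m, depth, slots=None) -> relation or None;
    select_templates(h, m) -> candidate slot names from template matching."""
    rel = search(head, modifier, depth=3)                # 1. bounded KB search
    if rel is not None:
        return rel
    slots = select_templates(head, modifier)             # 2. templates suggest slots
    return search(head, modifier, depth=6, slots=slots)  # 3. deeper, directed search

# Toy stand-ins for the Cell/Ribosome example: the shallow search fails,
# the "is-part-of" template survives, and the deeper search succeeds.
def toy_search(h, m, depth, slots=None):
    if depth >= 5 and slots and "is-part-of" in slots:
        return (h, "is-part-of", m)   # Ribosome is-part-of ... Cell (abbreviated)
    return None

result = interpret_integrated("Ribosome", "Cell", toy_search,
                              lambda h, m: ["is-part-of", "material"])
```

Restricting the deep search to template-suggested slots keeps the exponential search fan-out in check.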


Outline

  • Loose Speak Overview

  • Two types of LS:

    • Complex Nominal interpretation

    • Metonymy

  • Future work on LS


Metonymy

  • Metonymy: a figure of speech in which "one entity [the metonym] is used to refer to another [the referent] that is related to it" [Lakoff & Johnson '80].

  • Example:

    • Joe read Shakespeare. It was good.

  • Metonymy resolution task: given an input expression denoting a piece of knowledge, identify any occurrence of metonymy, uncover the referent, and return the paraphrased version of the input expression.



Traditional Approaches [Fass '91][Hobbs '93][Markert & Hahn '97][Harabagiu '98]

  • Given an input (often in the form of sentences in natural language):

    • Detect metonymy based on violations of type constraints,

    • Resolve metonymy based on a search in metonymy space.

    • Anaphora is used to validate the result of the metonymy resolution.

Let's see what the metonymy space is ...


Metonymy Space

  • Metonymy space: the set of entities related to the metonym.

  • Metonymy space construction:

    • given the metonym A, return set S = {X | exists A-r1-A1-r2-A2- ...-rn-X} where r1, r2, ..., rn are members of a fixed set of slots, such as has-part, material, agent, result, etc., and A1, A2, ..., X are frames.

    • Given A = Shakespeare, S = {Shakespeare, His Head, His Text, ...} because

      • Shakespeare, r1= self

      • Shakespeare-has-part-His Head, r1= has-part

      • Shakespeare-agent-of-Write-result-His Text, r1 = agent-of, A1= Write, r2= result
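This construction is a bounded breadth-first search from the metonym over the fixed slot set. A sketch, with a toy graph standing in for the KB (the frame and slot names follow the Shakespeare example but are assumptions):

```python
from collections import deque

SLOTS = {"has-part", "agent-of", "result"}   # fixed slot set (toy subset)

GRAPH = {  # frame -> [(slot, neighbouring frame)] -- toy KB fragment
    "Shakespeare": [("has-part", "His-Head"), ("agent-of", "Write")],
    "Write": [("result", "His-Text")],
}

def metonymy_space(metonym, max_depth=3):
    """Return {frame: search depth} for every frame reachable from the
    metonym through slots in SLOTS, within max_depth steps."""
    space = {metonym: 0}            # the metonym itself (r1 = self, depth 0)
    queue = deque([(metonym, 0)])
    while queue:
        frame, depth = queue.popleft()
        if depth == max_depth:
            continue
        for slot, neighbour in GRAPH.get(frame, []):
            if slot in SLOTS and neighbour not in space:
                space[neighbour] = depth + 1
                queue.append((neighbour, depth + 1))
    return space
```

The recorded depth is the n in the A-r1-A1- ... -rn-X path, which later serves as one component of the distance measure.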

Let's step through a few examples ...



Given: “Joe read Shakespeare. It was good.”

Metonymy Example 1 (Traditional Approach)

  • Given an input (often in the form of sentences in natural language):

    • Detect metonymy based on violations of type constraints,

    • Resolve metonymy based on a search in metonymy space.

    • Anaphora is used to validate the result of the metonymy resolution.

  • Type constraints:

    • agent-of-read: Person.

    • object-of-read: Text

  • MetonymySpace = {Shakespeare, His Head, His Text ... }.

  • Selects His Text

  • The anaphor It confirms that His Text fits better than Shakespeare.



Given: “electrons are removed from water molecules.”

Metonymy Example 2 (Traditional Approach)

  • Given an input (often in the form of sentences in natural language):

    • Detect metonymy based on violations of type constraints,

    • Resolve metonymy based on a search in metonymy space.

    • Anaphora is used to validate the result of the metonymy resolution.

  • Type Constraints:

    • object-of-remove: Entity.

    • base-of-remove: Entity.

  • No violation found, no metonymy resolution needed.


Metonymy Example 2 (Continued)

  • However, the input, “Electrons are removed from water molecules”, does need metonymy resolution in our representation because:

    • Remove requires that the base have the object as a part; e.g., the water molecule should have a part called electron.

    • A water molecule does not have a part called electron. It has a hydrogen atom part, which has an electron part, and it has an oxygen atom part, which has an electron part.

  • Therefore the literal translation of the input does NOT work, and the traditional approach does NOT give the correct answer either.



Given (a Read with (object (Shakespeare))), translate it into: Read-object-Shakespeare

Our Approach

  • Given a KM expression that can be translated into X-r-Y, where X and Y are frames, and r is a slot:

    Do

    X' = X, Y' = Y

    I = Ideal(X, r)

    M = MetonymySpace(Y)

    Y = m such that m ∈ M and distance(m, I) ≤ distance(m', I) for all m' ∈ M

    I = Ideal(Y, r-inverse)

    M = MetonymySpace(X)

    X = m such that m ∈ M and distance(m, I) ≤ distance(m', I) for all m' ∈ M

    Until (X = X' and Y = Y')

    Return X'-r-Y'

  • 1st iteration

    • X’ = Read, Y’ = Shakespeare

  • I = (a Text with (purpose ((a Role with (in-event ((a Read)))))))

  • M = {Shakespeare, His Head, His Text, …}

  • Y = His Text

  • I = (an Event)

  • M = {Read}

  • X = Read

2nd iteration

Return: Read-object-His Text
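The loop above is a fixed-point iteration: each end of X-r-Y is re-fitted to the other end's ideal until neither changes. A sketch, where Ideal, MetonymySpace, and distance are stubbed with toy tables keyed to this Read/Shakespeare example (the real system derives them from the KB):

```python
# Toy stand-ins for the KB-derived functions (assumptions, not real data).
SPACE = {                         # MetonymySpace(frame)
    "Shakespeare": ["Shakespeare", "His-Head", "His-Text"],
    "His-Text": ["His-Text"],
    "Read": ["Read"],
}
IDEAL = {                         # Ideal(frame, slot)
    ("Read", "object"): "Text",
    ("Shakespeare", "object-of"): "Event",
    ("His-Text", "object-of"): "Event",
}
FIT = {                           # distance(member, ideal); lower is closer
    ("His-Text", "Text"): 0, ("Shakespeare", "Text"): 2,
    ("His-Head", "Text"): 2, ("Read", "Event"): 0,
}

def best(members, ideal):
    """Pick the metonymy-space member closest to the ideal."""
    return min(members, key=lambda m: FIT.get((m, ideal), 9))

def resolve(x, r, y):
    """Alternately re-fit Y and X until neither changes (fixed point)."""
    while True:
        x_prev, y_prev = x, y
        y = best(SPACE[y], IDEAL[(x, r)])           # fit Y to Ideal(X, r)
        x = best(SPACE[x], IDEAL[(y, r + "-of")])   # fit X to Ideal(Y, r-inverse)
        if (x, y) == (x_prev, y_prev):
            return (x, r, y)
```

Here r-inverse is spelled as the slot name plus "-of", an assumption about naming; the second iteration changes nothing, so the loop terminates.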


Ideal and Metonymy Space

  • Ideal:

    • Type constraints.

    • Add/delete/precondition list of the action.

    • Teleological constraints.

  • Metonymy Space: given the metonym A, return set S: {X | exists A-r1-A1-r2-A2- … -rn-X} where r1, r2, … , rn are members of a fixed set of slots, such as has-part, material, agent, result, etc., and A1, A2, … , X are frames.

  • Search depth: the n in the A-r1-A1-r2-A2- … -rn-X path mentioned above. For example, search depth of A = 0, search depth of A1= 1, etc.


Distance Measurement and Comparison

  • Distance (p, n, t): a measure of similarity between an element of the metonymy space and the ideal.

    • p: number of shared properties between the element and the ideal.

    • n: search depth of the element in the metonymy space.

    • t: taxonomical distance between the element and the ideal.

  • Given (p1, n1, t1) and (p2, n2, t2):

    • (p1, n1, t1) < (p2, n2, t2) if:

      • p1 > p2, or

      • p1 = p2 and n1 < n2, or

      • p1 = p2 and n1 = n2 and t1 < t2.
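This ordering is lexicographic comparison on the tuple (-p, n, t): more shared properties win first, then a shallower search depth, then a smaller taxonomic distance. A sketch, with hypothetical candidate values:

```python
def distance_key(p, n, t):
    """Map a (p, n, t) distance to a sortable key: smaller key = closer.
    Negating p makes 'more shared properties' sort first."""
    return (-p, n, t)

# Hypothetical candidates: (shared properties, search depth, taxonomic dist.)
candidates = {
    "His-Text": (3, 2, 0),
    "Shakespeare": (1, 0, 2),
    "His-Head": (1, 1, 3),
}
closest = min(candidates, key=lambda m: distance_key(*candidates[m]))
```

Python's tuple comparison gives exactly the three-tier rule above for free.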



Given (a Remove with (object ((a Electron))) (base ((a Water-Molecule)))), translate into Remove-object-Electron and Remove-base-Water-Molecule. Let’s consider Remove-base-Water-Molecule:

Metonymy Example 2 (Continued)

  • Given a KM expression that can be translated into X-r-Y, where X and Y are frames, and r is a slot:

    Do

    X' = X, Y' = Y

    I = Ideal(X, r)

    M = MetonymySpace(Y)

    Y = m such that m ∈ M and distance(m, I) ≤ distance(m', I) for all m' ∈ M

    I = Ideal(Y, r-inverse)

    M = MetonymySpace(X)

    X = m such that m ∈ M and distance(m, I) ≤ distance(m', I) for all m' ∈ M

    Until (X = X' and Y = Y')

    Return X'-r-Y'

  • 1st iteration:

    • X’ = Remove, Y’ = Water-Molecule

  • I = (a Tangible-Entity with ;; type constraint

  • (purpose ((a Role with (in-event ((a Remove)))))) ;; teleological constraint

  • (has-part ((a Electron)))) ;; del-list

  • M = {Water-Molecule, Oxygen-Atom, Hydrogen-Atom, Electron, …}

  • Y = Oxygen-Atom

  • I = (an Event)

  • M = {Remove}

  • X = Remove

  • 2nd iteration:

  • Return: (a Remove with (object ((a Electron))) (base ((a Oxygen-Atom with (is-part-of ((a Water-Molecule)))))))



Given (a Nucleus with (content ((a Object)))), translate it into Nucleus-content-Object.

Metonymy Example 3

  • Given a KM expression that can be translated into X-r-Y, where X and Y are frames, and r is a slot:

    Do

    X' = X, Y' = Y

    I = Ideal(X, r)

    M = MetonymySpace(Y)

    Y = m such that m ∈ M and distance(m, I) ≤ distance(m', I) for all m' ∈ M

    I = Ideal(Y, r-inverse)

    M = MetonymySpace(X)

    X = m such that m ∈ M and distance(m, I) ≤ distance(m', I) for all m' ∈ M

    Until (X = X' and Y = Y')

    Return X'-r-Y'

1st iteration:

X' = Nucleus, Y' = Object

I = (a Tangible-Entity)

M = {Object}

Y = Object

I = (a Container)

M = {Nucleus, Container, Nucleoplasm, ...}

X = Container

2nd iteration:

Return: (a Nucleus with (purpose ((a Container with (content ((a Object)))))))



Given (a Catalyze with (instrument ((a Mitochondrion)))), translate it into Catalyze-instrument-Mitochondrion.

Metonymy Example 4

  • Given a KM expression that can be translated into X-r-Y, where X and Y are frames, and r is a slot:

    Do

    X' = X, Y' = Y

    I = Ideal(X, r)

    M = MetonymySpace(Y)

    Y = m such that m ∈ M and distance(m, I) ≤ distance(m', I) for all m' ∈ M

    I = Ideal(Y, r-inverse)

    M = MetonymySpace(X)

    X = m such that m ∈ M and distance(m, I) ≤ distance(m', I) for all m' ∈ M

    Until (X = X' and Y = Y')

    Return X'-r-Y'

  • 1st iteration:

  • X' = Catalyze, Y' = Mitochondrion

  • I = (a Chemical-Object with (purpose ((a Catalyst))))

  • M = {Mitochondrion, Container, Aggregate, Oxido-Reductase, ...}

  • Y = Oxido-Reductase

  • I = (an Event)

  • M = {Catalyze}

  • X = Catalyze

  • 2nd iteration:

  • ...

  • Return: (a Catalyze with (instrument ((a Oxido-Reductase with (element-type-of ((a Aggregate with (content-of ((a Be-Contained with (in-event-of ((a Container with (purpose-of ((a Mitochondrion))))))))))))))))


Future Work on Metonymy

  • Test data is bounded by the KB; more data is needed for evaluation.

  • Other applications of the metonymy resolution algorithm.


Other Applications of Metonymy Resolution

  • Shields SMEs from the idiosyncrasies of the representation:

    • roles,

    • spatial representations,

    • aggregates,

    • properties.

      • E.g. instead of (a Car with (color ((a Color-Value with (value (:pair *red Object)))))), do (a Car with (color (*red))) directly

  • LS generation for concise display of the knowledge to users.


Outline

  • Loose speak overview

  • Two types of LS:

    • Complex nominal interpretation

    • Metonymy

  • Future work on LS


Future Work on LS Project

  • Discover more patterns of LS.

    • Overly general speak: stating knowledge in an overly general way, often using a general concept in the place of a specific one.

    • Example: "there may be 15 [RNA] polymerases speeding along the same stretch of DNA ..."

  • More extensive evaluations.

  • Explore the process of theory and validation.

