A Short Guide to the Meaning-Text Linguistic Theory

Jasmina Milićević
Dalhousie University, Halifax (Canada)


2006, Journal of Koralex, vol. 8: 187-233



Contents

  • 0. Introduction (1-2)

  • Postulates and methodological principle (2-4)

  • Meaning-Text models (4-6)

  • Illustration of the linguistic synthesis in the Meaning-Text framework (6-27)

  • Summary of MTT’s main features (27-30)

  • Basic Meaning-Text bibliography (30-36)


0. Introduction

  • MTT = theoretical framework for the construction of models of languages

  • Launched in Moscow (Žolkovskij & Mel’čuk 1967)

  • Developed in Russia, Canada, Europe

  • Formal character → computer applications

  • Relatively marginal


1. Postulate 1

  • “Natural language is (considered as) a many-to-many correspondence between an infinite denumerable set of meanings and an infinite denumerable set of texts.” (2)

    {SemRᵢ} <=language=> {PhonRⱼ} | 0 < i, j < ∞


Postulate 2

  • “The Meaning-Text correspondence is described by a formal device which simulates the linguistic activity of the native speaker—a Meaning-Text Model.”(3)


Postulate 3

  • “Given the complexity of the Meaning-Text correspondence, intermediate levels of (utterance) representation have to be distinguished: more specifically, a Syntactic and a Morphological level.” (3)

Methodological principle


  • “The Meaning-Text correspondence should be described in the direction of synthesis, i.e., from Meaning to Text (rather than in that of analysis, i.e., from Text to Meaning).” (3)



  • Producing speech is an activity that is more linguistic than understanding speech;

  • Some linguistic phenomena can be discovered only from the viewpoint of synthesis (ex: lexical co-occurrence = collocations).

    • Corollary:

      • study of paraphrases (and lexicon) occupies a central place in the M-T framework.



  • Synonymy = fundamental semantic relation in natural language → “to model a language means to describe its synonymic means and the ways it puts them in use”.

  • Meaning = invariant of paraphrases

  • Text = “virtual paraphrasing”

  • Lexical paraphrase → semantic decomposition of lexical meanings


Semantic decomposition of ‘criticize’

  • (definiendum): ‘X criticizes Y for Z’

  • ≈ (definiens):

    • ‘Y having done₂.₁ Z which X considers₂ bad₂ for Y or other people₁,

    • and X believing₃ that X has good₁.₁ reasons₁.₂ for considering₂ Z bad₂, ||

    • X expresses₃.₁ X’s negative₁.₁ opinion₁ of Y because of Z(Y),

    • specifying what X considers₂ bad₂ about Z,

    • with the intention₂ to cause₂ that people₁ (including Y) do not do₂.₁ Z.’


2. Meaning-Text Models: Characteristics

  • Equative = transductive (≠ generative) (Postulate 1)

  • Completely formalized (Postulate 2)

  • Stratificational model (Postulate 3)


MTM Architecture




[Figure: MTM architecture diagram. Source: Neuvel.net, adapted from Mel’čuk 1988: 49]


2. MTM: peripheral structures

  • Reflect different characterizations of the central entity = provide additional information relevant at each level.

  • Peripheral: they do not exist independently of the central structure.

  • Purpose: to articulate the SemS into a specific message, by specifying the way it will be ‘packaged’ for communication.


Central and peripheral S / level of R

  • SemR = <SemS, Sem-CommS, RhetS, RefS> (see the sketch below)

  • DSyntR = <DSyntS, DSynt-CommS, DSynt-ProsS, DSynt-AnaphS>

  • SSyntR = <SSyntS, SSynt-CommS, SSynt-ProsS, SSynt-AnaphS>

  • DMorphR = <DMorphS, DMorph-ProsS>
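
To make the tuple notation concrete, here is a minimal Python sketch of a level of representation as a container grouping one central structure with its peripheral structures. The field names mirror the list above; the container encoding itself and the abstract field types are assumptions for illustration, not part of MTT.

```python
# Minimal sketch (assumed encoding): a level of representation as a container
# holding its central structure plus the peripheral structures listed above.
from dataclasses import dataclass
from typing import Any

@dataclass
class SemR:
    sem_s: Any        # central: semantic structure (a network)
    sem_comm_s: Any   # peripheral: semantic-communicative structure
    rhet_s: Any       # peripheral: rhetorical structure
    ref_s: Any        # peripheral: referential structure

@dataclass
class DMorphR:
    dmorph_s: Any       # central: deep-morphological structure (ordered lexeme string)
    dmorph_pros_s: Any  # peripheral: deep-morphological prosodic structure
```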


2. MTM: rules


3. Illustration: Linguistic Synthesis

  • Synthesis: 1 SemR (× 2) → 3 PhonR (× 2)

  • SemR [1]: Theme = media → PhonR (1 a, b, c)

  • SemR [2]: Theme = decision → PhonR (2 a, b, c)


SemR’s central structure = SemS

  • A SemS represents the propositional meaning of a set of paraphrases.

  • SemS = network: nodes and arcs

  • Nodes: labeled with semantemes.

  • Arcs: labeled with numbers (predicate-argument relations); see the sketch below.
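
As a concrete illustration, here is a minimal Python sketch of a SemS as a labeled network, loosely based on the propositional meaning shared by sentences (1a-c) further below. The adjacency-list encoding and the exact semanteme inventory are illustrative assumptions, not the paper's actual graph.

```python
# Minimal sketch (assumed encoding): a SemS as a labeled network.
# Nodes = semantemes; arcs = numbered predicate-argument relations.
from typing import Dict, List, Tuple

# Each semanteme maps to its arguments: (arc number, target semanteme).
sems: Dict[str, List[Tuple[int, str]]] = {
    "'criticize'":  [(1, "'media'"), (2, "'government'"), (3, "'decide'")],
    "'intense'":    [(1, "'criticize'")],        # later realized by the LF Magn
    "'decide'":     [(1, "'government'"), (2, "'increase'")],
    "'increase'":   [(1, "'government'"), (2, "'taxes'")],
    "'media'":      [],
    "'government'": [],
    "'taxes'":      [],
}

def arguments(pred: str) -> List[Tuple[int, str]]:
    """Return the numbered arguments of a predicate semanteme."""
    return sems.get(pred, [])

print(arguments("'criticize'"))   # [(1, "'media'"), (2, "'government'"), (3, "'decide'")]
```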


SemS (example)


Peripheral structure Sem-CommS

  • Sem-CommS represents the communicative intent of the Speaker.

  • Formally, Sem-CommS = division of the SemS into communicative areas, each marked with one of a set of mutually exclusive values (an illustrative encoding follows the list of oppositions below).


Eight communicative oppositions

  • Thematicity = {Theme, Rheme, Specifier}

  • Givenness = {Given, New}

  • Focalization = {Focalized, Non-Focalized}

  • Perspective = {Backgrounded, Foregrounded, Neutral}

  • Emphasis = {Emphasized, Neutral}

  • Assertiveness = {Asserted, Presupposed}

  • Unitariness = {Unitary, Articulated}

  • Locutionality = {Communicated, Signaled, Performed}
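
A minimal sketch of how a Sem-CommS could be encoded over the SemS sketched earlier, assuming a simple mapping from oppositions to value-marked areas; only Thematicity is filled in, following the Theme = 'media' choice of sentences (1a-c).

```python
# Minimal sketch (assumed encoding): a Sem-CommS as an assignment of communicative
# values to areas (sets of semantemes) of the SemS; only Thematicity is shown.
sem_comm_1 = {
    "Thematicity": {
        "Theme": {"'media'"},
        "Rheme": {"'criticize'", "'intense'", "'government'",
                  "'decide'", "'increase'", "'taxes'"},
    },
    # The other oppositions (Givenness, Focalization, Perspective, Emphasis,
    # Assertiveness, Unitariness, Locutionality) would each mark further areas
    # with exactly one of their mutually exclusive values.
}
```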


Other peripheral Sem-structures

  • Sem-RhetS represents the Speaker’s rhetorical intent.

  • Sem-RefS = set of pointers from semantic configurations to the corresponding entities in the real world.


Theme: media

  • a. [The media]T [harshly criticized the Government for its decision to increase income taxes]R

    b. [The media]T [seriously criticized the Government for its decision to raise income taxes]R

    c. [The media]T [leveled harsh criticism at the Government for its decision to increase income taxes]R


Theme = Media


Theme = government’s decision

  • a. [The government’s decision to increase income taxes]T [was severely criticized by the media]R

    b. [The government’s decision to raise income taxes]T [drew harsh criticism from the media]R

    c. [The government’s decision to increase income taxes]T [came under harsh criticism from the media]R


Theme = government’s decision

Syntactic dependency


  • Relation of strict hierarchy

  • Characteristics:

    • Antireflexive

    • Antisymmetric

    • Antitransitive


Syntactic structure

  • Tree

  • Nodes labeled with lexical units; not linearly ordered

  • Top node does not depend on any lexical unit in the structure, while all other units depend on it, directly or indirectly.

  • Arcs (= branches) labeled with dependency relations



Deep-Syntactic Structure (DSyntS)

  • Nodes: labeled with deep lexical units (≠ pronouns and ‘structural words’) subscripted for all meaning-bearing inflections.

  • Branches: labeled with names of deep-syntactic dependency relations.

  • Deep lexical unit = lexeme, (full) phraseme or name of a lexical function (see the sketch below).
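
The following Python sketch shows one way such an unordered dependency tree could be encoded. The tree given for (1a) is only a rough approximation: the inflectional values, the actant numbering under DECISION, and the treatment of the LF nodes (Magn, CausPredPlus, introduced on the next slides) are assumptions, not the paper's exact DSyntS.

```python
# Minimal sketch (assumed encoding): a dependency tree with unordered nodes.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Node:
    lexeme: str                        # deep lexical unit, phraseme, or LF name
    inflections: Tuple[str, ...] = ()  # meaning-bearing inflectional values
    deps: List[Tuple[str, "Node"]] = field(default_factory=list)  # (relation, dependent)

# Rough, illustrative approximation of the DSyntS of (1a).
dsynt_1a = Node("CRITICIZE", ("ind", "past"), [
    ("I",    Node("MEDIA", ("pl", "def"))),
    ("II",   Node("GOVERNMENT", ("sg", "def"))),
    ("III",  Node("DECISION", ("sg", "def"), [
        ("I",  Node("GOVERNMENT", ("sg", "def"))),   # later pronominalized as 'its'
        ("II", Node("CausPredPlus", (), [            # LF node, realized as 'increase'
            ("II", Node("TAX", ("pl",))),            # 'income taxes', simplified
        ])),
    ])),
    ("ATTR", Node("Magn", ())),                      # LF node, realized as 'harshly'
])
```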


Lexical functions

  • LFs = formal tools used to model lexical relations, i.e., restricted lexical co-occurrence (= collocations) and semantic derivation. They have different lexical expressions contingent on the keyword.

  • An LF corresponds to a meaning whose expression is phraseologically bound by a particular lexeme L (= the argument of the LF); a toy encoding is sketched after the examples below.


Lexical functions: examples

  • Magn ‘intense/very’

    • Magn(wind) = strong, powerful

    • Magn(rain(N)) = heavy, torrential // downpour

    • Magn(rain(V)) = heavily, cats and dogs

  • S1 ‘person/objectdoing L’

    • S1(crime) = author, perpetrator [of ART ˷ ] // criminal

    • S1(kill) = killer
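
A minimal sketch of how such lexical functions could be stored and queried, using only the example values given above; the dictionary representation and the lf helper are assumptions for illustration, not an MTT formalism.

```python
# Minimal sketch (assumed encoding): lexical functions as a table indexed by
# LF name and keyword, using the example values from this slide.
from typing import Dict, List

LF_TABLE: Dict[str, Dict[str, List[str]]] = {
    "Magn": {                               # 'intense/very'
        "wind":    ["strong", "powerful"],
        "rain(N)": ["heavy", "torrential", "//downpour"],   # // marks a fused value
        "rain(V)": ["heavily", "cats and dogs"],
    },
    "S1": {                                 # 'person/object doing L'
        "crime": ["author", "perpetrator", "//criminal"],
        "kill":  ["killer"],
    },
}

def lf(name: str, keyword: str) -> List[str]:
    """Return the values of lexical function `name` for `keyword`."""
    return LF_TABLE.get(name, {}).get(keyword, [])

print(lf("Magn", "wind"))   # ['strong', 'powerful']
```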


Lexical functions: classification

  • According to their capacity to appear in the text alongside the keywords: syntagmatic (normally do) and paradigmatic (normally do not)

  • According to their generality/universality: standard (general/universal) and non-standard (neither general nor universal)

  • According to their formal structure: simple and complex



  • Magn: syntagmatic, standard, simple LF

  • S1: paradigmatic, standard, simple LF

  • A YEAR that has 366 days = leap [~]: a non-standard LF: it only applies to one keyword (year) and has just one value (leap); not universal (not valid cross-linguistically)

  • CausPredPlus: complex LF


LFs realized in (1) and (2)

  • Magn(criticize) = bitterly, harshly, seriously, strongly // blast

  • Magn(criticism) = bitter, harsh, serious, severe, strong

  • CausPredPlus(taxes) = increase, raise

  • S0(criticize) = criticism

  • S0(decide) = decision

  • Oper1(criticism) = level [~ at N | N denotes a person], raise [~ against N], voice [~]

  • Oper2(criticism) = come [under ~], draw [~ from N], meet [with ~]


Deep lexical units

  • Do not correspond one-to-one to the surface lexemes: in the transition towards surface syntax, some deep lexical units may get deleted or pronominalized and some surface lexemes may be added.


12 Deep-Syntactic Relations

  • 6 actantial DSyntRels (I, II, III, …, VI) + 1 DSyntRel for representing direct speech (= variant of DSyntRel II)

  • 2 attributive DSyntRels: ATTRrestr(ictive) and ATTRqual(ificative)

  • 1 appenditive DSyntRel (APPEND): links the Main Verb to ‘extra-structural’ sentence elements (sentential adverbs, interjections, …)

  • 2 coordinative DSyntRels: COORD and QUASI-COORD


DSyntR – (1a)


DSyntR – (1b)


DSyntR – (1c)


Semantic module: correspondence rules

  • Lexicalization rules

  • Morphologization rules

  • Arborization rules

  • Communicative rules

  • Prosodic rules


SemR [1] → DSyntRs (1a) and (1b)


Semantic module: equivalence rules

  • = paraphrasing rules

  • Semantic equivalence rules → equivalence between (fragments of) 2 SemRs

  • Lexico-syntactic rules: formulated in terms of lexical functions → equivalence between (fragments of) 2 DSyntRs.


Ex.: lexico-syntactic equivalence rule


From DSyntR to SSyntR: the Deep-Syntactic module

  • SSyntS: dependency tree; nodes labeled with actual lexemes; branches labeled with names of language-specific surface-syntactic dependency relations.

  • DSyntS ≠ SSyntS (contrasted in the sketch below):

    • Lexically: only semantically full lexemes vs. all lexemes (including full and structural words + pronouns)

    • Syntactically: only universal dependency relations vs. language-specific dependency relations
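
To make the contrast tangible, here is a small sketch of the two labelings for a fragment of (1a); the surface-syntactic relation names (subjectival, direct-objectival, determinative, adverbial) are typical MTT-style labels assumed for illustration, not quoted from the paper.

```python
# Minimal sketch contrasting DSynt and SSynt labeling for
# "The media harshly criticized the Government ..." (1a).

# DSyntS fragment: only full lexemes / LF names, universal relations (I, II, ATTR).
dsynt_fragment = ("CRITICIZE", [
    ("I",    "MEDIA"),
    ("II",   "GOVERNMENT"),
    ("ATTR", "Magn"),            # LF node, realized as 'harshly'
])

# SSyntS fragment: all actual lexemes (including 'the'),
# language-specific relations (names assumed).
ssynt_fragment = ("CRITICIZE", [
    ("subjectival",       ("MEDIA", [("determinative", "THE")])),
    ("direct-objectival", ("GOVERNMENT", [("determinative", "THE")])),
    ("adverbial",         "HARSHLY"),
])
```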


DSyntR / SSyntR (1a)


SSyntR (1b)


SSyntR (1c)


Deep-Syntactic module: major types of rules

  • Phrasemic rules

  • Deep-Syntactic rules

  • Pronominalization rules

  • Ellipsis rules

  • Communicative rules

  • Prosodic rules


6 phrasemic rules (1a-c); a toy application is sketched after the list

  • SSyntS (1a)

    • 1) Magn(CRITICIZE) <=> harshly;

    • 2) CausPredPlus(TAXES) <=> increase

  • SSyntS (1b)

    • 3) Magn(CRITICIZE) <=> seriously;

    • 4) CausPredPlus(TAXES) <=> raise

  • SSyntS (1c)

    • 5) Oper1(CRITICISM) <=> level;

    • 6) Magn(CRITICISM) <=> harsh
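
Here is a toy Python encoding of rules 1-6 above, treating a phrasemic rule as the replacement of an LF node of the DSyntS by one of its values in the SSyntS. Indexing the choice by the sentence variant (1a/1b/1c) is purely an illustrative device; in the model the choice among LF values is made by the paraphrasing and constraint mechanisms.

```python
# Toy sketch (assumed encoding): phrasemic rules as LF-realization mappings
# applied in the DSyntS -> SSyntS transition; data taken from rules 1-6 above.
PHRASEMIC_RULES = {
    ("Magn",         "CRITICIZE", "1a"): "harshly",
    ("CausPredPlus", "TAXES",     "1a"): "increase",
    ("Magn",         "CRITICIZE", "1b"): "seriously",
    ("CausPredPlus", "TAXES",     "1b"): "raise",
    ("Oper1",        "CRITICISM", "1c"): "level",
    ("Magn",         "CRITICISM", "1c"): "harsh",
}

def realize(lf_name: str, keyword: str, variant: str) -> str:
    """Replace an LF node of the DSyntS by an actual lexeme of the SSyntS."""
    return PHRASEMIC_RULES[(lf_name, keyword, variant)]

print(realize("Magn", "CRITICIZE", "1a"))   # harshly
```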


Constraints: examples

  • (3) a. The media raised harsh criticism against the Government for its decision to impose higher taxes. / The media leveled harsh criticism at the Government for its decision to impose higher taxes.

  • b. The media raised harsh criticism against the Government’s decision to impose higher taxes. vs. *The media leveled harsh criticism at the Government’s decision to impose higher taxes.

  • (4) ?The media raised harsh criticism against the Government for its decision to raise taxes.


DSynt-rule 1 (1a – 1b)


DSynt-rule 2 (1a-1b)


From SSyntR to DMorphR: the Surface-Syntactic Module

  • DMorphS = string of fully ordered lexemes subscripted with all inflectional values (see the sketch below)

  • DMorph-ProsS = specification of semantically + syntactically induced prosodies
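
A minimal sketch of a DMorphS as Python data, transcribing sentence (1b) from the next slide as a fully ordered list of (lexeme, inflectional values) pairs; the pair format is an assumed encoding, and the prosodic boundary marks are left out.

```python
# Minimal sketch (assumed encoding): a DMorphS as an ordered list of lexemes,
# each paired with its inflectional values (sentence (1b), simplified).
dmorph_1b = [
    ("THE", ()), ("MEDIA", ("pl",)),
    ("SERIOUSLY", ()),
    ("CRITICIZE", ("act", "ind", "past")),
    ("THE", ()), ("GOVERNMENT", ("sg", "possessive")),
    ("DECISION", ("sg",)),
    ("TO", ()), ("RAISE", ("inf",)),
    ("INCOME", ("sg",)), ("TAX", ("pl",)),
]
```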


DMorphRs (1)

  • Sentence (1a)


  • Sentence (1b)

    • THE MEDIA(pl) || SERIOUSLY CRITICIZE(act, ind, past, 3(?)sg) THE GOVERNMENT(sg, possessive) DECISION(sg) | TO RAISE(inf) INCOME(sg) TAX(pl) |||

  • Sentence (1c)



SSynt-module: major types of rules

  • Linearization rules (a toy local linearization is sketched after this list)

    • Local (and semi-local):

      (5) a. [the government’s]elementary.ph. [decision]elementary.ph. [to increase]elementary.ph. [taxes]elementary.ph.

      b. [[the Government’s decision]complex ph. [to increase taxes]complex ph. ]complex ph.

    • Global

  • Morphologization rules

  • Prosodization rules
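
As an illustration of a local linearization rule, here is a toy sketch that orders a head and its immediate dependents into an elementary phrase, in the spirit of example (5a); the left/right placement table and the relation names are assumptions, not the actual SSynt-rules of the paper.

```python
# Toy sketch (assumed rules): local linearization of a head with its dependents
# into an elementary phrase; determiners/possessors are placed to the left.
from typing import List, Tuple

PLACE_LEFT = {"determinative", "possessive"}   # assumed relation names

def linearize_local(head: str, deps: List[Tuple[str, str]]) -> List[str]:
    """Order a head and its immediate dependents into an elementary phrase."""
    left  = [word for rel, word in deps if rel in PLACE_LEFT]
    right = [word for rel, word in deps if rel not in PLACE_LEFT]
    return left + [head] + right

print(linearize_local("decision", [("possessive", "the government's")]))
# ["the government's", 'decision']
```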


Example: local linearization rule (1c)


4. Main features of the MTT

  • Globality, descriptive orientation

  • Semantic bases and synthesis orientation, essential role of the paraphrase and of communicative organization

  • Strong emphasis on the lexicon

  • Relational approach to language: the use of dependencies at all levels of linguistic description

  • Formal character

  • Stratificational and modular organization of MTMs

  • Implementability: the MTT lends itself well to computer applications


5.7 Computational Linguistics and NLP Applications

  • Apresjan, Ju. et al. (2003). ETAP-3 Linguistic Processor: A Full-Fledged Implementation of the MTT. In: Kahane, S. & Nasr, A., eds. (2003), 279-288.

    • (1992). Lingvističeskii processor dlja složnyx informacionnyx sistem [A Linguistic Processor for Complex Information Systems]. Moskva: Nauka.

    • (1989). Lingvističeskoe obespečenie sistemy ÈTAP-2 [Linguistic Software for the System ETAP-2]. Moskva: Nauka.

  • Apresjan, Ju. & Tsinman, L. (1998). Perifrazirovanie na kompjutere [Paraphrasing on the Computer]. Semiotika i informatika 36, 177-202.

  • Boguslavskij, I., Iomdin, L. & Sizov, V. (2004). Multilinguality in ETAP-3: Reuse of Linguistic Resources. In: Proceedings of the Conference Multilingual Linguistic Resources, 20th International Conference on Computational Linguistics, Geneva 2004, 7-14.


5.7 Computational Linguistics and NLP Applications

  • Boyer, M. & Lapalme, G. (1985). Generating Paraphrases from Meaning-Text Semantic Networks. Montreal: Université de Montréal.

  • CoGenTex (1992). Bilingual Text Synthesis System for Statistics Canada Database Reports: Design of Retail Trade Statistics (RTS) Prototype. Technical Report 8. CoGenTex Inc., Montreal.

  • Iordanskaja, L., Kim, M., Kittredge, R., Lavoie, B. & Polguère, A. (1992). Generation of Extended Bilingual Statistical Reports. In: COLING-92, Nantes, 1019-1022.

  • Iordanskaja, L., Kim, M. & Polguère, A. (1996). Some Procedural Problems in the Implementation of Lexical Functions for Text Generation. In: Wanner, L., ed., (1996), 279-297.


5.7 Computational Linguistics and NLP Applications

  • Iordanskaja, L., Kittredge, R. & Polguère, A. (1991). Lexical Selection and Paraphrase in a Meaning-Text Generation Model. In: Paris, C. L., Swartout, W. R. & Mann, W. C., eds., Natural Language Generation in Artificial Intelligence and Computational Linguistics. Boston: Kluwer, 293-312.

  • Iordanskaja, L. & Polguère, A. (1988). Semantic Processing for Text Generation. In: Proceedings of the First International Computer Science Conference-88, Hong Kong, 19-21 December 1988, 310-318.

  • Kahane, S. & Mel’čuk, I. (1999). Synthèse des phrases à extraction en français contemporain (Du graphe sémantique à l’arbre de dépendance). T.A.L., 40:2, 25-85.

  • Kittredge, R. (2002). Paraphrasing for Condensation in Journal Abstracting. Journal of Biomedical Informatics 35:4, 265-277.



  • MILIĆEVIĆ, Jasmina (2006): « A Short Guide to the Meaning-Text Linguistic Theory », Journal of Koralex, vol. 8: 187-233.

  • NEUVEL, Sylvain: Linguistic Theories > Meaning-Text Linguistics > Introduction <http://www.neuvel.net/meaningtext.htm> (8/5/2011)
