
Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank

Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank. Richard Socher, Alex Perelygin, Jean Y. Wu, Jason Chuang, Christopher D. Manning, Andrew Y. Ng and Christopher Potts. Presented by Ben King for the NLP Reading Group, November 13, 2013.



Presentation Transcript


  1. Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank Richard Socher, Alex Perelygin, Jean Y. Wu, Jason Chuang, Christopher D. Manning, Andrew Y. Ng and Christopher Potts Presented by Ben King for the NLP Reading Group, November 13, 2013 (some material borrowed from the NAACL 2013 Deep Learning tutorial)

  2. Sentiment Treebank

  3. Need for a Sentiment Treebank • Almost all work on sentiment analysis has used word-order-independent methods • But many papers acknowledge that sentiment interacts with syntax in complex ways • Little work has been done on these interactions because they’re very difficult to learn • Single-sentence sentiment classification accuracy has languished at ~80% for a long time

  4. Goal of the Sentiment Treebank • At every level of the parse tree, annotate the sentiment of the phrase it subsumes • Use a 5-class scheme (--, -, 0, +, ++)

  5. Construction of the Sentiment Treebank • For 11,855 sentences, parse and break into phrases (215,154 total) • The sentiment of each phrase is annotated with Mechanical Turk
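The five classes come from discretizing the annotators' ratings. As a rough illustration only (the slider granularity and the cutoffs below are assumptions, not details from the slides), a continuous rating could be bucketed like this:

```python
# Hypothetical sketch: map a continuous annotator rating in [0, 1] onto the
# 5-class scheme (--, -, 0, +, ++). The equal-width cutoffs below are an
# illustrative assumption, not the paper's exact thresholds.
def rating_to_class(rating):
    classes = ["--", "-", "0", "+", "++"]
    cutoffs = [0.2, 0.4, 0.6, 0.8]
    for label, cutoff in zip(classes, cutoffs):
        if rating <= cutoff:
            return label
    return classes[-1]

print(rating_to_class(0.1))    # '--'
print(rating_to_class(0.55))   # '0'
print(rating_to_class(0.95))   # '++'
```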

  6. Construction of the Sentiment Treebank

  7. Deep Learning in NLP • Deep learning’s biggest victory in NLP has been to create good word representations • Instead of representing a word as a sparse vector, deep learning gives us dense vectors • These representations also encode a surprising amount of semantic information
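A minimal sketch of the sparse-versus-dense contrast described above (the vocabulary, dimensionality, and values are made up for illustration; a trained model would supply the embedding weights):

```python
import numpy as np

# Tiny illustrative vocabulary; real systems use tens of thousands of words.
vocab = {"movie": 0, "great": 1, "boring": 2}

# Sparse one-hot representation: one dimension per vocabulary word.
one_hot = np.zeros(len(vocab))
one_hot[vocab["great"]] = 1.0              # [0., 1., 0.]

# Dense representation: a learned d-dimensional embedding per word
# (here d = 4 and the values are random stand-ins for learned weights).
d = 4
embeddings = np.random.randn(len(vocab), d) * 0.1
great_vec = embeddings[vocab["great"]]     # dense, low-dimensional vector

# Semantic similarity then shows up as geometric similarity between vectors.
def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(embeddings[vocab["great"]], embeddings[vocab["boring"]]))
```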

  8. Parsing with Deep Learning • Goals: • Combine word vectors into meaningful phrase vectors • Preserve word-order information

  9. Parsing with Deep Learning • At an abstract level, we have a neural network that for each pair of words gives us: • A vector that represents their combination • A plausibility score
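A minimal sketch of one such network, in the spirit of Socher-style recursive models: the parent vector is a nonlinearity applied to a linear map of the concatenated children, and a separate scoring vector rates how plausible the merge is. The weights here are random placeholders rather than trained parameters.

```python
import numpy as np

d = 10                                      # dimensionality of word/phrase vectors
rng = np.random.default_rng(0)
W = rng.standard_normal((d, 2 * d)) * 0.1   # composition weights (placeholder values)
b = np.zeros(d)                             # composition bias
u = rng.standard_normal(d) * 0.1            # scoring vector (placeholder values)

def compose(left, right):
    """Combine two child vectors into a parent vector plus a plausibility score."""
    parent = np.tanh(W @ np.concatenate([left, right]) + b)
    score = float(u @ parent)               # higher score = more plausible merge
    return parent, score

a, c = rng.standard_normal(d), rng.standard_normal(d)
parent, score = compose(a, c)               # parent has shape (d,), same as its children
```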

  10. Parsing with Deep Learning • All consecutive pairs of words are examined

  11. Parsing with Deep Learning • The most plausible pair is combined • We then start the process again
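Putting the last two slides together, the greedy parsing loop can be sketched as follows: score every adjacent pair, merge the highest-scoring one, and repeat until a single vector covers the sentence. The compose/score network below is again a random stand-in for a trained one.

```python
import numpy as np

d = 10
rng = np.random.default_rng(1)
W = rng.standard_normal((d, 2 * d)) * 0.1   # untrained placeholder weights
u = rng.standard_normal(d) * 0.1

def compose(left, right):
    parent = np.tanh(W @ np.concatenate([left, right]))
    return parent, float(u @ parent)        # (combined vector, plausibility score)

def greedy_parse(word_vectors):
    """Repeatedly merge the most plausible adjacent pair until one vector remains."""
    nodes = list(word_vectors)
    while len(nodes) > 1:
        # Examine every consecutive pair of current nodes.
        candidates = [compose(nodes[i], nodes[i + 1]) for i in range(len(nodes) - 1)]
        best = max(range(len(candidates)), key=lambda i: candidates[i][1])
        # Replace the winning pair with its parent vector and start again.
        nodes[best:best + 2] = [candidates[best][0]]
    return nodes[0]

sentence = [rng.standard_normal(d) for _ in range(5)]   # five dummy word vectors
root_vector = greedy_parse(sentence)                    # vector for the whole "sentence"
```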

  12. Parsing with Deep Learning

  13. Pros and Cons of this Approach

  14. Matrix Vector RNN (MV-RNN) • Each word has both • An associated vector (its meaning) • An associated matrix (its own composition function) This is a good idea, but in practice it’s way too many parameters to learn. If the vectors are d-dimensional, then every word has (d+1)×d parameters.
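A rough sketch of the matrix-vector composition (following Socher et al.'s MV-RNN formulation; the weights and dimensions are illustrative): each word carries a vector and a matrix, each child's matrix is applied to the other child's vector, and the results are combined. With d-dimensional vectors, each word costs d (vector) + d×d (matrix) = (d+1)×d parameters, which is why the model is so heavy.

```python
import numpy as np

d = 10
rng = np.random.default_rng(2)
W  = rng.standard_normal((d, 2 * d)) * 0.1      # combines the two transformed vectors
Wm = rng.standard_normal((d, 2 * d)) * 0.1      # combines the two word matrices

def mv_compose(a, A, b, B):
    """MV-RNN-style composition: each word is a (vector, matrix) pair."""
    # Each child's matrix acts on the other child's vector before combining.
    parent_vec = np.tanh(W @ np.concatenate([B @ a, A @ b]))
    # The parent also needs its own matrix, built from the children's matrices.
    parent_mat = Wm @ np.vstack([A, B])          # (d, 2d) @ (2d, d) -> (d, d)
    return parent_vec, parent_mat

# Per-word parameters: a d-vector plus a d x d matrix = (d + 1) x d values.
a, A = rng.standard_normal(d), np.eye(d) + rng.standard_normal((d, d)) * 0.01
b, B = rng.standard_normal(d), np.eye(d) + rng.standard_normal((d, d)) * 0.01
vec, mat = mv_compose(a, A, b, B)
```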

  15. Recursive Neural Tensor Network (RNTN) • At a high level: • The composition function is global (a single tensor shared by all words), which means far fewer parameters to learn • In the same way that similar words have similar vectors, this lets similar words have similar composition behavior
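A minimal sketch of the tensor composition (the functional form follows the RNTN; the weights are random placeholders): the child vectors are stacked into h = [a; b], each output dimension i is the bilinear form hᵀVᵢh plus a standard linear term, and the single tensor V is shared across all words instead of the MV-RNN's per-word matrices.

```python
import numpy as np

d = 10
rng = np.random.default_rng(3)
V = rng.standard_normal((d, 2 * d, 2 * d)) * 0.01   # one 2d x 2d slice per output dimension
W = rng.standard_normal((d, 2 * d)) * 0.1           # ordinary linear composition term

def rntn_compose(a, b):
    """Tensor composition: bilinear term plus linear term, squashed by tanh."""
    h = np.concatenate([a, b])                      # stacked children, shape (2d,)
    tensor_term = np.einsum("i,kij,j->k", h, V, h)  # tensor_term[k] = h^T V_k h
    return np.tanh(tensor_term + W @ h)

left, right = rng.standard_normal(d), rng.standard_normal(d)
parent = rntn_compose(left, right)                  # shape (d,), same space as the words
```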

  16. What is this model able to do? • Learns structures like “X but Y”

  17. What is this model able to do? • Small changes are able to propagate all the way up the tree

  18. What is this model able to do? • Learns how negation works, including many subtleties

  19. Negation Evaluation

  20. Sentiment Analysis Evaluation

  21. Positive and Negative N-grams
