Finding Structure in Time

Presentation Transcript


  1. Finding Structure in Time Jeffrey L. Elman Presented by: Kaushik Choudhary

  2. Outline • Introduction • The Problem with Time • Networks with Memory • Experiments with Exclusive-OR • Structure in Letter Sequences • Discovering the Notion “Word” • Simple Sentences • Conclusion

  3. Introduction • How might one represent temporal events in PDP models? • We utter words in sequence, not all at once! • This paper proposes representing time implicitly, by the “effect it has on processing”, rather than explicitly as an extra input dimension

  4. Outline • Introduction • The Problem with Time • Networks with Memory • Experiments with Exclusive-OR • Structure in Letter Sequences • Discovering the Notion “Word” • Simple Sentences • Conclusion

  5. The Problem with Time • Possible approach: represent time explicitly, with each temporal event filling a slot in a pattern vector • Problems with this approach: • It requires an interface to buffer the input, and there is no principled way to decide when the buffer should be examined • Buffers impose a limit on input length and demand that all inputs be of fixed size • The vectors 011100000 and 000111000 encode the same pattern shifted in time, yet they sit at distant locations in vector space, so PDP models miss their similarity.
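
To make the last point concrete, here is a minimal NumPy sketch: treated as static pattern vectors, the two shifted sequences barely overlap, even though they describe the same event at different times.

```python
import numpy as np

# The same three-bit pattern at two different temporal positions.
a = np.array([0, 1, 1, 1, 0, 0, 0, 0, 0])
b = np.array([0, 0, 0, 1, 1, 1, 0, 0, 0])

print(np.dot(a, b))           # overlap: 1 (out of 3 active bits)
print(np.linalg.norm(a - b))  # Euclidean distance: 2.0
```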

  6. Outline • Introduction • The Problem with Time • Networks with Memory • Experiments with Exclusive-OR • Structure in Letter Sequences • Discovering the Notion “Word” • Simple Sentences • Conclusion

  7. Networks with Memory • Jordan (1986) proposed a network with recurrent connections in which the output units feed back into the input layer through additional “state” units. • These state units give the network a memory of its previous outputs, which it can draw on when computing future outputs.

  8. Networks with Memory • In this paper, Elman proposes a similar network with additional units at the input layer. • These units are referred to as “context units”; at each time step they hold a copy of the hidden units’ activations from the previous step. • The input and context units together activate the hidden units, which in turn activate the output units and feed their new activations back to the context units.

  9. Networks with Memory [Figure: Elman’s proposed recurrent network. Input and context units feed the hidden units; the hidden units feed the output units and copy their activations back to the context units.]

  10. Networks with Memory • In the above architecture, the context units remember the prior internal state that produced a given output • The hidden units develop a mapping that encodes the temporal properties of the input • This lends the network temporal sensitivity. A minimal sketch of the forward pass appears below.
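
A minimal NumPy sketch of the forward pass of such a simple recurrent network (untrained; the layer sizes are borrowed from the letter-sequence experiment on slide 16, and training would proceed by ordinary backpropagation with the context treated as fixed input at each step):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_in, n_hidden, n_out = 6, 20, 6   # sizes from the letter-sequence experiment

W_ih = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input -> hidden
W_ch = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context -> hidden
W_ho = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output

def forward(sequence):
    """Run over a sequence of input vectors, carrying context forward."""
    context = np.zeros(n_hidden)            # context units start at rest
    outputs = []
    for x in sequence:
        hidden = sigmoid(W_ih @ x + W_ch @ context)
        outputs.append(sigmoid(W_ho @ hidden))
        context = hidden                    # one-to-one copy back to context
    return outputs
```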

  11. Outline • Introduction • The Problem with Time • Networks with Memory • Experiments with Exclusive-OR • Structure in Letter Sequences • Discovering the Notion “Word” • Simple Sentences • Conclusion

  12. Experiments with Exclusive-OR • Sample input: a stream of bits in which every third bit is the XOR of the two preceding bits, e.g. 1 0 1 0 0 0 0 1 1 … • Sample output: the same stream shifted left by one bit, since the task is prediction • The objective of the network is to predict the next bit at every step • Only every third bit is determined by the preceding bits, so the network can at best predict one bit in three accurately. A sketch of the data generation follows.
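
A minimal sketch of generating such a stream and framing it as a one-step prediction task (the stream length here is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)

def xor_stream(n_triples):
    """Two random bits followed by their XOR, repeated n_triples times."""
    bits = []
    for _ in range(n_triples):
        a, b = rng.integers(0, 2, size=2)
        bits.extend([a, b, a ^ b])
    return np.array(bits)

stream = xor_stream(1000)                    # one continuous bit sequence
inputs, targets = stream[:-1], stream[1:]    # predict the next bit
```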

  13. Experiments with Exclusive-OR

  14. Outline • Introduction • The Problem with Time • Networks with Memory • Experiments with Exclusive-OR • Structure in Letter Sequences • Discovering the Notion “Word” • Simple Sentences • Conclusion

  15. Structure in Letter Sequences • Sample input: the consonants b, d, and g combined in random order, with each consonant then replaced by a syllable: b -> ba, d -> dii, g -> guuu. • Each of the six letters was assigned a unique 6-bit vector.

  16. Structure in Letter Sequences • The objective of the network was to predict the next letter in the input sequence. • Network structure: 6 input units, 6 output units, 20 hidden units, and 20 context units. • The network was trained through 200 passes over the sequence diibaguuubadiidiiguuu… (a data-generation sketch follows).
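
A sketch of building such a training stream; the particular 6-bit codes below are stand-ins (the paper derived its codes from phonetic features such as consonant/vowel):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in 6-bit codes, one per letter; the paper's codes are phonetic.
CODES = {
    "b": [1, 0, 1, 0, 0, 1], "d": [1, 0, 1, 1, 0, 1], "g": [1, 0, 1, 0, 1, 1],
    "a": [0, 1, 0, 0, 1, 1], "i": [0, 1, 0, 1, 0, 1], "u": [0, 1, 1, 1, 0, 1],
}
SYLLABLE = {"b": "ba", "d": "dii", "g": "guuu"}

def letter_stream(n_consonants):
    """Draw consonants at random and expand each into its syllable."""
    text = "".join(SYLLABLE[c] for c in rng.choice(list("bdg"), n_consonants))
    return text, np.array([CODES[ch] for ch in text])

text, vectors = letter_stream(1000)
inputs, targets = vectors[:-1], vectors[1:]   # next-letter prediction pairs
```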

  17. Structure in Letter Sequences

  18. Outline • Introduction • The Problem with Time • Networks with Memory • Experiments with Exclusive-OR • Structure in Letter Sequences • Discovering the Notion “Word” • Simple Sentences • Conclusion

  19. Discovering the Notion “Word” • Input to the network: 200 sentences concatenated with no breaks between words (1,270 words, 4,963 letters) • Each letter was represented by a 5-bit vector • Network structure: 5 input units, 5 output units, 20 hidden units and 20 context units. • The objective of the network was to predict the next letter in the sequence.

  20. Discovering the Notion “Word”

  21. Discovering the Notion “Word” • Elman defends the ambiguity in the results by noting that the experiment only set out to show that word boundaries in the sequence are marked by differences in predictability • and that the recurrent network is able to extract this information! A sketch of the error analysis follows.
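
The analysis behind this claim can be sketched as follows: run the trained network over the letter stream and track the prediction error at each position. In Elman's plot the error tends to spike at the first letter of a word and fall as the word unfolds, so the error profile itself hints at word boundaries (the thresholding step below is a hypothetical simplification; the paper reads the plot by eye):

```python
import numpy as np

def rms_error_curve(outputs, targets):
    """Per-letter RMS error between predicted and actual letter vectors."""
    outputs, targets = np.asarray(outputs), np.asarray(targets)
    return np.sqrt(((outputs - targets) ** 2).mean(axis=1))

# Usage sketch: 'outputs' would come from the trained network's forward
# pass over the stream; 'targets' is the stream shifted by one letter.
# errors = rms_error_curve(outputs, targets)
# boundary_guesses = np.where(errors > errors.mean())[0]  # crude cut-off
```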

  22. Outline • Introduction • The Problem with Time • Networks with Memory • Experiments with Exclusive-OR • Structure in Letter Sequences • Discovering the Notion “Word” • Simple Sentences • Conclusion

  23. Simple Sentences • 10,000 random sentences were generated. • Each word was assigned a 31-bit vector in which a single bit, unique to that word, was turned on. • Sentences were concatenated with no breaks between them, yielding a stream of 27,534 words. • The network experienced six passes over this stream (a generation sketch follows).
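
A toy sketch of producing such a stream; the mini-lexicon and sentence frames below are stand-ins (the paper's generator used a larger lexicon and a set of category-based templates):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in lexicon and frames in the spirit of the paper's generator.
LEXICON = ["woman", "man", "cat", "dog", "eat", "see", "chase",
           "cookie", "sandwich", "book", "rock"]
TEMPLATES = [("woman", "eat", "cookie"), ("cat", "chase", "dog"),
             ("man", "see", "book"), ("dog", "eat", "sandwich")]
INDEX = {w: i for i, w in enumerate(LEXICON)}

def one_hot(word, size=31):
    v = np.zeros(size)
    v[INDEX[word]] = 1.0               # one bit per word, as on slide 23
    return v

choices = rng.integers(0, len(TEMPLATES), size=100)
stream = [one_hot(w) for t in choices for w in TEMPLATES[t]]
inputs, targets = stream[:-1], stream[1:]   # next-word prediction pairs
```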

  24. Simple Sentences • The objective of the network was to predict the next word.

  25. Simple Sentences • The RMS error computed against the actual next word was about 0.88. • The RMS error computed against the empirical probability of occurrence of each possible next word was about 0.053. • Impressive! The toy sketch below illustrates the two measures.
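
The next word is not deterministic, so error against the literal next word stays high even for a network that has learned the statistics perfectly; error against the likelihood vector is the fairer test. A toy illustration (the probabilities and word indices here are made up):

```python
import numpy as np

def rms(pred, target):
    return np.sqrt(((pred - target) ** 2).mean())

# Made-up example: three words are possible in this context, and the
# network's output spreads its activation across exactly those words.
likelihood = np.zeros(31)
likelihood[[4, 9, 17]] = [0.5, 0.3, 0.2]   # empirical next-word probabilities
output = likelihood.copy()                  # a statistically perfect output

actual_next = np.zeros(31)
actual_next[9] = 1.0                        # the word that actually occurred

print(rms(output, actual_next))   # stays high: the next word is not fixed
print(rms(output, likelihood))    # 0.0: the output matches the statistics
```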

  26. Simple Sentences

  27. Outline • Introduction • The Problem with Time • Networks with Memory • Experiments with Exclusive-OR • Structure in Letter Sequences • Discovering the Notion “Word” • Simple Sentences • Conclusion

  28. Conclusion • Problems change their nature when they are defined in terms of temporal events. • The RMS error, tracked over time, can itself be used to probe temporal structure. • More sequential dependencies do not necessarily translate to worse performance. • Representations of time, and hence of memory, depend on the task at hand. • Representations of time may be highly structured.

  29. Outline • Introduction • The Problem with Time • Networks with Memory • Experiments with Exclusive-OR • Structure in Letter Sequences • Discovering the Notion “Word” • Simple Sentences • Conclusion

  30. Thank you!
