
A computational model of the reviewing of object-files


Presentation Transcript


  1. A computational model of the reviewing of object-files Michael Liddle Alistair Knott Anthony Robins

  2. Introduction • Selective visual attention. • Object-files and the object-specific advantage (OSA). • Computational modeling of cortical vision. • A neural network model of object-file reviewing (first of its kind?)

  3. Selective visual attention

  4. Managing limited resources • Retinal image contains an enormous amount of information. • Processing complexity subject to combinatorial explosion. • Solution: only bother processing information about one object at a time.

  5. Explaining the solution • What actually happens in the brain when we “attend” to an object? • Experiments indicate that attention is the means by which feature “conjunction” and “binding” occur (Treisman & Gelade, 1980). • What is the medium of this binding? Object-files!

  6. Object-files and the object-specific advantage

  7. Object-files • Kahneman & Treisman (1984) • Provide stable repositories for visual information about four or five objects. • Maintain identity and continuity of objects during a perceptual episode. • Analogy: police files for investigations.

  8. Object-files • When attending to an object for the first time: “open” an object-file. • When reattending to an object: “review” the information in its object-file. • Reviewing involves reconciling old information with new.

  9. The object-specific advantage • Evidence for an object-specific type of priming (Kahneman, Treisman, & Gibbs, 1992), linked to object-file reviewing. • Facilitation for perceptually coherent objects, greater than general priming. • Suggestion is that previous perception of an object allows stored information to speed recognition.

  10. Example: Preview (diagram: preview display containing the letters V and Q)

  11. Example: Linking

  12. Example: SO condition (diagram: target letter V)

  13. Example: DO condition (diagram: target letter Q)

  14. Example: NM condition (diagram: target letter S)

  15. Recognition times (chart comparing the NM, DO, and SO conditions)

  16. Computational models of cortical vision

  17. Providing a foundation • Object-files must exist at a relatively high level of visual perception. • Important to consider both current thinking about the neurology of visual attention and existing computational models.

  18. Models of object detection and recognition • Models of detection: • Retinotopic maps of salient regions (saliency maps). • Guide attentional processes. • Models of recognition: • Hierarchical structures (increasing selectivity/receptive field size) • Output encoding of feature conjunctions.
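
As a rough illustration of how a retinotopic saliency map might guide attentional processes, a minimal sketch follows; the map values, the neighbourhood radius, and the inhibition-of-return step are illustrative assumptions, not details taken from the talk.

```python
import numpy as np

def select_next_location(saliency_map, inhibited, radius=2):
    """Pick the most salient uninhibited location, then suppress its
    neighbourhood (a crude inhibition of return) so attention can move on."""
    masked = np.where(inhibited, -np.inf, saliency_map)
    y, x = np.unravel_index(np.argmax(masked), masked.shape)
    ys = slice(max(0, y - radius), y + radius + 1)
    xs = slice(max(0, x - radius), x + radius + 1)
    inhibited[ys, xs] = True
    return (int(y), int(x)), inhibited

# Toy retinotopic map with two salient blobs.
saliency = np.zeros((10, 10))
saliency[2, 3] = 1.0
saliency[7, 6] = 0.8
inhibited = np.zeros_like(saliency, dtype=bool)
loc1, inhibited = select_next_location(saliency, inhibited)   # (2, 3) first
loc2, inhibited = select_next_location(saliency, inhibited)   # then (7, 6)
```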

  19. A neural network model of object-file reviewing

  20. Neural network modeling • Connect a collection of simple “neuron-like” units via weighted “synapse-like” connections. • Basic neuron sums its inputs and applies an “activation function” to determine its output. • Output is interpreted as a firing rate.
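
For concreteness, a single rate-coded unit of the kind described here could be sketched as follows; the logistic activation function is an illustrative choice, not necessarily the one used in the model.

```python
import numpy as np

def rate_neuron(inputs, weights, bias=0.0):
    """Basic unit: weighted sum of inputs passed through a logistic
    activation function; the output is read as a firing rate in (0, 1)."""
    net = np.dot(weights, inputs) + bias
    return 1.0 / (1.0 + np.exp(-net))

print(rate_neuron(np.array([0.2, 0.9]), np.array([1.5, -0.5])))   # ~0.46
```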

  21. Modeling the object-specific advantage • Need a recognition procedure that can be subject to facilitation (i.e. involves a time course). • Need to store “bottom-up stimulus” information in an object-specific way. • Need to provide “top-down expectation” based on stored information for currently attended object.

  22. Modeling the OSA • Correct expectation should lead to facilitation. • Incorrect expectation should not destroy general priming.

  23. Modeling facilitation • Use type-based classification: when the type is known, recognition is complete. • Enforce a single winning type via lateral competition. • The winner is called the “stimulus type”. • Enhance the time factor by using “cascaded activation” neurons.
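
A minimal sketch of such a recognition procedure is given below, assuming a leaky cascaded-activation update and a simple subtractive form of lateral competition; the rates, threshold, and four-letter type set are illustrative parameters rather than those of the actual model.

```python
import numpy as np

def run_type_layer(bottom_up, top_down=None, rate=0.1, inhibition=0.4,
                   threshold=0.9, max_steps=200):
    """Cascaded-activation type layer with lateral competition (a sketch).
    Each unit gradually integrates its net input while inhibiting the others;
    recognition time is the number of steps until one unit crosses threshold."""
    n = len(bottom_up)
    a = np.zeros(n)
    expectation = np.zeros(n) if top_down is None else top_down
    for step in range(1, max_steps + 1):
        net = bottom_up + expectation - inhibition * (a.sum() - a)
        a = np.clip(a + rate * (net - a), 0.0, 1.0)   # cascaded (gradual) update
        if a.max() >= threshold:
            return int(a.argmax()), step              # winning "stimulus type", time
    return int(a.argmax()), max_steps

# Type units: 0=V, 1=Q, 2=S, 3=J.  Bottom-up evidence for V.
evidence = np.array([1.0, 0.1, 0.1, 0.1])
print(run_type_layer(evidence))                                      # no expectation
print(run_type_layer(evidence, top_down=np.array([0.5, 0, 0, 0])))   # correct expectation: fewer steps
```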

  24-27. Modeling facilitation (animated diagram over four slides: the input letter V passes through a hierarchical feature encoder into a type layer with units V, Q, S, J; activation of the V unit builds across the frames until it is recognised)

  28. Storing stimulus: object specificity • FINSTs (Fingers of INSTantiation) identify “proto-objects” in the scene (Pylyshyn, 1989) • Track their proto-objects as they move and change size/shape. • Set of four or five FINSTs constantly assigned/reassigned from saliency map • Provide candidates for attention.
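
A toy sketch of keeping four FINSTs bound to the most salient proto-objects is given below; the data structures and the salience-ranking rule are assumptions made purely for illustration.

```python
MAX_FINSTS = 4   # four or five, following Pylyshyn (1989)

def assign_finsts(proto_objects, finsts):
    """Keep FINSTs bound to the most salient proto-objects (a sketch).
    proto_objects: list of (object_id, salience); finsts: dict finst_index -> object_id."""
    ranked = sorted(proto_objects, key=lambda p: p[1], reverse=True)[:MAX_FINSTS]
    wanted = {obj_id for obj_id, _ in ranked}
    # Release FINSTs whose proto-objects are no longer among the most salient.
    finsts = {i: obj for i, obj in finsts.items() if obj in wanted}
    # Assign free FINSTs to newly salient proto-objects.
    free = [i for i in range(MAX_FINSTS) if i not in finsts]
    for obj_id, _ in ranked:
        if obj_id not in finsts.values() and free:
            finsts[free.pop(0)] = obj_id
    return finsts

objects = [("a", 0.9), ("b", 0.8), ("c", 0.6), ("d", 0.4), ("e", 0.2)]
print(assign_finsts(objects, {}))   # "e", the least salient, gets no FINST
```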

  29. Storing stimulus: object specificity • Associate a neuron with each FINST. • Selecting a FINST for attention activates its neuron. • Associate stimulus type with current FINST. • Thus a level of indirection is introduced between retinal location and mental representation.
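
This indirection can be pictured as a small store with one slot per FINST: attending to a FINST retrieves whatever type was previously associated with it, and makes that FINST the target for storing the next recognised type. The class below is a hypothetical illustration, not part of the model's actual architecture.

```python
class ObjectFileStore:
    """One slot per FINST (a sketch of the FINST/type indirection)."""

    def __init__(self, n_finsts=4):
        self.slots = [None] * n_finsts   # stored type per FINST
        self.attended = None             # index of the currently attended FINST

    def attend(self, finst_index):
        self.attended = finst_index
        return self.slots[finst_index]   # stored type, used as the expectation

    def store(self, stimulus_type):
        self.slots[self.attended] = stimulus_type

store = ObjectFileStore()
store.attend(0)
store.store("V")               # preview: open an object-file for FINST 0
expectation = store.attend(0)  # reattend: reviewing retrieves "V"
```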

  30-32. Storing stimulus: object specificity (diagrams: the type layer V, Q, S, J is linked to the attended stimuli V and Q by some association “stuff”; slide 32 asks what this association stuff actually is)

  33. Association stuff (diagram: the “object-file” as feedforward and feedback connections between the type layer V, Q, S, J and a FINST unit; excitatory connections shown only)

  34-35. Storing stimulus: feedback “stuff” (animated diagram: the feedback connections store the recognised type in the object-file of the attended FINST; excitatory connections shown only)

  36-37. Providing expectations: feedforward “stuff” (animated diagram: the feedforward connections carry the stored type from the object-file back to the type layer as a top-down expectation; excitatory connections shown only)
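
Read as connection weights, the object-file in these diagrams could be sketched as two small matrices between the FINST units and the type layer: feedback weights that store the winning type against the attended FINST, and feedforward weights that later replay it as a top-down expectation. The shapes, the one-shot storage rule, and the weight value are illustrative assumptions.

```python
import numpy as np

N_TYPES, N_FINSTS = 4, 4      # type units: V, Q, S, J

# "Object-file" as modifiable connections between FINST units and the type layer.
feedback = np.zeros((N_FINSTS, N_TYPES))      # type layer -> FINST: stores the stimulus
feedforward = np.zeros((N_TYPES, N_FINSTS))   # FINST -> type layer: top-down expectation

def store_stimulus(finst, winning_type, strength=0.5):
    """Storing: strengthen the connections between the attended FINST and the winning type."""
    feedback[finst, winning_type] = strength
    feedforward[winning_type, finst] = strength

def expectation_from(finst_activity):
    """Reviewing: the attended FINST's activity feeds forward to the type layer."""
    return feedforward @ finst_activity
```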

  38. Storing stimulus: opening an object-file

  39-45. Storing stimulus (animated diagram over seven slides: the previewed letters V and Q drive the type layer until each is recognised; the recognised types are then stored, via the feedback connections, in the object-files of their FINSTs; excitatory connections shown only)
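
In terms of the hypothetical helpers sketched earlier, the preview/storage sequence in these frames amounts to something like the following.

```python
# Preview: bottom-up evidence for "V" drives the type layer until it wins,
# and the winning type is stored in the object-file of the attended FINST.
# (run_type_layer and store_stimulus are the illustrative helpers defined above.)
finst = 0                                                   # FINST assigned to the previewed box
winner, t_preview = run_type_layer(np.array([1.0, 0.1, 0.1, 0.1]))
store_stimulus(finst, winner)
```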

  46. Providing correct expectation: the SO condition

  47-50. Providing correct expectation (SO) (animated diagram over four slides: reattending to the previewed object activates its FINST; the stored type V feeds forward as a correct top-down expectation, and the type layer reaches recognition sooner; excitatory connections shown only)
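
Continuing the same sketch, reviewing in the SO condition retrieves a correct expectation, so the type layer should reach threshold in fewer steps than it did at preview.

```python
# Reattending the previewed object activates its FINST; the stored type
# feeds forward as a correct top-down expectation (illustrative helpers from above).
finst_activity = np.zeros(N_FINSTS)
finst_activity[finst] = 1.0
winner, t_so = run_type_layer(np.array([1.0, 0.1, 0.1, 0.1]),
                              top_down=expectation_from(finst_activity))
# Expect t_so < t_preview: the correct expectation speeds recognition (the OSA).
```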
