
Dynamic Neural Networks



Presentation Transcript


  1. Dynamic Neural Networks Joseph E. Gonzalez, Co-director of the RISE Lab, jegonzal@cs.berkeley.edu

  2. What is the Problem Being Solved?
  • Neural network computation is increasing rapidly
  • Larger networks are needed for peak accuracy
  • Big ideas:
    • Adaptively scale computation for a given task
    • Select only the parts of the network needed for a given input
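One common way to adaptively scale computation, illustrated here as a hedged sketch (not code from the lecture), is an early-exit network: intermediate classifiers let easy inputs stop after a few layers while hard inputs use the full depth. All shapes, weights, and the confidence threshold below are invented for the example.

```python
import numpy as np

# Toy early-exit network: 4 layers, each with its own exit head.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((8, 8)) * 0.1 for _ in range(4)]
exit_heads = [rng.standard_normal((8, 3)) * 0.1 for _ in range(4)]

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def early_exit_forward(x, threshold=0.6):
    """Return (class probs, layers actually used).

    After each layer, an exit head produces class probabilities; if the
    most likely class clears `threshold`, the remaining (more expensive)
    layers are never executed.
    """
    h = x
    p = None
    for i, (w, head) in enumerate(zip(layers, exit_heads)):
        h = np.tanh(h @ w)
        p = softmax(h @ head)
        if p.max() >= threshold:
            return p, i + 1  # confident: exit early
    return p, len(layers)    # fell through: full-depth compute
```

The amount of computation now depends on the input, which is exactly what makes these networks "dynamic".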

  3. Early Work: Prediction Cascades
  • Viola-Jones Object Detection Framework (2001): "Rapid Object Detection using a Boosted Cascade of Simple Features" CVPR'01
  • Face detection on 384x288 video at 15 fps (700 MHz Pentium III)
  • Most parts of the image don't contain a face. Reject those regions quickly.
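The cascade idea can be sketched in a few lines: a sequence of increasingly expensive classifiers, where each stage may reject an input early so later stages are never run. The stage functions and thresholds here are toy stand-ins, not the actual Viola-Jones Haar-feature stages.

```python
def cascade_predict(x, stages):
    """Run `x` through (classifier, threshold) stages; reject early.

    Each classifier returns a confidence score; if the score falls below
    the stage's threshold, the input is rejected immediately and the
    more expensive downstream stages are skipped entirely.
    """
    for classifier, threshold in stages:
        if classifier(x) < threshold:
            return False  # early reject: cheap stages filter most inputs
    return True  # survived every stage: accept

# Toy stages: a cheap check first, a more detailed check second.
stages = [
    (lambda x: x["brightness"], 0.2),  # cheap filter
    (lambda x: x["edge_score"], 0.5),  # more expensive filter
]

print(cascade_predict({"brightness": 0.1, "edge_score": 0.9}, stages))  # False
print(cascade_predict({"brightness": 0.8, "edge_score": 0.9}, stages))  # True
```

Because most image regions fail the cheap first stage, average cost per region stays far below the cost of the full classifier.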

  4. Dynamic Networks for fast and accurate inference
  • IDK Cascades: using the fastest model possible [UAI'18]
  • SkipNet: dynamic execution within a model [ECCV'18]
  [Figure: SkipNet pipeline. A query flows through a sequence of conv blocks; gates between blocks decide which conv blocks are skipped before the final FC layer produces the prediction.]
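A minimal sketch of SkipNet-style gating, with shapes, weights, and the gate rule invented for illustration: a small gate inspects the block's input and decides whether to execute the residual block or pass the features through unchanged.

```python
import numpy as np

rng = np.random.default_rng(1)
W_block = rng.standard_normal((8, 8)) * 0.1  # "expensive" block weights
w_gate = rng.standard_normal(8)              # tiny gate (much cheaper)

def gated_block(h, gate_threshold=0.5):
    """Apply the residual block only if the gate fires.

    Returns (output features, whether the block actually ran).
    """
    gate = 1.0 / (1.0 + np.exp(-w_gate @ h))  # scalar gate in (0, 1)
    if gate < gate_threshold:
        return h, False  # skip: identity shortcut, no block compute
    return h + np.tanh(h @ W_block), True  # execute the block
```

The residual (identity-shortcut) structure is what makes skipping safe: the skipped block degenerates to the identity rather than destroying the features. In the real SkipNet, the gates are trained with a hybrid of supervised learning and reinforcement learning, which this sketch omits.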

  5. Task Aware Feature Embeddings [CVPR'19]
  • A task-aware meta-learner generates parameters for the FC layers of the feature network, conditioned on an embedding of the task
  • More accurate and efficient than existing dynamic pruning networks
  [Figure: a feature network and an embedding network feed a task-aware meta-learner, which emits parameters for each FC layer; example input "Baby".]

  6. Dynamic Networks: Task Aware Feature Embeddings [CVPR'19]
  • A task description (e.g., "Smiling Baby") conditions the meta-learner, which emits parameters for each FC layer
  • 4-15% improvement on attribute-object tasks
  [Figure: same architecture as the previous slide, with the task description "Smiling Baby" as input.]
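To make the task-conditioning idea concrete, here is a heavily hedged sketch in which a meta-learner maps a task embedding to per-feature scale and shift parameters that modulate the feature network (a FiLM-style modulation; the actual parameterization in the CVPR'19 paper may differ). All weights and dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
D = 8  # feature dimension (assumed)
T = 4  # task-embedding dimension (assumed)
W_scale = rng.standard_normal((T, D)) * 0.1  # meta-learner weights
W_shift = rng.standard_normal((T, D)) * 0.1

def task_modulated_features(features, task_emb):
    """Modulate features with task-conditioned scale and shift.

    The same feature network serves every task; only the cheap,
    task-dependent scale/shift changes per query.
    """
    scale = 1.0 + task_emb @ W_scale  # per-feature scale near 1
    shift = task_emb @ W_shift        # per-feature shift
    return scale * features + shift
```

The appeal for efficiency is that switching tasks costs two small matrix-vector products rather than running a separate network per task.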

  7. Neural Module Networks Jacob Andreas et al., "Deep Compositional Question Answering with Neural Module Networks"
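The core idea of neural module networks is that small reusable modules are composed into a query-specific network. This toy sketch hard-codes the layout and uses hand-written stand-ins for what are, in the paper, learned neural modules whose layout is predicted from the question.

```python
def find(color, scene):
    """Attend to objects of a given color (stand-in for a learned module)."""
    return [obj for obj in scene if obj["color"] == color]

def count(attended, scene):
    """Count attended objects (stand-in for a learned module)."""
    return len(attended)

def run_layout(layout, scene):
    """Execute a module layout bottom-up; layout = (module, *args).

    Tuple arguments are sub-layouts and are evaluated recursively, so
    each question yields a different computation graph.
    """
    module, *args = layout
    evaluated = [run_layout(a, scene) if isinstance(a, tuple) else a
                 for a in args]
    return module(*evaluated, scene)

scene = [{"color": "red"}, {"color": "blue"}, {"color": "red"}]
# "How many red objects?" -> count(find(red))
print(run_layout((count, (find, "red")), scene))  # 2
```

The network architecture itself changes with the query, which is why these count as dynamic networks.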

  8. Trends Today
  • Multi-task learning to solve many problems
  • Zero-shot learning
  • Adjusting network architecture for a given query:
    • Neural Module Networks
    • Capsule networks
    • Language models ... more on this in future lectures
  • Why are these dynamic? How does computation change with the input?

  9. Dynamic Networks → Systems Issues
  • They reduce computation, but do they reduce runtime?
  • Limitations in existing evaluations?
  • Implications for hardware execution?
  • Challenges in expressing dynamic computation graphs ...
  • Likely to be the future of network design? Modularity ...
