Multi-Layer Perceptron On A GPU


Presentation Transcript


  1. Multi-Layer Perceptron On A GPU Scott Finley ECE 539 Fall 2008 UW-Madison

  2. General Purpose GPU • Modern GPUs have hundreds of “stream processors” • Can now be used for non-graphics computing • nVidia CUDA (used for this project) • OpenCL
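To illustrate the kind of data-parallel work those stream processors handle, here is a minimal CUDA sketch (not from the slides; kernel and function names are illustrative) that applies a sigmoid activation to every element of an array, one thread per element:

```cuda
#include <cuda_runtime.h>
#include <math.h>

// Each thread computes one output element; the GPU schedules
// thousands of such threads across its stream processors at once.
__global__ void sigmoidKernel(const float *in, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = 1.0f / (1.0f + expf(-in[i]));
}

// Host-side launch: 256 threads per block, enough blocks to cover n.
void sigmoid(const float *d_in, float *d_out, int n)
{
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    sigmoidKernel<<<blocks, threads>>>(d_in, d_out, n);
}
```

The same element-wise pattern covers most of the non-BLAS work in an MLP (activations and their derivatives), which is why a CUDA kernel layer on top of cuBLAS pays off.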

  3. Three MLP Implementations • CPU-only: Basic Linear Algebra Subprograms (BLAS) • nVidia’s cuBLAS library: no explicit GPU use, the library uses the GPU “under the hood”; requires lots of copies of data from CPU to GPU • cuBLAS with CUDA: same cuBLAS use as above, with the non-BLAS operations done in CUDA
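The heart of the cuBLAS approach is that each layer's pre-activation is a single matrix multiply. A hedged sketch using the modern cuBLAS v2 API (the 2008 project would have used the original API, and the function name here is an assumption, not the author's code):

```cuda
#include <cublas_v2.h>

// One MLP layer's pre-activation: Z = W * X (bias omitted for brevity).
// W is (neurons x inputs), X is (inputs x batch), Z is (neurons x batch);
// all are column-major device pointers, as cuBLAS expects.
void layerForward(cublasHandle_t handle,
                  const float *d_W, const float *d_X, float *d_Z,
                  int neurons, int inputs, int batch)
{
    const float alpha = 1.0f, beta = 0.0f;
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                neurons, batch, inputs,
                &alpha, d_W, neurons,
                        d_X, inputs,
                &beta,  d_Z, neurons);
}
```

Because the GEMM runs on the GPU but the surrounding logic runs on the CPU, a pure-cuBLAS implementation must shuttle activations back and forth across the PCIe bus each layer; doing the in-between steps in CUDA keeps the data resident on the GPU.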

  4. Classifying Forestry Data • Data from the US Forest Service • Large feature vectors: 54 features • Large number of training samples: 500 per epoch • Two hidden layers • Number of neurons per layer varied
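The network shape described above can be sketched as a chain of GEMMs and activation kernels. This is an illustrative outline only: the hidden width and class count are hypothetical (the slides say the neuron count was varied), and `layerForward`/`sigmoid` are assumed helpers wrapping a cuBLAS matrix multiply and an element-wise activation kernel.

```cuda
#include <cublas_v2.h>

// Assumed helpers: a cuBLAS wrapper computing Z = W * X, and an
// element-wise sigmoid kernel launcher.
void layerForward(cublasHandle_t h, const float *W, const float *X,
                  float *Z, int neurons, int inputs, int batch);
void sigmoid(const float *in, float *out, int n);

// 54 input features per the slide; HIDDEN and CLASSES are illustrative.
enum { INPUTS = 54, HIDDEN = 64, CLASSES = 7, BATCH = 500 };

typedef struct {          // all pointers are device (GPU) memory
    float *X;             // INPUTS  x BATCH input batch
    float *W1, *H1;       // hidden layer 1 weights and activations
    float *W2, *H2;       // hidden layer 2 weights and activations
    float *W3, *Y;        // output layer weights and outputs
} Network;

// One forward pass over a 500-sample epoch: two hidden layers with
// sigmoid activations, then a linear output layer.
void forwardPass(cublasHandle_t handle, Network *net)
{
    layerForward(handle, net->W1, net->X,  net->H1, HIDDEN,  INPUTS, BATCH);
    sigmoid(net->H1, net->H1, HIDDEN * BATCH);
    layerForward(handle, net->W2, net->H1, net->H2, HIDDEN,  HIDDEN, BATCH);
    sigmoid(net->H2, net->H2, HIDDEN * BATCH);
    layerForward(handle, net->W3, net->H2, net->Y,  CLASSES, HIDDEN, BATCH);
}
```

With 54 features and 500 samples per epoch, each layer's GEMM is large enough for the GPU's parallelism to outweigh kernel-launch and data-transfer overhead, which matches the conclusion that speedups grow with problem size.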

  5. Small, Contrived Data Set

  6. Cross-Platform GUI

  7. Conclusion • The GPU is a very powerful parallel processor • Up to two orders of magnitude improvement possible • Much more effective for large computations • Many improvements possible • A CUDA-only version is needed
