
A critical review of RNN for sequence learning. Zachary C. Lipton, zlipton@cs.ucsd.edu






Presentation Transcript


  1. A critical review of RNN for sequence learning. Zachary C. Lipton, zlipton@cs.ucsd.edu

  2. Time series • Definition: A time series is a series of data points indexed (or listed or graphed) in time order. It is a sequence of discrete-time data. • Feature: the data points are discrete samples of a continuous real-world process • Examples: still images that comprise the frames of videos, clinical medical data, natural language
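The "discrete samples of a continuous process" point can be made concrete with a short sketch (the function and names here are illustrative, not from the slides): a continuous signal observed only at evenly spaced time steps becomes a time series.

```python
import numpy as np

# Illustrative sketch: sample a continuous process (here a sine wave)
# at discrete, evenly spaced time steps to obtain a time series.
def sample_series(f, t_start, t_end, n_points):
    """Return (timestamps, values) for f observed at n_points discrete times."""
    t = np.linspace(t_start, t_end, n_points)
    return t, f(t)

t, x = sample_series(np.sin, 0.0, 2 * np.pi, 100)  # 100-step discrete series
```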

  3.-4. Neural Networks • Activation function: adds the non-linear elements to the network (figure slides; only the title text survives in the transcript)
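The role of the activation function can be shown in a few lines (a minimal sketch with assumed names): a dense layer on its own is linear, so sums of inputs map to sums of outputs; wrapping it in tanh breaks that linearity, which is what lets stacked layers model non-linear functions.

```python
import numpy as np

# Sketch: a dense layer alone is a linear map; applying tanh makes it non-linear.
def dense(x, W, b):
    return W @ x + b

def dense_tanh(x, W, b):
    return np.tanh(dense(x, W, b))

W = np.array([[2.0]])
b = np.array([0.0])
x1, x2 = np.array([1.0]), np.array([2.0])

# Additivity holds for the raw layer (b = 0 here) ...
lin_sum = dense(x1 + x2, W, b)
# ... but fails once the non-linear activation is applied.
act_sum = dense_tanh(x1 + x2, W, b)
```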

  5. Neural Networks • Training process: backpropagation algorithm • Gradient descent + chain rule • E.g.: partial derivatives of e = (a + b) * (b + 1) with respect to a and b
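The slide's example works out directly by the chain rule. Writing c = a + b and d = b + 1, so e = c * d, the partial derivatives chain through the intermediate nodes exactly as backpropagation would compute them:

```python
# Worked chain-rule example for e = (a + b) * (b + 1), as on the slide.
# Intermediate nodes: c = a + b, d = b + 1, e = c * d.
def grads(a, b):
    c, d = a + b, b + 1
    e = c * d
    # Product rule at the top node: de/dc = d, de/dd = c.
    de_da = d * 1          # de/dc * dc/da
    de_db = d * 1 + c * 1  # de/dc * dc/db + de/dd * dd/db  (b feeds both c and d)
    return e, de_da, de_db

e, de_da, de_db = grads(a=2.0, b=1.0)  # e = 6, de/da = 2, de/db = 5
```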

  6.-10. Neural Networks • Training process: backpropagation algorithm (five figure slides stepping through the computation; only the title text survives in the transcript)
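The training process those slides walk through can be sketched end to end for a tiny network (a minimal sketch with assumed scalar weights, not the slides' own example): a forward pass, a backward pass applying the chain rule, and a gradient-descent update.

```python
import numpy as np

# Minimal sketch of one backpropagation + gradient-descent step for
# y_hat = w2 * tanh(w1 * x) with squared-error loss. Scalar weights for clarity.
def backprop_step(x, y, w1, w2, lr=0.1):
    # Forward pass
    h = np.tanh(w1 * x)
    y_hat = w2 * h
    loss = 0.5 * (y_hat - y) ** 2
    # Backward pass (chain rule, outermost node first)
    dloss_dyhat = y_hat - y
    dloss_dw2 = dloss_dyhat * h
    dloss_dh = dloss_dyhat * w2
    dloss_dw1 = dloss_dh * (1 - h ** 2) * x  # tanh'(z) = 1 - tanh(z)^2
    # Gradient-descent update
    return w1 - lr * dloss_dw1, w2 - lr * dloss_dw2, loss

w1, w2 = 0.5, 0.5
losses = []
for _ in range(200):
    w1, w2, loss = backprop_step(x=1.0, y=0.8, w1=w1, w2=w2)
    losses.append(loss)
```

Repeating the step drives the loss down, which is the whole point of the procedure.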

  11. What is an RNN? • A feedforward neural network with the inclusion of edges that span adjacent time steps. • The input at every time step combines the current time step's input with the hidden output from the previous time step.
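A vanilla RNN step makes those two bullets concrete (a minimal sketch; the weight names W_xh, W_hh are assumed notation): the recurrent edge carries the previous hidden state h_{t-1} into the computation of h_t alongside the current input x_t.

```python
import numpy as np

# Sketch of a vanilla RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h).
def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

def rnn_forward(xs, W_xh, W_hh, b_h):
    h = np.zeros(W_hh.shape[0])  # initial hidden state
    hs = []
    for x_t in xs:               # the recurrent edge spans adjacent time steps
        h = rnn_step(x_t, h, W_xh, W_hh, b_h)
        hs.append(h)
    return hs

rng = np.random.default_rng(0)
W_xh = rng.normal(size=(3, 2))   # input size 2, hidden size 3
W_hh = rng.normal(size=(3, 3))
b_h = np.zeros(3)
hs = rnn_forward([rng.normal(size=2) for _ in range(5)], W_xh, W_hh, b_h)
```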

  12. What is an RNN? • Training method: backpropagation (through time) with gradient descent. • Limitation: vanishing gradients.

  13. Vanishing gradient • Loss function • Partial derivative of the output • Partial derivative of the (t-1) layer • Partial derivative of the (t-q) layer • Relationship between the gradients at the (t-q) and t layers
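The slide's bullets name the quantities but the equations themselves did not survive the transcript. A standard reconstruction of the final relationship, under the assumed notation $h_t = f(W_{hh} h_{t-1} + W_{xh} x_t + b)$ with pre-activation $z_t$ and loss $\mathcal{L}$, is:

```latex
\frac{\partial \mathcal{L}}{\partial h_{t-q}}
  = \frac{\partial \mathcal{L}}{\partial h_t}
    \prod_{k=t-q+1}^{t} \frac{\partial h_k}{\partial h_{k-1}},
\qquad
\frac{\partial h_k}{\partial h_{k-1}} = \operatorname{diag}\!\big(f'(z_k)\big)\, W_{hh}
```

Since the gradient at layer $t-q$ is a product of $q$ Jacobians, its norm is bounded by $\big(\lVert W_{hh}\rVert \,\max_z |f'(z)|\big)^{q}$: when this factor is below 1 the gradient shrinks exponentially in $q$ (vanishes), and when it exceeds 1 the gradient can grow exponentially (explodes).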

  14. LSTM (long short-term memory) • Designed to solve the vanishing gradient problem
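A sketch of one LSTM step shows the mechanism (standard gate equations; the parameter layout here is an assumption for compactness): the cell state c_t is updated *additively* through the forget and input gates, rather than being repeatedly squashed through a non-linearity, which is what lets gradients survive across many time steps.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Sketch of one LSTM step. W, U, b stack the parameters of the four
# gates (input i, forget f, output o, candidate g) row-wise.
def lstm_step(x_t, h_prev, c_prev, W, U, b):
    z = W @ x_t + U @ h_prev + b
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c_t = f * c_prev + i * g       # gated, additive memory update
    h_t = o * np.tanh(c_t)         # exposed hidden state
    return h_t, c_t

rng = np.random.default_rng(1)
n_in, n_hid = 2, 3
W = rng.normal(size=(4 * n_hid, n_in))
U = rng.normal(size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(rng.normal(size=n_in), h, c, W, U, b)
```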

  15. RNNs for Outlier Detection • A classification problem • Train the RNN weights to minimise the reconstruction error on normal data. • Since the RNN attempts to reproduce its input patterns at the output, outliers are reproduced less well by the trained RNN and therefore have a higher reconstruction error.
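The scoring side of this scheme can be sketched without the network itself (a stand-in model is assumed here: reconstructing every point as the per-feature mean of the normal data; a real setup would use the trained RNN's output as the reconstruction). Points the model was fit to reconstruct get low error; points unlike the training data get high error and are flagged as outliers.

```python
import numpy as np

# Sketch of reconstruction-error outlier scoring. The "model" is a
# deliberately simple stand-in (per-feature mean of the normal data);
# in the slide's scheme the reconstruction would come from the trained RNN.
def fit_mean_reconstructor(normal_data):
    mu = normal_data.mean(axis=0)
    return lambda x: np.broadcast_to(mu, x.shape)

def reconstruction_error(model, x):
    # Mean squared error between each point and its reconstruction.
    return np.mean((x - model(x)) ** 2, axis=-1)

rng = np.random.default_rng(2)
normal = rng.normal(0.0, 0.1, size=(100, 4))   # data the model is fit on
outlier = np.full((1, 4), 5.0)                 # a point far from the normal data

model = fit_mean_reconstructor(normal)
normal_scores = reconstruction_error(model, normal)
outlier_score = reconstruction_error(model, outlier)[0]
```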

  16. Conclusion • An RNN can remember previous inputs. • When a problem involves continuous data and depends on prior context, it can show strong capability. • An RNN is a data-inference method that can learn the probability distribution function mapping x(t) to y(t), i.e., finding the relationship between two time series.
