8 Matching Annotations
  1. Jul 2023
  2. Nov 2021
    1. To review, the forget gate decides what is relevant to keep from prior steps. The input gate decides what information is relevant to add from the current step. The output gate determines what the next hidden state should be. Code Demo: for those of you who understand better through seeing the code, here is an example using Python pseudocode.
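      The pseudocode itself was not captured in this annotation, so here is a minimal NumPy sketch of a single LSTM step implementing the three gates described above. The weight layout (one stacked matrix `W` producing all four gate pre-activations) and the toy sizes are assumptions for illustration, not the original article's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W maps [h_prev; x] to the four stacked gate pre-activations."""
    z = W @ np.concatenate([h_prev, x]) + b
    n = h_prev.size
    f = sigmoid(z[:n])        # forget gate: what to keep from the prior cell state
    i = sigmoid(z[n:2*n])     # input gate: what to add from the current step
    g = np.tanh(z[2*n:3*n])   # candidate values to write into the cell
    o = sigmoid(z[3*n:])      # output gate: what the next hidden state exposes
    c = f * c_prev + i * g    # new cell state
    h = o * np.tanh(c)        # new hidden state
    return h, c

# Toy dimensions and random weights, just to run the step a few times.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.standard_normal((4 * n_hid, n_hid + n_in)) * 0.1
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(n_in), h, c, W, b)
print(h.shape)
```

      Note that the new cell state `c` is the forget-gated old state plus the input-gated candidate, which is exactly the "keep from prior steps" / "add from the current step" split the annotation summarizes.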
  3. Mar 2019
    1. A Gentle Tutorial of Recurrent Neural Network with Error Backpropagation

  4. Jan 2019
    1. MAD-GAN: Multivariate Anomaly Detection for Time Series Data with Generative Adversarial Networks

      This paper is quite impressive: it uses a GAN for anomaly detection in time-series data. The remarkable part is that both G and D are built solely from LSTM-RNNs! It deserves my attention not only for that, but also because the model could inspire my own thinking about "non-template gravitational-wave detection"!

  5. Dec 2018
    1. Sound Event Detection Using Spatial Features and Convolutional Recurrent Neural Network.

      The input data are multichannel audio signals, and the network combines a CNN with an LSTM.

  6. Apr 2017
    1. Almost all exciting results based on recurrent neural networks are achieved with them.

      An LSTM is a kind of RNN, but "RNN" usually refers to the traditional, standard RNN.

  7. Jul 2016
    1. the LSTM are additive with respect to time, alleviating the gradient vanishing problem. Gradient exploding is still an issue, though in practice simple optimization strategies (such as gradient clipping) work well

      How is this problem of vanishing or exploding gradient related to eigenvalues of the W operator? Is there any research on this?
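      There is indeed a standard analysis along these lines: back-propagation through a linear recurrence multiplies the gradient by the transposed recurrent matrix at every step, so the gradient norm grows or shrinks roughly like the spectral radius of W raised to the number of steps. The toy sketch below (my own illustration, not from the annotated paper) makes W a scaled orthogonal matrix so the effect is exact.

```python
import numpy as np

def repeated_jacobian_norm(W, steps):
    """Norm of a unit vector after `steps` multiplications by W.T,
    mimicking backpropagation through time in a linear RNN."""
    v = np.ones(W.shape[0]) / np.sqrt(W.shape[0])
    for _ in range(steps):
        v = W.T @ v
    return np.linalg.norm(v)

rng = np.random.default_rng(1)
Q = np.linalg.qr(rng.standard_normal((4, 4)))[0]  # random orthogonal matrix

# W = rho * Q has all eigenvalue magnitudes equal to rho, so after k steps
# the norm is exactly rho**k: it vanishes for rho < 1 and explodes for rho > 1.
for rho in (0.5, 1.0, 1.5):
    print(rho, repeated_jacobian_norm(rho * Q, 50))
```

      For a general (non-normal) W the singular values, not just the eigenvalues, bound the one-step growth, which is part of why the analysis in the literature is stated in terms of the largest eigenvalue/singular value of the recurrent weight matrix.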