RNN(Recurrent Neural Networks)

1. Introduction

Today, we are going to talk about another neural network, called the RNN (Recurrent Neural Network). This type of network is significant in many ways: it is very effective for sequential data, and it can mine the temporal information and semantic information in the data.

2. Topology

[Figure: the RNN topology and its structure unfolded in time]

An RNN is mostly similar to a BP network; however, it adds a recurrent connection to its hidden layer. When we unfold it in time, we can see its special structure. This structure is what allows it to memorize earlier information, and thanks to this characteristic we can implement tasks such as sentence recognition.

Mathematical deduction

First, let us introduce a few quantities.

[Figure: a single neuron node of the recurrent layer]
Above is a neuron node, which we call $s_1$. At the bottom of this node there is $h_1$, whose mathematical expression is $h_1 = U x_1 + W s_0$. And at the top of this node we find $f(x)$, which we call the activation function; this function may be sigmoid, ReLU, tanh and so on, so that $s_1 = f(h_1)$.
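The single-step computation above can be sketched in a few lines of NumPy. This is only a minimal illustration, not the author's code: the dimensions (3-dimensional input, 4-dimensional state) and the choice of tanh as the activation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes for illustration: 3-dim input, 4-dim hidden state.
U = rng.normal(size=(4, 3))   # input-to-hidden weights
W = rng.normal(size=(4, 4))   # hidden-to-hidden (recurrent) weights

x1 = rng.normal(size=3)       # input at the first time step
s0 = np.zeros(4)              # initial hidden state

h1 = U @ x1 + W @ s0          # h1 = U x1 + W s0
s1 = np.tanh(h1)              # s1 = f(h1), here f = tanh

print(s1.shape)               # the new hidden state, shape (4,)
```

The same two lines, applied repeatedly with the state fed back in, give the unfolded network used in the derivation below.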

Then we will carry out the derivation (this derivation is known as the BPTT algorithm, back-propagation through time):

  1. Find the gradient of each parameter. For the loss $L_3$ at step 3 of the unfolded network (with $s_t = f(U x_t + W s_{t-1})$ and output $o_3 = g(V s_3)$), the chain rule gives:
     $\frac{\partial L_3}{\partial U} = \frac{\partial L_3}{\partial o_3} \frac{\partial o_3}{\partial s_3} \frac{\partial s_3}{\partial U}$
     $\frac{\partial L_3}{\partial V} = \frac{\partial L_3}{\partial o_3} \frac{\partial o_3}{\partial V}$
     $\frac{\partial L_3}{\partial W} = \frac{\partial L_3}{\partial o_3} \frac{\partial o_3}{\partial s_3} \frac{\partial s_3}{\partial W}$

  2. Find the update method of $W$ (gradient descent with learning rate $\eta$):
     $W \leftarrow W - \eta \frac{\partial L_3}{\partial W}$

And the third equation, the gradient with respect to $W$, can be expanded, because $s_3$ depends on $W$ both directly and through $s_2$:
$\frac{\partial L_3}{\partial W} = \frac{\partial L_3}{\partial o_3} \frac{\partial o_3}{\partial s_3} \left( \frac{\partial^{+} s_3}{\partial W} + \frac{\partial s_3}{\partial s_2} \frac{\partial s_2}{\partial W} \right)$

(Hint: the plus sign on the exponent, as in $\partial^{+} s_3 / \partial W$, means that the derivative is taken only with respect to the direct appearance of $W$ in $s_3$, treating $s_2$ as a constant.)

In the same way, the last factor $\partial s_2 / \partial W$ of the above expression can be expressed as:
$\frac{\partial s_2}{\partial W} = \frac{\partial^{+} s_2}{\partial W} + \frac{\partial s_2}{\partial s_1} \frac{\partial s_1}{\partial W}$

In the same way: $\frac{\partial s_1}{\partial W} = \frac{\partial^{+} s_1}{\partial W}$ (the recursion stops here, since the initial state $s_0$ does not depend on $W$).

Then let us integrate these three equations:

$\frac{\partial L_3}{\partial W} = \frac{\partial L_3}{\partial o_3} \frac{\partial o_3}{\partial s_3} \left( \frac{\partial^{+} s_3}{\partial W} + \frac{\partial s_3}{\partial s_2} \frac{\partial^{+} s_2}{\partial W} + \frac{\partial s_3}{\partial s_2} \frac{\partial s_2}{\partial s_1} \frac{\partial^{+} s_1}{\partial W} \right)$

In fact, we can introduce a simpler expression to restate this equation:
$\frac{\partial L_3}{\partial W} = \sum_{k=1}^{3} \frac{\partial L_3}{\partial o_3} \frac{\partial o_3}{\partial s_3} \left( \prod_{j=k+1}^{3} \frac{\partial s_j}{\partial s_{j-1}} \right) \frac{\partial^{+} s_k}{\partial W}$

In the same way, the update methods of the other two parameters can be concluded as:
$\frac{\partial L_3}{\partial U} = \sum_{k=1}^{3} \frac{\partial L_3}{\partial o_3} \frac{\partial o_3}{\partial s_3} \left( \prod_{j=k+1}^{3} \frac{\partial s_j}{\partial s_{j-1}} \right) \frac{\partial^{+} s_k}{\partial U}$
$\frac{\partial L_3}{\partial V} = \frac{\partial L_3}{\partial o_3} \frac{\partial o_3}{\partial V}$ ($V$ only acts at the output layer, so no recursion through time is needed)
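The summation over $k$ unrolls into the familiar backward loop of BPTT. The NumPy sketch below is an illustration under stated assumptions, not the author's code: the sizes, the tanh activation, the linear output $o_3 = V s_3$, and the squared-error loss are all choices made here for concreteness. It computes $\partial L_3 / \partial W$ with the backward recursion and checks one entry against a numerical derivative.

```python
import numpy as np

def loss_and_grad_W(U, W, V, xs, s0, y):
    """Forward: s_t = tanh(U x_t + W s_{t-1}), loss L3 = 0.5 ||V s_3 - y||^2.
    Backward: the BPTT loop, i.e. the sum over k in the derivation above."""
    s = [s0]
    for x in xs:                          # unfold the recurrent layer in time
        s.append(np.tanh(U @ x + W @ s[-1]))
    o = V @ s[-1]
    loss = 0.5 * np.sum((o - y) ** 2)

    a = V.T @ (o - y)                     # dL3/ds_3 = (do_3/ds_3)^T dL3/do_3
    dW = np.zeros_like(W)
    for k in range(len(xs), 0, -1):
        dh = a * (1.0 - s[k] ** 2)        # back through tanh at step k
        dW += np.outer(dh, s[k - 1])      # the direct "plus-sign" term  d+ s_k / dW
        a = W.T @ dh                      # one more factor  ds_k/ds_{k-1}
    return loss, dW

rng = np.random.default_rng(0)
U = rng.normal(size=(4, 3))
W = 0.5 * rng.normal(size=(4, 4))
V = rng.normal(size=(2, 4))
xs = [rng.normal(size=3) for _ in range(3)]   # three time steps, as in the text
s0 = np.zeros(4)
y = rng.normal(size=2)

loss, dW = loss_and_grad_W(U, W, V, xs, s0, y)

# Check one entry of dW against a central finite difference.
i, j, eps = 1, 2, 1e-6
Wp, Wm = W.copy(), W.copy()
Wp[i, j] += eps
Wm[i, j] -= eps
num = (loss_and_grad_W(U, Wp, V, xs, s0, y)[0]
       - loss_and_grad_W(U, Wm, V, xs, s0, y)[0]) / (2 * eps)
print(abs(num - dW[i, j]))                # analytic and numeric gradients agree
```

The accumulation `dW += np.outer(dh, s[k - 1])` is exactly the $\partial^{+} s_k / \partial W$ term, and the line `a = W.T @ dh` supplies the product of Jacobians $\partial s_j / \partial s_{j-1}$ for the next iteration.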

Above is the complete derivation of BPTT for the RNN.

Conclusion:

RNN is a useful algorithm for problems involving sequential and semantic information, but it is not enough. As we can see in its topology, the recurrent layer above is unfolded over only three steps. When we need to unfold it over many more steps, training becomes much less effective: the repeated product of the factors $\partial s_j / \partial s_{j-1}$ can shrink toward zero (the vanishing-gradient problem), so the network loses information from early steps. Hence, in order to handle this problem, we introduce another algorithm called LSTM, which I will talk about in my next blog.

At last, thanks to all my classmates, teachers and friends for helping me solve these tough problems! Meanwhile, thanks to all the readers for reading this blog to the end! There may still be mistakes in this blog, so you are welcome to point them out in the comments!
