Sequence Models - Week 1 Programming Assignment 1 (RNN Step by Step)

Building a Recurrent Neural Network Step by Step

We will implement a recurrent neural network in numpy.

Recurrent Neural Networks (RNNs) are very effective for Natural Language Processing and other sequence tasks because they have "memory". They can read inputs \(x^{\langle t \rangle}\) (such as words) one at a time, and remember some information/context by passing hidden-layer activations from one time-step to the next. This allows a uni-directional RNN to take information from the past to process later inputs, while a bidirectional RNN can draw on context from both the past and the future.
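As a preview of what this assignment builds, here is a minimal numpy sketch of one forward step of a basic RNN cell: the new hidden state mixes the previous hidden state with the current input, and a softmax over the hidden state gives the prediction. The function and parameter names (`rnn_cell_forward`, `Wax`, `Waa`, `Wya`, `ba`, `by`) follow the Coursera notebook's usual convention, but treat them as assumptions here rather than code from this post.

```python
import numpy as np

def rnn_cell_forward(xt, a_prev, parameters):
    """One forward step of a basic RNN cell.

    xt:     input at time-step t, shape (n_x, m)
    a_prev: hidden state from the previous time-step, shape (n_a, m)
    parameters: dict with Wax (n_a, n_x), Waa (n_a, n_a),
                Wya (n_y, n_a), ba (n_a, 1), by (n_y, 1)
    """
    Wax, Waa, Wya = parameters["Wax"], parameters["Waa"], parameters["Wya"]
    ba, by = parameters["ba"], parameters["by"]

    # New hidden state: combine previous memory with the current input
    a_next = np.tanh(Waa @ a_prev + Wax @ xt + ba)

    # Prediction at this time-step: softmax over the n_y output classes
    z = Wya @ a_next + by
    yt_pred = np.exp(z) / np.sum(np.exp(z), axis=0, keepdims=True)

    return a_next, yt_pred
```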

Notation:

  • Superscript \([l]\) denotes an object associated with the \(l^{th}\) layer.

    • Example: \(a^{[4]}\) is the \(4^{th}\) layer activation. \(W^{[5]}\) and \(b^{[5]}\) are the \(5^{th}\) layer parameters.
  • Superscript \((i)\) denotes an object associated with the \(i^{th}\) example.

    • Example: \(x^{(i)}\) is the \(i^{th}\) training example input.
  • Superscript \(\langle t \rangle\) denotes an object at the \(t^{th}\) time-step.

    • Example: \(x^{\langle t \rangle}\) is the input x at the \(t^{th}\) time-step. \(x^{(i)\langle t \rangle}\) is the input at the \(t^{th}\) time-step of example \(i\).
  • Subscript \(i\) denotes the \(i^{th}\) entry of a vector.

    • Example: \(a^{[l]}_i\) denotes the \(i^{th}\) entry of the activations in layer \(l\) (the numpy sketch after this list makes these conventions concrete).
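To make the indexing conventions above concrete, here is a tiny numpy sketch. The 3-D input layout \((n_x, m, T_x)\) is the one the assignment uses for a batch of sequences; the specific sizes below are illustrative assumptions.

```python
import numpy as np

n_x, m, T_x = 3, 10, 7            # input size, number of examples, time-steps
x = np.random.randn(n_x, m, T_x)  # a whole training batch of sequences

i, t = 4, 2
x_it = x[:, i, t]   # x^{(i)<t>}: the input at time-step t of example i
entry = x_it[1]     # subscript: the 2nd entry (index 1) of that vector
```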

Reposted from www.cnblogs.com/douzujun/p/13179646.html