Sequential Data

Many learning problems involve data whose meaning depends on order.
Recurrent Computation

A feedforward neural network processes inputs through a fixed sequence of layers. Once the output is produced, the computation ends. There is no memory of previous inputs.
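A recurrent network differs from this feedforward picture by threading a hidden state through the sequence: the same cell is applied at every step, and its output depends on everything seen so far. A minimal sketch of that loop, using a vanilla (Elman-style) tanh cell; all weight names and dimensions here are illustrative, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4

# Illustrative shared parameters, applied at EVERY time step.
W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

def rnn_forward(xs):
    """Run the same cell over a sequence, threading the hidden state."""
    h = np.zeros(hidden_dim)                     # no memory before step 1
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)   # h now depends on all inputs so far
    return h

seq = rng.normal(size=(5, input_dim))            # a sequence of 5 input vectors
h_final = rnn_forward(seq)
print(h_final.shape)  # (4,)
```

Unlike a feedforward pass, running this cell on a reordered sequence generally yields a different final state, which is exactly the memory of previous inputs that the text says feedforward networks lack.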
Backpropagation Through Time

Recurrent networks reuse the same parameters at every time step.
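Because one parameter appears at every step of the unrolled computation, its gradient is the sum of the per-step contributions, collected by walking the unrolled graph in reverse. A toy sketch with a scalar linear recurrence $h_t = w\,h_{t-1} + x_t$ and an illustrative loss on the final state (all names and the loss are assumptions for the demo, not the text's notation):

```python
def bptt_grad(w, xs):
    # Forward pass: store every hidden state for the backward pass.
    hs = [0.0]
    for x in xs:
        hs.append(w * hs[-1] + x)
    loss = 0.5 * hs[-1] ** 2          # toy loss on the final state

    # Backward pass: traverse the unrolled steps in reverse order.
    dh = hs[-1]                       # dL/dh_T
    dw = 0.0
    for t in range(len(xs), 0, -1):
        dw += dh * hs[t - 1]          # each step adds its share to the SHARED w
        dh = dh * w                   # propagate through h_t = w*h_{t-1} + x_t
    return loss, dw

xs = [1.0, 2.0, -1.0]
w = 0.5
loss, dw = bptt_grad(w, xs)

# Sanity check against a central-difference numerical gradient.
eps = 1e-6
num = (bptt_grad(w + eps, xs)[0] - bptt_grad(w - eps, xs)[0]) / (2 * eps)
print(abs(dw - num) < 1e-6)  # True
```

The accumulation `dw += ...` inside the reverse loop is the point: parameter sharing across time is what turns ordinary backpropagation into backpropagation through time.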
Vanishing Gradients in RNNs

Recurrent neural networks were designed to process sequential data by maintaining a hidden state over time.
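The difficulty is that the sensitivity of a late hidden state to an early one is a product of per-step Jacobians, and when those factors are below 1 in magnitude the product shrinks geometrically with the gap. A scalar illustration with $h_t = \tanh(w\,h_{t-1})$, where each factor of the chain rule is $w\,(1 - h_t^2)$; the parameter values are illustrative:

```python
import numpy as np

def sensitivity(w, h0, steps):
    """Track |dh_t/dh_0| for a scalar tanh cell h_t = tanh(w * h_{t-1})."""
    h, grad = h0, 1.0
    grads = []
    for _ in range(steps):
        h = np.tanh(w * h)
        grad *= w * (1.0 - h ** 2)    # chain rule: one Jacobian factor per step
        grads.append(abs(grad))
    return grads

grads = sensitivity(w=0.9, h0=1.0, steps=30)
print(grads[0], grads[-1])            # the long-range sensitivity is tiny
```

After 30 steps the gradient signal reaching step 0 has decayed by orders of magnitude, which is why plain RNNs struggle to learn long-range dependencies.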
Bidirectional Networks

Standard recurrent neural networks process sequences in one direction, usually from left to right. At time step $t$, the hidden state summarizes only the past:

$$h_t = f(h_{t-1}, x_t),$$

so $h_t$ depends only on the inputs $x_1, \dots, x_t$.
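A bidirectional network removes this limitation by running a second, independent cell over the reversed sequence and pairing the two states at each step, so every position sees both its past and its future. A minimal sketch, with all weights and dimensions illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
input_dim, hidden_dim = 3, 4

def make_cell():
    """Build an independent tanh RNN cell with its own weights."""
    W_x = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
    W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
    def step(x, h):
        return np.tanh(W_x @ x + W_h @ h)
    return step

def run(cell, xs):
    """Apply one cell over the sequence, collecting the state at each step."""
    h = np.zeros(hidden_dim)
    out = []
    for x in xs:
        h = cell(x, h)
        out.append(h)
    return out

fwd_cell, bwd_cell = make_cell(), make_cell()
xs = list(rng.normal(size=(5, input_dim)))

fwd = run(fwd_cell, xs)                 # state at t summarizes x_1..x_t
bwd = run(bwd_cell, xs[::-1])[::-1]     # state at t summarizes x_t..x_T
states = [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]
print(len(states), states[0].shape)  # 5 (8,)
```

Concatenation is the common way to combine the two directions, though summing or averaging the paired states also works.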
Sequence Modeling Applications

Recurrent neural networks were among the first deep learning architectures capable of handling variable-length sequential data.