Natural Language Processing
Recurrent Neural Networks — Part 1
Neural network architectures such as the multi-layer perceptron (MLP) are trained using only the current input; previous inputs play no role in generating the current output. In other words, these systems have no memory elements. RNNs address this basic but important limitation by using memory, i.e. past inputs, when generating the current output.
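The idea above can be sketched with a minimal recurrent step in NumPy. This is an illustrative sketch, not a particular library's API: the weight names (`W_xh`, `W_hh`) and sizes are assumptions. The point is that the hidden state `h` is fed back into each step, so the current output depends on past inputs, which is exactly the memory an MLP lacks.

```python
import numpy as np

rng = np.random.default_rng(0)

input_size, hidden_size = 3, 4
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input-to-hidden weights (illustrative)
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden-to-hidden (recurrent) weights
b_h = np.zeros(hidden_size)

def rnn_step(x, h_prev):
    """One recurrent step: the new state depends on the current input AND the previous state."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b_h)

# Process a short sequence: after the loop, h carries information from every past input.
h = np.zeros(hidden_size)
sequence = rng.standard_normal((5, input_size))
for x in sequence:
    h = rnn_step(x, h)

print(h.shape)
```

An MLP, by contrast, would map each `x` to an output independently, with no `h_prev` term carrying state between time steps.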