We want to train a neural network to generate new funny words for a children's book. Which architecture can we use?
Word-level LSTM
Character-level LSTM
Word-level RNN
Character-level perceptron
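A character-level model is the right fit here because a word-level model can only emit tokens it saw during training, while a character-level model builds strings one character at a time and can therefore invent words. A minimal sketch of that vocabulary difference (the tiny corpus and the made-up words are illustrative assumptions, not from the module):

```python
# Hypothetical toy corpus to contrast word-level and character-level vocabularies.
corpus = "the wobbly snorfle giggled"

word_vocab = sorted(set(corpus.split()))  # tokens a word-level model can emit
char_vocab = sorted(set(corpus))          # symbols a character-level model can emit

# A word-level generator is limited to word_vocab, so an unseen word like
# "floofernoodle" is unreachable. A character-level generator samples one
# character at a time, so any string over char_vocab is reachable.
print(word_vocab)
print(char_vocab)
```

This is why a character-level LSTM is the expected answer: novel "funny words" are, by definition, outside any fixed word vocabulary.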
A recurrent neural network is called recurrent because:
The same network is applied to each input element, and the output from the previous application is passed to the next one
It's trained by a recurrent process
It consists of layers that include other subnetworks
What is the main idea behind LSTM network architecture?
Fixed number of LSTM blocks for the whole dataset
It contains many layers of recurrent neural networks
Explicit state management, with gates for forgetting and updating the state
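The third option is the key idea: an LSTM keeps an explicit cell state and uses gates to decide how much of it to forget, how much new information to write, and how much to expose as output. A scalar sketch of one LSTM cell follows; all weights are arbitrary assumed values for illustration, not part of any real trained network:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_cell(x, h_prev, c_prev):
    # Gates computed from the input and previous hidden state
    # (scalar weights are illustrative assumptions):
    f = sigmoid(0.9 * x + 0.1 * h_prev)    # forget gate: keep how much old state?
    i = sigmoid(0.5 * x - 0.2 * h_prev)    # input gate: write how much new state?
    g = math.tanh(0.7 * x + 0.3 * h_prev)  # candidate state
    o = sigmoid(0.4 * x + 0.6 * h_prev)    # output gate: expose how much state?
    c = f * c_prev + i * g                 # explicit cell state update
    h = o * math.tanh(c)                   # hidden state / output
    return h, c

h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:  # toy input sequence
    h, c = lstm_cell(x, h, c)
print(h, c)
```

The plain RNN above has only `h`; the LSTM adds the separate cell state `c` and the `f`/`i`/`o` gates, which is what lets it preserve information over long sequences.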