Across
- 3. Method for training neural networks
- 5. Refers to the problem of learning from information that lies many time steps in the past
- 7. Part of LSTM that decides what information to output from the cell state
- 8. Refers to the slope of the neural network's loss function with respect to its weights
- 11. Information transferred from one step to the next in RNNs and LSTMs
Down
- 1. Part of LSTM that decides what information to discard from the cell state
- 2. Component of LSTM that stores the state information
- 4. Basic type of recurrent neural network
- 6. Part of LSTM that decides what new information to store in the cell state
- 9. Neural network architecture designed to mitigate the vanishing gradient problem
- 10. Mechanism used by LSTM to control the flow of information
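After solving, the sketch below can serve as a reference for how the pieces named in the clues fit together: one LSTM time step in plain NumPy, showing the forget, input, and output gates, the cell state, and the hidden state carried to the next step. The function name, parameter layout, and shapes here are illustrative assumptions, not the API of any particular library.

```python
# A minimal sketch of a single LSTM step (NumPy). Names and the stacked
# parameter layout are illustrative, not taken from any specific framework.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    x      : input at this step, shape (n_in,)
    h_prev : hidden state from the previous step, shape (n_hid,)
    c_prev : cell state from the previous step, shape (n_hid,)
    W, U, b: stacked forget/input/candidate/output parameters,
             shapes (4*n_hid, n_in), (4*n_hid, n_hid), (4*n_hid,)
    """
    n_hid = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    f = sigmoid(z[0*n_hid:1*n_hid])   # forget gate: what to discard from the cell state
    i = sigmoid(z[1*n_hid:2*n_hid])   # input gate: what new information to store
    g = np.tanh(z[2*n_hid:3*n_hid])   # candidate values to write into the cell state
    o = sigmoid(z[3*n_hid:4*n_hid])   # output gate: what to expose from the cell state
    c = f * c_prev + i * g            # updated cell state
    h = o * np.tanh(c)                # hidden state passed to the next time step
    return h, c

# Tiny usage example with random parameters
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
x  = rng.normal(size=n_in)
h0 = np.zeros(n_hid)
c0 = np.zeros(n_hid)
W  = rng.normal(size=(4 * n_hid, n_in))
U  = rng.normal(size=(4 * n_hid, n_hid))
b  = np.zeros(4 * n_hid)
h1, c1 = lstm_step(x, h0, c0, W, U, b)
print(h1.shape, c1.shape)  # (4,) (4,)
```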
