CS486 - Introduction to Neural Networks and Deep Learning - Minor Test
Across
- 2. Number of class labels in ImageNet
- 4. Determines the direction of the next step in a CNN
- 6. Activation function used for binary classification
- 8. Activation function that squashes the input to the range -1 to +1
- 10. Unsupervised learning rule in ANNs
- 14. Number of convolution layers in VGG16
- 15. Converts a stimulus from the human body
- 16. Loss function used for classification
- 18. Algorithm used for hyperparameter optimization
Down
- 1. Squashes the input data to the range -1/2 to +1/2
- 3. Neural network model that uses a linear activation function
- 5. Gate used to keep information from the previous step in an LSTM
- 6. Can be done with a many-to-one RNN model
- 7. Trainable parameters in an LSTM
- 9. Fundamental unit of a neural network
- 11. Determines the depth of the convolved image in a CNN
- 12. Increases/decreases the net input to the activation function
- 13. Gate whose data is linearly inseparable
- 14. Trainable parameters in a vanilla RNN
- 17. Number of trainable parameters in an ANN