NNDL assignment

[Crossword grid image: squares numbered 1-62; clues below]
Across
  2. A type of learning where the model learns from labeled data
  4. Layer type that acts as the fully connected network (FCNN) in VGG16
  5. Technique of creating new training examples by applying transformations
  7. Strategy for setting initial values in a neural network
  9. Table used to assess the performance of a classification model
  10. A hyperparameter tuning process
  12. Number of trainable parameters in a GRU (see the sketch after this list)
  13. Neural network designed for sequential data
  15. Google's pretrained ML model
  16. Backpropagation algorithm for updating weights in an RNN
  19. Another name for the True Negative Rate in a confusion matrix
  20. One complete pass through the entire training dataset
  24. Activation function used in the output layer for classification
  29. A pooling operation that computes the average value in each region
  30. Evaluation of a model on a separate dataset to tune hyperparameters
  31. Activation function similar to sigmoid, ranging from -1 to 1
  33. An algorithm used to update the weights in a neural network
  34. Model performs well on training data but poorly on new data
  36. Controls the size of the steps in gradient descent
  37. Training algorithm for updating weights in a neural network
  38. Squared-error learning rule for updating weights in neural networks
  40. A pooling operation that selects the maximum value in each region
  42. A pretrained model with 13 convolutional and 3 dense layers
  43. Issue where gradients become very small during training
  44. A layer that converts a multi-dimensional tensor into a 1D vector
  47. A classification problem where each sample can belong to one of two classes
  48. Technique that deactivates random nodes to prevent overfitting in neural networks
  50. Architecture where information flows in one direction
  51. A technique to stop training when the validation loss stops improving
  52. Optimization algorithm that accumulates past gradients to accelerate learning
  54. Basic building block of a neural network
  56. Adaptive Linear Neuron, a single-layer neural network
  58. Determines the output of a node in a neural network
  60. Key operation in Convolutional Neural Networks (CNNs)
  61. A technique to fine-tune a pre-trained model for a new task
  62. Technique to prevent exploding gradients in RNNs
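
Clue 12 above asks for the number of trainable parameters in a GRU. As a quick check of the arithmetic, here is a minimal Python sketch, assuming the standard GRU formulation with three gates (update, reset, candidate) and one bias vector per gate; some implementations, such as Keras with reset_after=True, add a second bias vector per gate. The sizes input_dim=10 and hidden_units=32 are hypothetical.

    def gru_param_count(input_dim, hidden_units):
        # Per gate: input weights W (input_dim x hidden_units),
        # recurrent weights U (hidden_units x hidden_units),
        # and one bias vector of length hidden_units.
        per_gate = (input_dim * hidden_units
                    + hidden_units * hidden_units
                    + hidden_units)
        return 3 * per_gate  # a GRU has 3 gates

    print(gru_param_count(10, 32))  # 4128
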
Down
  1. Optimization algorithm for finding the minimum of a function
  3. A type of neural network for handling sequential data
  6. Optimization algorithm that uses one training example at a time
  8. Process of selecting and representing relevant information from input data
  11. Standard version of a neural network or algorithm
  14. Categorical loss function used in CNNs
  17. Adapts a pre-trained model for a new task
  18. Also known as recall or the True Positive Rate (TPR)
  21. A CNN layer where the dimensions of the input are reduced
  22. Configuration setting supplied to the model rather than learned from data
  23. Number of trainable parameters in an LSTM (see the sketch after this list)
  25. A set of data samples used in one iteration of training
  26. Popular activation function introducing non-linearity
  27. A type of learning where the model learns from unlabeled data
  28. Simplest form of a neural network, a single-layer binary classifier
  32. Additional parameter representing an offset in neural networks
  35. An unsupervised learning rule
  39. A loss function used in regression problems in CNNs
  41. Non-linear transformation applied to a neuron's output
  45. Linearly inseparable logic gate
  46. Model with high bias and low variance
  49. A type of recurrent neural network cell with a memory-like operation
  53. Adding extra pixels or values around the edges of an image in a CNN
  55. Another name for a kernel used to reduce dimensions in CNNs
  57. The number of pixels by which the kernel slides across the input
  59. A type of recurrent neural network with a memory cell
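
Similarly, clue 23 asks for the number of trainable parameters in an LSTM. A minimal sketch under the same assumptions as the GRU example (four gates: input, forget, candidate, output, with one bias vector each; the sizes are hypothetical):

    def lstm_param_count(input_dim, hidden_units):
        # Per gate: input weights W (input_dim x hidden_units),
        # recurrent weights U (hidden_units x hidden_units),
        # and one bias vector of length hidden_units.
        per_gate = (input_dim * hidden_units
                    + hidden_units * hidden_units
                    + hidden_units)
        return 4 * per_gate  # an LSTM has 4 gates

    print(lstm_param_count(10, 32))  # 5504

With identical input and hidden sizes, the LSTM has 4/3 as many parameters as the GRU, reflecting its one extra gate.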