NNDL Assignment

Across
  1. When a model performs well on training data but poorly on new data.
  3. A technique to normalize the inputs of a layer.
  8. A type of neural network architecture with nested layers.
  11. A basic building block of a neural network.
  12. A technique to ignore certain timesteps in a sequence.
  17. One complete pass through the entire training dataset.
  19. A type of neural network architecture with residual connections.
  22. A key operation in Convolutional Neural Networks (CNNs).
  24. A pooling operation that selects the maximum value in each region.
  25. A technique to stop training when the validation loss stops improving.
  26. A type of learning where the model learns from unlabeled data.
  27. A type of neural network architecture with parallel branches.
  29. A layer where all nodes are connected to all nodes in the previous layer.
  31. A CNN layer that reduces the dimensions of the input.
  35. A type of neural network architecture with self-attention.
  37. The simplest form of a neural network: a single-layer binary classifier.
  39. A technique to normalize the inputs of a layer.
  41. Adapts a pre-trained model to a new task.
  44. Determines the output of a node in a neural network.
  45. A popular activation function that introduces non-linearity.
  46. A layer that learns a low-dimensional representation of categorical data.
  47. A type of operation that preserves the spatial dimensions of the input.
  49. A type of neural network architecture with small convolutional filters.
  50. A mechanism to attend to different parts of the same input.
  51. Adding extra pixels to the input to preserve its dimensions.
  52. A technique to prevent overfitting in neural networks.
  56. A configuration setting, external to the model, chosen before training.
  58. A mechanism to focus on important parts of the input.
  59. A technique to fine-tune a pre-trained model for a new task.
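Two of the Across answers (the max-pooling clue at 24 and the non-linear activation clue at 45) can be sketched in a few lines of NumPy; the function names here are mine, not answers from the puzzle:

```python
import numpy as np

def relu(x):
    # Popular non-linear activation: passes positives, zeroes out negatives.
    return np.maximum(0, x)

def max_pool_2x2(x):
    # Max pooling: keeps the maximum value in each 2x2 region.
    h, w = x.shape
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

x = np.array([[1., -2., 3., 0.],
              [4., 5., -6., 7.],
              [-1., 2., 0., 3.],
              [8., -9., 1., 2.]])
print(relu(np.array([-1., 2.])))   # [0. 2.]
print(max_pool_2x2(x))             # [[5. 7.]
                                   #  [8. 3.]]
```

Note that pooling halves each spatial dimension, which is exactly what the "dimensions of the input are reduced" clue (31) describes.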
Down
  2. A neural network designed for sequential data.
  4. A training algorithm for updating the weights in a neural network.
  5. A strategy for setting the initial values of a neural network's weights.
  6. An activation function similar to sigmoid, ranging from -1 to 1.
  7. A technique to prevent overfitting by dropping nodes in neural networks.
  9. A type of learning where the model learns from labeled data.
  10. Controls the size of the steps in gradient descent.
  13. A layer that concatenates multiple tensors along a specific axis.
  14. A type of recurrent neural network with a memory cell.
  15. A pooling operation that computes the average value in each region.
  16. A type of neural network architecture for unsupervised learning.
  18. A technique to prevent overfitting in neural networks.
  20. A type of connection that bypasses one or more layers.
  21. Number of samples processed before the model's weights are updated.
  23. An additional parameter representing an offset in neural networks.
  28. A type of operation that preserves the temporal dimensions of the input.
  30. A mechanism to attend to multiple parts of the input.
  32. A layer that converts a multi-dimensional tensor into a vector.
  33. An architecture where information flows in one direction.
  34. A type of neural network architecture with local response normalization.
  36. An activation function used in the output layer for classification.
  38. An issue where gradients become very small during training.
  40. A technique to artificially increase the size of the training dataset.
  42. An optimization algorithm for finding the minimum of a function.
  43. The number of pixels the kernel slides across the input.
  48. A type of neural network architecture for generative modeling.
  53. A matrix used in the convolution operation.
  54. A type of neural network architecture with gates.
  55. A type of encoding for sequential data.
  57. A type of neural network that handles sequential data.
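Several of the Down clues fit together in one training loop: the optimizer clue (42), the step-size clue (10), the samples-per-update clue (21), and the full-pass clue (17 Across). A minimal sketch on a 1-D least-squares problem, with variable names of my own choosing:

```python
import numpy as np

# Mini-batch gradient descent: minimize the mean of (w*x - y)^2.
rng = np.random.default_rng(0)
x = rng.normal(size=64)
y = 3.0 * x                     # the true weight is 3.0
w = 0.0                         # initial value of the weight
learning_rate = 0.1             # step size for each gradient descent update
batch_size = 16                 # samples processed before each weight update

for epoch in range(50):         # one epoch = one full pass over the dataset
    for start in range(0, len(x), batch_size):
        xb = x[start:start + batch_size]
        yb = y[start:start + batch_size]
        grad = np.mean(2 * (w * xb - yb) * xb)   # d/dw of the batch loss
        w -= learning_rate * grad                # gradient descent step

print(round(w, 2))              # 3.0
```

Shrinking `learning_rate` makes each step smaller (slower but steadier convergence), which is what the "controls the size of steps" clue is getting at.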