CNN

[Crossword grid omitted; clues are numbered 1–45.]
Across
  4. Model with high bias and low variance
  7. Activation function used in the output layer for classification
  8. Categorical loss function used in CNNs
  11. Training algorithm for updating weights in a neural network
  12. Google's pretrained ML model
  15. Popular activation function introducing non-linearity
  16. Technique of creating new training examples by applying transformations
  19. Another name for a kernel that reduces dimensions in CNNs
  22. A type of recurrent neural network with a memory cell
  23. A type of neural network for handling sequential data
  27. Optimization algorithm using one example at a time
  28. Issue where gradients become very small during training
  29. Adapts a pre-trained model to a new task
  32. Optimization algorithm that accumulates past gradients to accelerate learning
  34. A set of data samples used in one iteration of training
  35. Process of selecting and representing relevant information from input data
  39. Controls the size of steps in gradient descent
  40. Strategy for setting initial values in a neural network
  42. Optimization algorithm for finding the minimum of a function
  43. Technique to prevent overfitting by deactivating nodes in neural networks
  44. Adaptive Linear Neuron, a single-layer neural network
  45. Determines the output of a node in a neural network
Down
  1. Configuration setting external to the model's learned parameters
  2. Non-linear transformation applied to a neuron's output
  3. Key operation in Convolutional Neural Networks (CNNs)
  5. Activation function similar to sigmoid, ranging from -1 to 1
  6. A type of recurrent neural network cell with a memory-like operation
  7. Number of trainable parameters in a GRU
  9. Simplest form of a neural network; a single-layer binary classifier
  10. Also known as recall or TPR
  13. Table used to assess the performance of a classification model
  14. A pretrained model with 13 CNN layers and 3 dense layers
  17. Model performs well on training data but poorly on new data
  18. Number of trainable parameters in an LSTM
  20. A CNN layer where the dimensions of the input are reduced
  21. Neural network designed for sequential data
  22. Squared-error rule for updating weights in neural networks
  24. Layer functioning as an FCNN in VGG16
  25. Basic building block of a neural network
  26. Additional parameter representing an offset in neural networks
  30. One complete pass through the entire training dataset
  31. Technique to prevent exploding gradients in RNNs
  33. Architecture where information flows in one direction
  36. Adding extra pixels or values around the edges of an image in a CNN
  37. Evaluation of a model on a separate dataset to tune hyperparameters
  38. An unsupervised learning rule
  41. Standard version of a neural network or algorithm