Crossword
Across
- 4. Regularization technique, randomly omitting units.
- 5. LSTM gate storing important info.
- 9. Learning based on connectedness.
- 11. Activation for multi-class classification.
- 13. Layers between the input and output.
- 14. Point at which a neuron activates.
- 15. Learning with labeled training data.
- 16. Model's sensitivity to training data.
- 20. Loss function for classification.
- 22. Function defined by different equations over different intervals.
- 23. Distribution characterized by a mean and variance.
- 24. Derivative indicating slope direction.
- 27. Accuracy of positive predictions.
- 28. Convolutional kernel in CNNs.
- 31. Basic unit of a neural network.
- 35. Small matrix for convolutional operations.
- 36. Training algorithm adjusting weights.
- 39. Reshaped into a one-dimensional array.
- 40. Optimization inspired by natural selection.
- 42. Large dataset for image classification.
- 43. Adaptive Linear Neuron.
- 44. Activation function with outputs between 0 and 1.
Down
- 1. Model configuration setting.
- 2. Technique preventing overfitting.
- 3. Connection point between neurons.
- 6. Simplest form of a neural network.
- 7. Subset of the training data.
- 8. Predicting numerical values.
- 10. Optimization algorithm enhancement.
- 11. Random or probabilistic.
- 12. Step size in convolutional operations.
- 17. Multi-branch convolutional architecture.
- 18. LSTM gate discarding unnecessary info.
- 19. Additional parameter that helps fit the data.
- 21. Basic or standard version.
- 25. Rectified Linear Unit activation.
- 26. Adjusting a pre-trained model for a new task.
- 29. Dataset used to tune hyperparameters.
- 30. Activation that outputs -1 or 1.
- 32. One pass through the entire training dataset.
- 33. Model fits training data too closely.
- 34. Information flows in one direction.
- 37. Adding extra pixels to input data.
- 38. Pioneering deep CNN architecture.
- 41. Down-sampling technique in CNNs.