Deep learning
Across
- 3. A statistical quantity that captures how widely data points scatter around their average.
- 5. A shrinkage technique that tends to produce sparse models by driving some coefficients to zero.
- 6. The model's talent for performing well on unfamiliar data, beyond what it has memorized.
- 8. The interval at which a convolution filter advances across the input.
- 9. The small matrix that slides over data in convolutional computations.
- 10. An open-source framework from Google, central to deep learning practice.
- 11. The term for the number of layers in a model.
- 13. A technique where neurons are "silenced" at random to prevent overfitting.
- 14. Artificial boundaries added to inputs, often with zeros, to preserve dimensions during convolution.
- 15. A popular non-linear activation function that outputs zero for negatives and identity for positives.
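Several of the Across clues (stride, kernel, padding, and the non-linear activation) describe the mechanics of a single convolution step. A minimal plain-Python sketch, with function names and shapes chosen purely for illustration:

```python
def relu(x):
    # The activation from clue 15: zero for negatives, identity for positives.
    return max(0.0, x)

def conv1d(signal, kernel, stride=1, padding=0):
    # Clue 14: pad the input with zeros to help preserve output dimensions.
    padded = [0.0] * padding + list(signal) + [0.0] * padding
    k = len(kernel)
    out = []
    # Clue 8: the stride is the interval at which the kernel advances.
    for start in range(0, len(padded) - k + 1, stride):
        window = padded[start:start + k]
        # Clue 9: the kernel slides over the data, multiplying and summing.
        out.append(sum(w * x for w, x in zip(kernel, window)))
    return [relu(v) for v in out]

print(conv1d([1, -2, 3, -4], [1, 1], stride=1, padding=1))
# → [1.0, 0.0, 1.0, 0.0, 0.0]
```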
Down
- 1. Algorithm used to adjust weights by calculating gradients.
- 2. A state where the model memorizes the training set so rigidly that it performs poorly on test data.
- 4. The evaluation of a trained model using unseen data samples.
- 7. An activation function resembling a stretched letter "S".
- 10. A convolution type often called "deconvolution", used to upsample feature maps.
- 12. A layer that shrinks input dimensions, usually by taking the maximum or average.
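A few of the remaining clues (the S-shaped activation, the pooling layer, and the random silencing of neurons) can also be sketched in a few lines of plain Python; these helper names are my own, not a standard API:

```python
import math
import random

def sigmoid(x):
    # Down clue 7: a stretched "S" squashing any real into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def max_pool1d(values, size=2):
    # Down clue 12: shrink the input by taking the maximum of each window.
    return [max(values[i:i + size]) for i in range(0, len(values), size)]

def dropout(values, p=0.5, rng=None):
    # Across clue 13: silence each value with probability p (training only);
    # survivors are scaled by 1/(1-p), the common "inverted dropout" choice.
    rng = rng or random.Random(0)
    return [0.0 if rng.random() < p else v / (1.0 - p) for v in values]

print(max_pool1d([1, 3, 2, 5], size=2))  # → [3, 5]
print(sigmoid(0.0))                      # → 0.5
```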