Deep learning

Across
  3. A statistical quantity that captures how widely data points scatter around their average.
  5. A shrinkage technique that tends to produce sparse models by driving some coefficients to zero.
  6. A model's ability to perform well on unfamiliar data, beyond what it has memorized.
  8. The interval by which a convolution filter advances across the input.
  9. The small matrix that slides over data in convolutional computations.
  10. An open-source framework from Google, central to deep learning practice.
  11. The term for the number of layers in a model.
  13. A technique where neurons are "silenced" at random to prevent overfitting.
  14. Artificial boundaries added to inputs, often with zeros, to preserve dimensions during convolution.
  15. A popular non-linear activation function that outputs zero for negatives and the identity for positives.
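Several of the Across clues describe concrete operations, and a short sketch may help anyone checking their answers. This is a minimal NumPy illustration (the function names and shapes are my own, not part of the puzzle): an activation that zeroes negatives, and a convolution whose filter advances by a fixed interval over a zero-padded input.

```python
import numpy as np

def relu(x):
    # Outputs zero for negatives and the identity for positives.
    return np.maximum(0, x)

def conv2d(image, kernel, stride=1, pad=0):
    # Add artificial zero boundaries around the input, then slide the
    # small kernel matrix across it, advancing by `stride` each step.
    image = np.pad(image, pad)
    kh, kw = kernel.shape
    out_h = (image.shape[0] - kh) // stride + 1
    out_w = (image.shape[1] - kw) // stride + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i * stride:i * stride + kh,
                          j * stride:j * stride + kw]
            out[i, j] = np.sum(patch * kernel)
    return out
```

With `pad` chosen as half the kernel size (for odd kernels) and `stride=1`, the output keeps the input's dimensions, which is the point of the padding clue.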
Down
  1. The algorithm used to adjust weights by calculating gradients.
  2. A state where the model memorizes the training set so rigidly that it performs poorly on test data.
  4. The evaluation of a trained model using unseen data samples.
  7. An activation function resembling a stretched letter "S".
  10. A convolution type often called "deconvolution", used to upsample feature maps.
  12. A layer that shrinks input dimensions, usually by taking the maximum or average.
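A few of the Down clues can likewise be sketched in a handful of lines. As an illustrative example (names are my own): the S-shaped activation, the gradient of it that weight-adjusting algorithms rely on, and a layer that shrinks its input by taking window maxima.

```python
import numpy as np

def sigmoid(x):
    # S-shaped activation squashing inputs into the interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid, used when gradients are computed
    # layer by layer to adjust the weights.
    s = sigmoid(x)
    return s * (1.0 - s)

def max_pool(x, size=2):
    # Shrinks the input by taking the maximum over non-overlapping
    # size x size windows (any ragged edge is dropped).
    h, w = x.shape
    x = x[:h - h % size, :w - w % size]
    return x.reshape(h // size, size, w // size, size).max(axis=(1, 3))
```

A 4x4 input pooled with `size=2` comes out 2x2, each entry the maximum of one quadrant-window.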