Machine Learning Crossword

[Crossword grid: numbered squares 1–25]
Across
  3. Algorithm used to update weights during training
  7. The term for adjusting weight values using gradients
  10. When market or consumer behaviour changes, causing model accuracy to drop
  13. The final dataset used to evaluate generalization performance
  14. The subset of data used for hyperparameter tuning and early stopping
  16. One complete pass through the training dataset
  18. Neural network element that receives weighted inputs
  19. A problem where a model fits the training data too well but performs poorly on new data
  21. The process of dividing data into training, validation, and testing subsets
  23. A computational model inspired by the human brain, used for prediction and learning
  24. When a model is too simple and fails to capture the underlying business patterns
  25. Number of samples processed before the weights are updated
Down
  1. When a trained model memorizes the training data too closely and performs poorly on the testing dataset
  2. Error resulting from overly simplistic models
  4. Function measuring how far predictions deviate from actual values
  5. The phenomenon where model performance worsens due to changes in market or business conditions
  6. Evaluation method that mitigates dependence on one specific train–test split
  8. Provides nonlinearity inside a neural network
  9. A matrix of pixel values fed into a neural model
  11. An input–output example pair used during training
  12. A plot comparing training and validation error across epochs
  15. Opposite of overfitting
  17. Error caused by overly complex models
  20. Popular activation function used in modern neural networks
  22. Vector of partial derivatives pointing in the direction of steepest ascent; optimizers step against it