Across
- 1. Limitation or restriction on the solution space in an optimization problem.
- 3. Group of data samples processed together in training a neural network.
- 5. Perform calculations using data in a systematic manner.
- 6. Sorting data points into predefined categories or labels.
- 9. A fundamental neural network algorithm for adjusting weights by minimizing error.
- 12. Labeling process crucial for training supervised learning models.
- 14. Starting point for model performance before any optimization.
- 16. Deep learning architecture commonly used for image recognition.
- 17. Feature in PyTorch enabling automatic differentiation.
- 21. Type of data that represents distinct groups or categories without any intrinsic ordering.
- 24. Optimizer named after the first man in biblical history.
- 25. Reference point for measuring model performance.
- 28. Adaptive reinforcement learning strategy that optimizes action selection while still exploring other options.
- 30. Estimation technique used in machine learning to generalize models for prediction.
- 31. Parallel computing platform and programming model by NVIDIA.
- 32. Common operation to compare elements in two sets.
- 33. A multiplier that quantifies the relationship between variables in linear regression.
- 34. Related to a probabilistic approach that updates beliefs in light of new evidence.
- 37. Type of attack where input data is subtly altered to deceive a model.
- 39. Link between nodes in a neural network.
- 43. Autonomous entity in a reinforcement learning environment.
- 44. Optimization goal during training, reached as the loss function decreases.
- 45. Sequential model structure linking multiple layers.
- 46. Ensemble method involving random sampling with replacement.
- 47. Algorithm known for combining weak classifiers to form a strong one.
- 48. Neural network component responsible for generating output sequences in machine translation.
Down
- 1. The metric optimized in training to minimize error.
- 2. Technique of expanding training data by applying transformations.
- 4. Standard used for evaluating machine learning models.
- 7. Group similar data points together without supervision.
- 8. Core component handling processing tasks in most computers.
- 10. Unexpected data point that deviates from the norm.
- 11. The central part of a neural network where the main computations take place.
- 13. Boost in training speed achieved by hardware like GPUs or TPUs.
- 15. A key operation in neural networks, especially effective for image processing.
- 18. A technique used in deep learning to stabilize and accelerate neural network training by normalizing layer inputs.
- 19. Statistical distribution in ML algorithms inspired by a physicist's thermodynamics work.
- 20. Influential variable impacting predictions in a model.
- 22. A collection of text or data used for training language models.
- 23. A foundational concept in probabilistic models and statistics, often representing the likelihood of an event occurring.
- 25. Tendency of a model to make consistent errors on certain data.
- 26. Obtain annotations or data by gathering contributions from a large group of people.
- 27. Father of a theorem crucial to probabilistic modeling and inference.
- 29. Function applied to neurons in a neural network to introduce non-linearity.
- 35. Bring into proper position for model optimization.
- 36. Ensemble method that sequentially adjusts weak models to improve performance.
- 38. A neural mechanism that helps models focus on specific parts of input sequences.
- 40. AI model that assigns input data to specific categories or labels.
- 41. Direction of error propagation in neural network training.
- 42. Canada's neural network pioneer and deep learning stalwart.
