Across
- 1. How far the filter moves at each step along one direction
- 4. An unsupervised learning rule that attempts to connect the psychological and neurological underpinnings of learning
- 8. Function used to calculate the total net input
- 9. Most popularly used activation function in the areas of convolutional neural networks and deep learning: f(x) is zero when x is less than zero and f(x) equals x when x is greater than or equal to zero
- 11. Applied after the convolutional layer, used to reduce the dimensions of the feature maps
- 13. A gradient descent learning rule for updating the weights of the inputs to artificial neurons in a single-layer neural network
- 15. Addition of extra pixels around the borders of the input image or feature map
- 18. The output remains the same as the input
- 20. A machine learning training method based on rewarding desired behaviors and punishing undesired ones
- 21. Simplest type of RNN
- 23. An RNN designed to deal with the vanishing gradient problem present in traditional RNNs
- 24. A process by which an initial set of data is reduced by identifying key features of the data for machine learning
- 26. A technique to prevent exploding gradients in very deep networks, usually in RNNs
- 27. A mathematical function that converts a vector of real numbers into a probability distribution
- 28. A type of RNN that, in certain cases, has advantages over LSTM
- 29. A widely adopted activation function for a special type of neural network known as the Backpropagation Network
- 31. A supervised machine learning method where the model tries to predict the correct label for a given input
- 32. A systematic error that occurs in the machine learning model itself due to incorrect assumptions in the ML process
- 33. A type of artificial neural network which uses sequential data or time series data
- 34. A matrix of weights that is multiplied with the input to extract relevant features
Down
- 2. A commonly used activation function: it gives 1 as output if the input is either 0 or positive
- 3. A type of artificial neural network, which is widely used for image/object recognition and classification
- 5. A branch of artificial intelligence and computer science which focuses on the use of data and algorithms to imitate the way that humans learn, gradually improving in accuracy
- 6. An algorithm designed to test for errors by working back from output nodes to input nodes
- 7. One complete cycle in which all of the training data is passed through the machine learning model once
- 10. Uses machine learning algorithms to analyze and cluster unlabeled datasets
- 12. A single-layer neural network with multiple nodes, where each node accepts multiple inputs and generates one output
- 14. Function that decides whether a neuron should be activated or not by calculating the weighted sum of its inputs and adding a bias to it
- 16. A hyper-parameter used to govern the pace at which an algorithm updates or learns the values of a parameter estimate
- 17. A type of machine learning that learns the relationship between input and output
- 19. An algorithm for supervised learning of binary classifiers
- 22. A method in artificial intelligence that teaches computers to process data in a way that is inspired by the human brain
- 25. Kernel
- 30. The need for this activation function stems from the fact that many learning algorithms require the activation function to be differentiable and hence continuous
