Gen AI and LLM - Set 1

[Crossword grid with cells numbered 1–15]
Across
  1. Information used to train AI models, including text and images.
  5. A numerical representation of words or phrases in vector space, which models can process more easily.
  6. An input given to a model to generate a response or output.
  8. A conversational interaction, often facilitated by AI.
  9. Adjusting a pre-trained model on a smaller dataset for specific tasks.
  11. The process of teaching a model using large datasets to improve its performance.
  12. Describing models that can create new content, such as text or images.
  13. A mechanism in neural networks that lets models focus on the relevant parts of the input when generating output.
  15. A learning approach in which a model performs tasks it was given no specific training examples for.
Down
  2. A type of neural network architecture that underpins many modern LLMs.
  3. A modeling error where a model learns noise in the training data instead of the underlying pattern.
  4. The primary output of language models, consisting of written words.
  7. The surrounding information that helps models understand the meaning of text.
  10. A unit of text processed by language models, typically a word or subword.
  14. The act of generating predictions or outputs from a trained model.
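
For readers who want to see a few of the clue concepts side by side, the minimal Python sketch below names tokens, embeddings, a prompt, and inference in one toy pipeline. The tiny hash-seeded "embedding" and the nearest-vector "inference" rule are illustrative assumptions for this puzzle, not how any real LLM is implemented.

```python
# Toy illustration of several crossword answers: token, embedding,
# prompt, and inference. All numbers here are made-up assumptions;
# real LLMs use learned weights, not random vectors.
import math
import random


def tokenize(text):
    # "Token": a unit of text, here simply a lower-cased word.
    return text.lower().split()


def embed(token, dim=8, seed=42):
    # "Embedding": a vector standing in for a token. Here it is just a
    # seeded pseudo-random unit vector per token (an assumption).
    rng = random.Random(hash((token, seed)))
    vec = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec]


def cosine(a, b):
    # Vectors are unit-length, so the dot product is cosine similarity.
    return sum(x * y for x, y in zip(a, b))


def infer(prompt, candidates):
    # "Inference": producing an output from a trained model; here we just
    # pick the candidate whose mean token embedding is closest to the
    # prompt's mean embedding.
    def mean_embedding(text):
        vecs = [embed(t) for t in tokenize(text)]
        return [sum(col) / len(vecs) for col in zip(*vecs)]

    p = mean_embedding(prompt)
    return max(candidates, key=lambda c: cosine(p, mean_embedding(c)))


if __name__ == "__main__":
    # "Prompt": the input handed to the model.
    prompt = "attention lets a transformer focus on relevant tokens"
    candidates = ["attention", "overfitting", "fine-tuning"]
    print("Closest candidate:", infer(prompt, candidates))
```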