Methods in CL -- Text Processing

Across
  2. What we do when we want to split the personality of a word
  4. *-kappa -- a coefficient that helps us estimate how well a set of annotators agree
  5. Words seen through n-dimensional glasses
  8. More than a point in n-dimensional space
  9. What distance is, without the math
  10. For us it often has more than 3 dimensions
  11. * agreement -- how much annotators agree by accident
  12. One or more units
  15. A model that is based on connections
  16. Something we count in corpora, usually after some processing
  17. The dimensionality we want
  18. One way of formalizing linguistic phenomena
  20. Semantic * : the attempted periodic table for language
  23. An additional layer of information
  24. * models : can be used to learn a composition function
  25. How two representations should be if the words they correspond to are similar
  26. The number of dimensions in n-dimensional space
  29. An understandable type of meaning representation
  30. Something we count in corpora, after minimal processing
  31. A basic metric, and one of the reasons to formalize word representations
  33. The most flexible and rich representation formalism (according to Vivi)
  34. The smallest unit that has meaning
  35. The most frequent English word in most corpora
  36. The process of obtaining a word of a different class by adding grammatical morphemes to a word stem
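Several Across clues turn on distance and similarity between word vectors (9, 25). As a refresher, here is a minimal cosine-similarity sketch; the toy 3-dimensional "embeddings" and their values are made up for illustration:

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(u, v))
    norm_u = math.sqrt(sum(x * x for x in u))
    norm_v = math.sqrt(sum(y * y for y in v))
    return dot / (norm_u * norm_v)

# Toy vectors (hypothetical values, not from any real embedding model).
cat = [0.9, 0.1, 0.3]
dog = [0.8, 0.2, 0.35]
car = [0.1, 0.9, 0.05]

# Vectors for similar words should score higher than for dissimilar ones.
print(cosine_similarity(cat, dog) > cosine_similarity(cat, car))  # True
```

Cosine similarity ignores vector length and compares direction only, which is why two representations of similar words "should be close" regardless of how often each word occurs.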
Down
  1. * matrix : often used to represent a graph
  2. A model that comes from many directions
  3. *-kappa -- a coefficient that helps us estimate how well two annotators agree
  6. When we tell the learning algorithm exactly what we want
  7. Something we want that approximates something we observe
  9. One word can have more of these
  11. What we rely on to build the meaning of a larger phrase
  13. The process of obtaining a word of the same class by adding (grammatical) morphemes to a word stem
  14. Our most usual unit
  19. Collections of texts, on which everything we do is based
  21. What we try to capture
  22. When a word is more than what it seems
  26. The class of a word, which tells us how the word behaves (4,2,6)
  27. The process of creating a word by combining words
  28. The simplest method to model compositionality (6,7)
  32. A non-interpretable type of meaning representation
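The kappa clues (4 Across for a set of annotators, 3 Down for two annotators) refer to chance-corrected agreement coefficients. A minimal sketch of the two-annotator case, with toy labels invented for illustration:

```python
from collections import Counter

def cohens_kappa(ann1, ann2):
    """Chance-corrected agreement between two annotators over the same items."""
    assert len(ann1) == len(ann2) and len(ann1) > 0
    n = len(ann1)
    # Observed agreement: fraction of items labelled identically.
    p_o = sum(a == b for a, b in zip(ann1, ann2)) / n
    # Expected chance agreement, from each annotator's marginal label distribution.
    c1, c2 = Counter(ann1), Counter(ann2)
    p_e = sum(c1[label] * c2[label] for label in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Toy example: two annotators tagging five tokens as noun (N) or verb (V).
a = ["N", "N", "V", "N", "V"]
b = ["N", "V", "V", "N", "V"]
print(round(cohens_kappa(a, b), 2))  # 0.62
```

The correction matters because raw ("observed") agreement overstates reliability: two annotators who both label almost everything with the majority class agree often purely by accident, which is exactly what clue 11 Across is getting at.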