Generative AI

[Crossword grid with squares numbered 1–19]
Across
  3. The moderation of AI-generated content through outsourced human classification.
  4. Carbon-based energy sources sustaining most contemporary computing infrastructures.
  8. Specialised computational capacity, chiefly GPUs, required for machine learning.
  13. Extracted minerals and metals essential to digital hardware production.
  14. A vital cooling resource increasingly strained by data-centre expansion.
  15. The shift of AI from analysing communication to generating cultural expression.
  16. Media and policy narratives that shape how AI is feared, trusted, or regulated.
  18. Excess thermal output from servers requiring intensive cooling systems.
  19. Human micro-labour used to refine, score, and personalise AI model outputs.
Down
  1. Discarded digital hardware accumulating as toxic and poorly regulated e-waste.
  2. Global extractive networks supplying resources for chips, batteries, and servers.
  4. The long-term geological trace left by digital infrastructure and electronic waste.
  5. Large-scale collections of digital material extracted to train machine-learning models.
  6. Planetary-scale data-centre infrastructures enabling large AI models to function.
  7. Statistical systems that synthesise new content from learned patterns in data.
  9. Critical cartography that exposes hidden power relations behind AI technologies.
  10. The exponential growth of data, computation, and capital in AI development.
  11. AI systems that reproduce cultural forms through probabilistic mimicry.
  12. Electricity consumption required to power and cool AI data centres.
  15. Venture-backed companies commercialising generative models and coordinating global infrastructures.
  17. Market mechanisms that compensate for emissions without reducing industrial growth.