ISCC - Chapter 1 to 5

Across
  2. – Graphical schema representing entities, attributes, and cardinalities within a database.
  3. – A decentralized form of money verified by network consensus rather than a central authority.
  5. – A coordinated combination of people, processes, hardware, software, and data that supports decision-making in organizations.
  7. – AI structure modeled after the human brain that processes data through interconnected nodes.
  11. – Field focused on designing systems that perceive, reason, learn, and act like humans.
  15. – The assurance that data remains accurate, consistent, and reliable throughout its lifecycle.
  16. – Database model organizing data into interrelated tables defined by primary and foreign keys.
  18. – Software that integrates media elements to develop interactive multimedia content.
  19. – Delivery model providing shared computing resources and storage through Internet-based infrastructure.
  20. – Environment merging physical and virtual elements that interact in real time.
  22. – The problem of storing duplicate fields across files, leading to inconsistency in traditional file systems.
  25. – Verification stage ensuring that all integrated modules operate correctly within an IS environment.
  26. – Evaluation method ensuring multimedia or IS applications are efficient, learnable, and satisfying to users.
  27. – Demographic group characterized by constant connectivity and multitasking through digital media.
  28. – Declarative language used to create, query, and manipulate relational database structures.
  29. – Iterative agile framework where work is divided into time-boxed sprints with defined deliverables.
  30. – The societal condition where technology shapes values, communication norms, and behaviors.
Down
  1. – Architectural approach emphasizing software reusability through independent, loosely coupled services.
  4. – The transformation of input data into meaningful output using defined procedures and logic.
  6. – The model illustrating how data evolves into information, knowledge, and wisdom.
  8. – A system that analyzes large data sets to help managers make semi-structured decisions.
  9. – Distributed ledger technology that prevents double-spending by linking immutable blocks of data cryptographically.
  10. – Prototyping methodology combining design and analysis phases for accelerated system delivery.
  12. – Agile methodology that emphasizes pair programming and continuous integration for quality assurance.
  13. – Specialist responsible for maintaining data security, integrity, backup, and recovery procedures.
  14. – The process of examining large datasets to extract patterns and support evidence-based decisions.
  17. – Technology overlaying virtual information on real-world scenes for enhanced visualization.
  21. – The degree to which information is accurate, timely, relevant, and cost-effective for its intended use.
  23. – Fully immersive digital simulation where users interact with 3D computer-generated environments.
  24. – Planned approach for switching from an old system to a new one, minimizing operational risk.