Big Idea 2

Across
  3. Tests conducted using models.
  6. Hardware used in digital circuits to represent Boolean functions (true/false).
  10. Can occur because of how decimal numbers are stored in computers, causing inaccuracies.
  12. Number system used in computer science, with “0” representing no electric charge and “1” representing a charge.
Down
  1. The table used to convert text to binary codes (limited to 128 characters, so it isn’t as widely used as Unicode).
  2. Coded instructions for running programs, also called functions.
  4. Simplified versions of objects or environments.
  5. The simplification of a command that makes it easier to use and more general-purpose (like the display or print commands).
  7. Base-8 number system, used in older computers.
  8. Binary digits (a 1 or 0 in binary).
  9. Number system that uses 16 digits, often used as a more readable representation of binary.
  11. Unit of computer storage consisting of 8 bits.
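
Several of the answers above (binary, octal, hexadecimal, the 128-character text table, bytes, and decimal round-off) can be checked with a few lines of Python. This is an illustrative sketch for studying the terms, not part of the puzzle:

```python
# Number systems from the clues: binary (base 2), octal (base 8),
# and hexadecimal (base 16) are just different ways to write one value.
n = 42
print(bin(n))   # 0b101010  (binary)
print(oct(n))   # 0o52      (octal)
print(hex(n))   # 0x2a      (hexadecimal)

# The 128-character table maps text to numeric codes.
print(ord("A"))  # 65
print(chr(65))   # A

# A byte is 8 bits, so it can represent 2**8 = 256 distinct values.
print(2 ** 8)    # 256

# Round-off: decimal fractions are stored in binary, so some values
# cannot be represented exactly, causing small inaccuracies.
print(0.1 + 0.2)  # 0.30000000000000004
```

Running the last line shows why equality checks on stored decimal values can fail even when the arithmetic looks exact on paper.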