Unit 2 Notes: Digital Information

Binary Representation of Data

  • Binary System: Base-2 numeral system used by computers to represent data.
  • Bits and Bytes: Basic units of data representation; 1 byte consists of 8 bits.
  • Binary Numbers: Representation of numbers using only two digits, 0 and 1.
  • ASCII (American Standard Code for Information Interchange): Standard encoding system for representing text in computers, where each character is represented by a unique 7-bit binary number.
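The binary and ASCII ideas above can be sketched in a few lines of Python (a minimal illustration; the helper names `char_to_binary` and `binary_to_char` are made up for this example):

```python
# Encode a character as its 7-bit ASCII binary string, and decode it back.
def char_to_binary(ch):
    """Return the 7-bit binary representation of an ASCII character."""
    return format(ord(ch), "07b")

def binary_to_char(bits):
    """Convert a string of 0s and 1s back to its ASCII character."""
    return chr(int(bits, 2))

word = "Hi"
encoded = [char_to_binary(c) for c in word]
print(encoded)   # ['1001000', '1101001'] -- 'H' is 72, 'i' is 105 in ASCII
decoded = "".join(binary_to_char(b) for b in encoded)
print(decoded)   # Hi
```

Since each ASCII code fits in 7 bits, a standard 8-bit byte stores any ASCII character with one bit to spare.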

Compression Algorithms and Techniques

  • Compression: Process of reducing the size of data to save storage space or decrease transmission time.
  • Lossless Compression: Compression technique that preserves all data, allowing the original data to be perfectly reconstructed from the compressed data.
  • Lossy Compression: Compression technique that sacrifices some data to achieve higher compression ratios, commonly used for multimedia files like images, audio, and video.
  • Common Compression Algorithms: Examples include ZIP (lossless), JPEG (lossy for images), and MP3 (lossy for audio).
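Lossless compression is easy to demonstrate with Python's standard `zlib` module, which implements DEFLATE, the same algorithm family used by ZIP (a minimal sketch; the sample data is made up):

```python
import zlib

# Repetitive data compresses well under lossless schemes.
text = b"AAAABBBCCDAA" * 50
compressed = zlib.compress(text)
restored = zlib.decompress(compressed)

assert restored == text             # lossless: perfectly reconstructed
print(len(text), len(compressed))   # compressed is far smaller
```

Lossy formats like JPEG and MP3 offer no such round-trip guarantee: decompressing yields an approximation of the original, in exchange for much higher compression ratios.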

Encryption and Cryptography

  • Encryption: Process of converting plaintext (unencrypted data) into ciphertext (encrypted data) using an algorithm and a key.
  • Decryption: Process of converting ciphertext back into plaintext using the appropriate decryption key.
  • Symmetric Encryption: Encryption method that uses a single key for both encryption and decryption (e.g., AES).
  • Asymmetric Encryption: Encryption method that uses a pair of keys, a public key for encryption and a private key for decryption (e.g., RSA).
  • Cryptographic Hash Functions: Algorithms that map input data of any size to a fixed-size hash value (digest). Because it is computationally infeasible to find two inputs that produce the same digest, they are commonly used to verify data integrity and to store passwords securely.
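The fixed-size, integrity-checking behavior of a cryptographic hash can be seen with Python's standard `hashlib` module (a minimal sketch; the input strings are arbitrary):

```python
import hashlib

digest = hashlib.sha256(b"hello world").hexdigest()
print(len(digest))  # 64 hex characters = 256 bits, regardless of input size

# Changing even one character produces a completely different digest.
digest2 = hashlib.sha256(b"hello worlD").hexdigest()
print(digest == digest2)  # False
```

Unlike encryption, hashing is one-way: no key recovers the input from the digest, which is why password systems store hashes rather than the passwords themselves.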

Data Integrity and Error Detection

  • Data Integrity: Assurance that data has not been altered or corrupted during storage, transmission, or processing.
  • Error Detection: Techniques used to detect errors in data transmission or storage.
  • Parity Check: Simple error detection technique that appends a parity bit to data so that the total number of 1 bits is even (even parity) or odd (odd parity).
  • Checksum: Error detection method that sums the data values and appends the result, so a receiver can recompute the sum and compare it to verify the data.
  • Cyclic Redundancy Check (CRC): More robust error detection technique that computes a check value from polynomial division of the data content, widely used in network communications and storage devices.
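The three error-detection techniques above can be sketched in Python; `binascii.crc32` is a standard-library CRC, while the parity and checksum helpers (`even_parity_bit`, `simple_checksum`) are toy names made up for illustration:

```python
import binascii

def even_parity_bit(bits):
    """Parity bit that makes the total count of 1s even."""
    return bits.count("1") % 2

def simple_checksum(data):
    """Sum of byte values modulo 256, appended to data for verification."""
    return sum(data) % 256

payload = "1011001"                    # four 1 bits
print(even_parity_bit(payload))        # 0: count of 1s is already even

print(simple_checksum(b"Hi"))          # (72 + 105) % 256 = 177

crc = binascii.crc32(b"network packet payload")
print(hex(crc))                        # 32-bit check value
```

In each case the receiver recomputes the parity bit, checksum, or CRC over the received data and compares it with the transmitted value; a mismatch signals corruption.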