Tutorial: Communication


The Communication Tutorial

Note: to run the dynamic simulations referred to in this tutorial, use the Java Applet version of this page. You will be directed to download the latest version of the Java plug-in.

This tutorial explores a formalization of information and communication introduced by Claude Shannon in his landmark 1948 paper, “A Mathematical Theory of Communication” (The Bell System Technical Journal, Vol. 27, pp. 379-423, 623-656, July and October 1948). The paper introduces entropy as a measure of information and proves Shannon's famous channel capacity theorem, a remarkable result that initiated the modern field of information theory. In this tutorial, we provide some visual demonstrations of Shannon's ideas and summarize some of his results. Some contemporary perspective is taken from Elements of Information Theory by Thomas Cover and Joy Thomas (New York: John Wiley & Sons, Inc., 1991).
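As a quick illustration of entropy as a measure of information, the sketch below computes Shannon's H = -Σ p·log₂(p) for a few simple sources. The function name is ours, not from the tutorial; this is a minimal sketch, assuming probabilities are given as a list summing to 1.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: each toss carries 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))   # about 0.469
# A certain outcome carries no information at all.
print(entropy([1.0]))        # 0.0
```

Lower entropy means a more predictable source, which is exactly what makes compression possible, a theme developed in the "Redundancy and Compression" and "Entropy" sections below.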


  1. Introduction
    1. Describing Information
    2. Describing Communication
    3. The Information Channel
  2. Characterizing Messages
    1. Channel Capacity
    2. Information Sources
    3. Redundancy and Compression
    4. Entropy
    5. Compression Example
  3. Designing Encodings
    1. Transmission over a Noiseless Channel
    2. A Noisy Channel
    3. Noisy Channel Example
  4. Closing Remarks
  5. Technical Appendix
    1. Agent Representation of Communication System
    2. Hamming Code Details
