Information theory is a mathematical, structural way of looking at signals that disregards the knowledge-content or meaning of a message.
In information theory, information measures uncertainty: it is a measure of how many possible messages an act of communication might contain. The answer to a yes/no question carries less information (and less uncertainty) than the answer to a question about a favorite color, because the yes/no question has fewer possible answers.
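To make the idea concrete: if a question has N equally likely answers, its information can be measured as log2(N) bits. The sketch below assumes a yes/no question (two answers) and, purely as an illustration, a favorite-color question with eight equally likely colors; the eight-color count is an assumption for the example, not a figure from the text.

```python
import math

def bits(num_outcomes: int) -> float:
    # Information, in bits, of a choice among equally likely outcomes.
    return math.log2(num_outcomes)

print(bits(2))  # yes/no question: 1.0 bit of information
print(bits(8))  # favorite color among eight equally likely options: 3.0 bits
```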
Information theory studies the transmission of a signal from a source to a target. Part of this process is noise: distortion that enters the signal from an outside source. Noise increases the uncertainty, and therefore the amount of information, in a message.
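As a rough illustration of noise, the sketch below passes a binary message through a toy channel that flips each bit with some probability. The flip probability and the message are arbitrary choices for the example, not values from the text.

```python
import random

def transmit(message: str, flip_prob: float = 0.1) -> str:
    # Toy noisy channel: each bit is flipped with probability flip_prob.
    return "".join(
        bit if random.random() > flip_prob else ("1" if bit == "0" else "0")
        for bit in message
    )

random.seed(1)
print(transmit("1011001110"))  # some bits arrive distorted by noise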
Although noise is usually undesirable in the engineering of signal transmission, in games it can be a productive design element. The lusory attitude means that players willingly take on inefficient activities in a game. Thus noise, which makes communication more difficult and uncertain, makes possible games such as Charades, in which difficulty of communication is the very premise of play.
Redundancy in a system counterbalances noise by ensuring that not every component of a message is necessary for it to be understood. As a form of communication, the English language contains about 50 percent redundancy.
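One rough way to see redundancy is to compare a message's observed per-symbol entropy with the maximum entropy it would have if every symbol were equally likely; redundancy is the gap between the two. The sketch below uses single-character frequencies only, so it understates English's true redundancy (much of which comes from longer-range patterns between letters and words), and the sample sentence is an arbitrary illustration.

```python
import math
from collections import Counter

def redundancy(text: str) -> float:
    # Redundancy = 1 - H / H_max, where H is the observed per-symbol
    # entropy and H_max assumes every distinct symbol is equally likely.
    counts = Counter(text)
    total = len(text)
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    h_max = math.log2(len(counts))
    return 1 - h / h_max

sample = "redundancy ensures that not every component of a message is necessary"
print(f"{redundancy(sample):.2f}")  # fraction of the message that is redundant
```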
In an information theory model of communication, freedom of choice, uncertainty, and information all increase together. The concept of choice in this sense relates directly to the space of possibility and meaningful play. As a complex system, a game design must strike a balance between too little and too much uncertainty, flexibility, and information.