Rules of Play: Game Design Fundamentals

Katie Salen, Eric Zimmerman

Information Theory


The field of information theory arose after WWII in tandem with the emerging telecommunications industry. Originally intended as the study of signal transmission, information theory is an interdisciplinary set of concepts and methodologies that cuts across mathematics, physics, and engineering. Since its formalization in the classic 1949 text by Claude Shannon and Warren Weaver, The Mathematical Theory of Communication, information theory has grown beyond its origins in electronics and cryptography to find applications in behavioral and social sciences, as well as in some strands of contemporary critical theory.[1]

Information theory quantitatively studies information flow: how senders send information and how receivers receive it. It is closely linked with systems theory in that information theory studies the mechanics by which systems function—how, for example, the parts of a system communicate with each other. Game systems almost always involve communication among players and interaction among system elements; information theory can be a valuable way to understand these processes.

Before going any further, it is crucial to understand what information theory means by the word "information." Information theory does not use the word in the same way it is used in casual conversation. When you say, for example, that you need more "information" about a car before you purchase it, you are using the word to mean knowledge. You need more data about the car in order to better understand it. This is, in some ways, the opposite of how the word is used in information theory. Consider Warren Weaver's description of the way in which information theory understands "information":

The word information, in this theory, is used in a special sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning. In fact, two messages, one of which is heavily loaded with meaning and the other of which is pure nonsense, can be exactly equivalent, from the present viewpoint, as regards information.[2]

Unlike a knowledge-based understanding of the word, in information theory, "information" is divorced from meaning. But what does this mean exactly? Let's look at another description of "information" that comes from communications theorist Stephen W. Littlejohn:

Information is the measure of uncertainty in a situation. When a situation is completely predictable, no information is present…. As used by the information theorist, the concept of information does not refer to a message, facts, or meaning. It is a concept bound only to the quantification of stimuli or signals in a situation.[3]

According to Littlejohn, "information" is a way of understanding the mathematical content of a signal: the "quantification of stimuli or signals." With its foundation in formal, quantitative thinking, it should be clear that information theory has a greater affinity to rules than to play or culture. Information theory proceeds from two important premises, both of which are evident in the quotes from Weaver and Littlejohn:

Meaning is irrelevant to information. Information has nothing to do with the content or meaning of a message. As Weaver points out, two strings of letters, one of which makes coherent sense and the other of which is nonsense, could contain the same amount of information.

Information measures uncertainty. Furthermore (and this is a tricky point to grasp), in information theory, information does not measure the amount of "stuff" in a message. It does not measure the knowledge contained in a message—it measures uncertainty instead. As Littlejohn notes, "When a situation is completely predictable, no information is present."

The first point, meaning is irrelevant to information, is the easier of the two to digest. Information theory concerns signal transmission—it is a formal way of looking at the mathematical structure of a signal, rather than a semiotic way of looking at its content. Remember the example of changing the suits on a deck of cards in Defining Rules? As long as the mathematical structure of the cards remained the same, we could change the content of the suits and still play a card game with the deck. Information theory looks at communication signals in a similar way, highlighting the formal structure while ignoring its content. Because information theory looks at mathematical data and not at meaning, it can be applied to any form of communication: "written letters or words, or musical notes, or spoken words, or symphonic music, or pictures."[4] In information theory the format of the data is irrelevant to the analysis.

What about the second point: information measures uncertainty? This concept is subtler to grasp. According to information theory, information is a measure of how certain you can be about the nature of a signal. A completely predictable signal has a very low amount of information, whereas one that could contain anything has high informational content. What does it mean that parts of a signal are predictable or unpredictable? Consider telegraph messages. For years, people sent telegraph messages leaving out non-essential words such as "a" and "the." These words had a high degree of certainty, and therefore conveyed very little information to the receiver. Similarly, predictable letters were sometimes left out as well, such as in the sentence, "Only infrmatn esentil to understandn mst b tranmitd."[5]
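The telegram example can be made concrete with a small calculation. Shannon's entropy formula, which this chapter alludes to but does not write out, measures the average information per symbol and shrinks toward zero as a signal becomes predictable. The probabilities below are illustrative assumptions, not figures from the text; this is a minimal sketch in Python:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: the average information carried per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A nearly certain symbol (like "the" in a telegram) carries almost no information:
print(entropy([0.99, 0.01]))  # ~0.08 bits: highly predictable, little information

# A symbol that could equally be either of two things is maximally uncertain:
print(entropy([0.5, 0.5]))    # 1.0 bit: complete uncertainty between two options
```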

As a thought experiment, imagine a stranger named J.L. If you are trying to guess whether or not J.L. had eaten breakfast today, there are two possible answers to the question: "yes" or "no." You would have a fair degree of certainty as to what J.L. might answer, because there are only two possible units of information to choose from—and that means the amount of information contained in the answer would be low. If you are trying to guess J.L.'s favorite color, you are still just guessing at a single word, but there are more possibilities and therefore there is more uncertainty in your guess. The amount of information in the answer would be higher because you would be selecting J.L.'s favorite color from a larger set of possible answers. If you try and guess J.L.'s favorite word from among all of the words in the English language, the amount of information contained in the answer is even higher still, because there is so much uncertainty about what it might be. As Weaver puts it, information "relates not so much to what you do say, as to what you could say."[6]
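To put rough numbers on the guessing game: assuming every answer in a set is equally likely, a correct guess resolves log2(N) bits of uncertainty, where N is the size of the answer space. The counts below are hypothetical, chosen only to illustrate the breakfast, color, and word comparison:

```python
import math

# Hypothetical answer-space sizes (illustrative assumptions, not from the text)
answer_spaces = {
    "ate breakfast today?": 2,    # "yes" or "no"
    "favorite color": 16,         # a modest palette of named colors
    "favorite word": 170_000,     # rough size of an English dictionary
}

for question, n in answer_spaces.items():
    # With n equally likely answers, a correct guess resolves log2(n) bits
    print(f"{question}: {math.log2(n):.1f} bits of information")
```

The larger the space of possible answers, the more information the answer carries, which is Weaver's point that information relates to what you could say rather than what you do say.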

If you were going to make a correct guess about J.L.'s breakfast, favorite color, or favorite word, think about the amount of uncertainty that is reduced with a correct answer. If you correctly guessed whether or not J.L. had eaten breakfast, you wouldn't be reducing much uncertainty, because there was very little to begin with. But if you correctly guessed J.L.'s favorite word, uncertainty would be greatly reduced. In this example, the signal that correctly guesses J.L.'s favorite word contains the most information, because it also reduces the highest degree of uncertainty.

On the other hand, if you already know everything about J.L., including his favorite word, his favorite color, and whether or not he ate breakfast today, then a response to the question would actually contain no information at all. The answer is already certain before the guess takes place, and therefore the guess is not really a guess. The degree of uncertainty about the truth would not decrease as the result of a right or wrong answer.

This is precisely what Weaver means when he says that "information is a measure of one's freedom of choice when one selects a message."[7] No freedom of guessing (because we already know the answer) means no information: uncertainty has not decreased. With a simple yes or no answer, there is a little more freedom in what can be sent as a message, but not much. In moving toward picking a favorite color or picking a favorite word, the freedom of what might be offered as an answer increases, and the amount of information in the message increases as a result.

How does information theory's concept of "information" connect with games? Think about this: information is a measure of freedom in decision making. Games are contexts which provide players with the ability to make meaningful decisions. In a sense, the information in a communication system is analogous to the space of possibility in a game. The sender of an information-rich message is choosing from many potentially meaningful options. The player in a game with a large space of possibility is selecting an action from among many possible meaningful options as well.

[1] Stephen W. Littlejohn, Theories of Human Communication, 3rd Edition (Belmont, CA: Wadsworth Publishing Company, 1989), p. 45.

[2] Claude E. Shannon and Warren Weaver, The Mathematical Theory of Communication (Champaign: University of Illinois Press, 1963), p. 8–9.

[3] Littlejohn, Theories of Human Communication, p. 46.

[4] <www.lucent.com/minds/infotheory/>.

[5] Shannon and Weaver, The Mathematical Theory of Communication, p. 25.

[6] Ibid., p. 8–9.

[7] Ibid., p. 8–9.


