Arthur B. Markman (1998) Mediating States as Consistent Probabilistic States in a Robot. Psycoloquy: 9(65) Representation Mediation (9)

PSYCOLOQUY (ISSN 1055-0143) is sponsored by the American Psychological Association (APA).
Psycoloquy 9(65): Mediating States as Consistent Probabilistic States in a Robot

Reply to Andreae on Representation-Mediation

Arthur B. Markman
Department of Psychology
University of Texas
Austin, TX 78712

Eric Dietrich
PACCS Program
Binghamton University
Binghamton, NY


Andreae (1998) describes a robot that appears to implement one type of mediating state suggested by Markman & Dietrich (1998) as the central notion of mental representation. Andreae's robot looks quite interesting, and it fits the theoretical constraints on mediating states nicely. Reflection on Andreae's robot yields a couple of interesting points about mediating states. We close by defending our use of Dretske's notion of information, rather than Shannon's.


compositionality, computation, connectionism, discrete states, dynamic systems, explanation, information, meaning, mediating states, representation, rules, semantic content, symbols
1. Andreae (1998) describes his work with his robot PURR-PUSS (PP, for short). According to his description, PP has mediating states. PP learns, via association, that a certain action it takes will consistently (up to some probability) result in a certain stimulus being received by it. Andreae calls internal states that represent or denote such consistently associated actions and stimuli "consistent contexts" (CCs). Andreae claims that PP can use its CCs to achieve goals, obtain rewards, and avoid pain.
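The associative learning just described can be illustrated with a toy sketch. This is not Andreae's actual PURR-PUSS implementation; the class name, threshold, and action/stimulus labels are all hypothetical, chosen only to show how an internal state could come to track a reliable action-stimulus pairing:

```python
from collections import defaultdict

class ConsistentContextLearner:
    """Toy illustration (not Andreae's PURR-PUSS code): tracks how
    reliably each action is followed by each stimulus, and flags an
    action-stimulus pair as a 'consistent context' once its observed
    conditional probability crosses a threshold."""

    def __init__(self, threshold=0.9, min_trials=5):
        self.threshold = threshold          # required reliability
        self.min_trials = min_trials        # evidence needed per action
        self.action_counts = defaultdict(int)
        self.pair_counts = defaultdict(int)

    def observe(self, action, stimulus):
        """Record one action and the stimulus that followed it."""
        self.action_counts[action] += 1
        self.pair_counts[(action, stimulus)] += 1

    def consistent_contexts(self):
        """Return pairs whose estimated P(stimulus | action) is high."""
        ccs = []
        for (action, stimulus), n in self.pair_counts.items():
            total = self.action_counts[action]
            if total >= self.min_trials and n / total >= self.threshold:
                ccs.append((action, stimulus))
        return ccs

learner = ConsistentContextLearner()
for _ in range(10):
    learner.observe("press_lever", "light_on")   # reliably paired
learner.observe("wander", "light_on")            # one-off association
print(learner.consistent_contexts())             # [('press_lever', 'light_on')]
```

The one-off "wander" pairing never becomes a consistent context, which mirrors the point made below: internal states inconsistently related to received stimuli are unlikely to figure in cognitive processing.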

2. CCs do indeed seem to be mediating states. We defined mediating states as states within a system, embedded in an environment, that are in informational, interactive contact (which we interpret as a kind of causal contact) with states in that environment and which were used by the system to accomplish goals and the like. CCs seem to fit the bill nicely. (We also want to allow for mediating states that are in registration not with external environmental states, but with other internal system states. This complication need not concern us here.)

3. There are a couple of interesting things to note about PP's consistent contexts. First, it appears as if Andreae has in fact added a new condition on mediating states. We said nothing about mediating states being consistently or reliably associated (via behavior) with stimuli received from the environment, although this condition is suggested by our condition that there has to be some informational connection between the mediating state and the environment. This connection imposes its own reliability constraint, especially given that we used Dretske's (1981) definition of informational content, which requires the relevant probabilities to be one (see below). Andreae makes essentially this point in his paragraph 8, where he says "[CCs] can exhibit consistency over a number of occasions only if they correspond to the outside world being in the same or equivalent states." Still, it is important to point out that internal states that are inconsistently related to received stimuli (and to behavior that results in inconsistently received stimuli) are unlikely to be relevant to cognitive processing and hence are unlikely to serve as mediating states.

4. The second thing to note about PP's consistent contexts is that they nicely exhibit the methodological point that we intended mediating states to capture: they are mediating states in good standing even though they are rather low level (indeed, Andreae says these mediating states precede representation; para. 8). Andreae's robot PURR-PUSS doesn't cogitate about complex matters at all. Yet it has mediating states. Were it to become significantly more intelligent it would have even more mediating states used at higher levels of cognition. One of our main points was that mediating states are useful at all levels of cognitive theorizing, from the perceptual processing level, up to the level of planning and logical reasoning. Mediating states are any states that warrant semantic interpretation. States at all levels of cognitive processing have this property.

5. Finally, a brief response to Andreae's point that Shannon's (1949) original definition of information would have sufficed for our purposes. Shannon's notion of information provides no way of picking out "the" informational CONTENT of a specific signal. His view only provides a way to talk about the quantity of information. This limitation poses a problem, because cognitive scientists frequently want to talk about and use the content of a signal in their theories. Indeed, Andreae talks about the content of mediating states when describing PURR-PUSS. Therefore, Shannon's definition of information has to be augmented. Dretske's (1981) extended definition of information does almost exactly what we need, so we used it. Dretske's notion of information does allow one to pick out the contents of a signal: a signal, r, carries the information that s is P if and only if the conditional probability of P(s) given that r occurs is 1. This does fit our requirements, whereas Shannon's view does not.
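Dretske's condition can be stated as a simple frequency check over a record of events. The sketch below is only illustrative (the function name, event records, and doorbell example are our hypothetical additions, not Dretske's formalism): it tests whether, in every recorded case where the signal r occurred, the condition P(s) also held, i.e. whether the observed conditional probability is exactly 1:

```python
def carries_information(events, signal, predicate):
    """Dretske-style check (illustrative only): a signal r carries the
    information that s is P iff the conditional probability of P(s)
    given that r occurs is 1 -- here, iff P(s) holds in every recorded
    event in which r occurs."""
    with_signal = [e for e in events if signal(e)]
    if not with_signal:
        return False  # the signal never occurs; no content to assess
    return all(predicate(e) for e in with_signal)

# Hypothetical event log: did the bell ring, and was the button pressed?
events = [
    {"ring": True,  "pressed": True},
    {"ring": True,  "pressed": True},
    {"ring": False, "pressed": False},
]
# The ring carries the information that the button was pressed only if
# every ring co-occurs with a press.
print(carries_information(events,
                          lambda e: e["ring"],
                          lambda e: e["pressed"]))   # True
```

Note the contrast with Shannon's framework, which would assign this signal a quantity of information (in bits) but would say nothing about which state of affairs the ring is about.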


Andreae, J. (1998). A Robot Brain with Mediating States. PSYCOLOQUY 9(59).

Dretske, F. (1981). Knowledge and the Flow of Information. Cambridge, MA: MIT/Bradford.

Markman, A.B. & Dietrich, E. (1998). In Defense of Representation as Mediation. PSYCOLOQUY 9(48).

Shannon, C. (1949). The Mathematical Theory of Communication. Urbana, IL: Univ. of Illinois Press.
