Arthur B. Markman (1998) Mediating States, Information and Representation. Psycoloquy: 9(66) Representation Mediation (10)

PSYCOLOQUY (ISSN 1055-0143) is sponsored by the American Psychological Association (APA).

MEDIATING STATES, INFORMATION AND REPRESENTATION
Reply to Clapin on Representation-Mediation

Arthur B. Markman
Department of Psychology
University of Texas
Austin, TX 78712
http://www.psy.utexas.edu/psy/FACULTY/Markman/index.html

Eric Dietrich
PACCS Program in Philosophy
Binghamton University
Binghamton, NY
http://www.binghamton.edu/philosophy/home/faculty/index.htm

markman@psy.utexas.edu dietrich@binghamton.edu

Abstract

Clapin (1998) argues that our definition of mediating states will not work. His argument focuses on the idea that not all representational states carry information about their content. We disagree. Whatever the final theory of representational content turns out to be, information will have to play some role. That is the minimal condition required for a truth-functional semantics. In addition to defending mediating states, we discuss other issues raised in Clapin's interesting commentary.

Keywords

compositionality, computation, connectionism, discrete states, dynamic systems, explanation, information, meaning, mediating states, representation, rules, semantic content, symbols

1. Clapin (1998) agrees that there are likely to be many kinds of representations in cognitive systems, but he objects to the use of mediating states as a minimal condition on representation. His argument against mediating states has three primary elements. First, he points out that we (Markman & Dietrich, 1998) suggested that many AI models do not use true mediating states, and therefore do not have representations. Second, he claims that our definition of a mediating state rests solely on covariation between the representation and something in the environment. Finally, he suggests that van Gelder's (1995) example of the steam engine governor may still pose a problem for our definition of mediating states. We will discuss these issues in turn.

2. Clapin correctly notes that we stated that most AI models do not use true mediating states. For many AI models, there is a data structure in the system that is meant to play the role of a representation. However, there is frequently no set of states external to the system that the data structure corresponds to. Instead, the data structures in an AI model have a user semantics, where the programmer knows what the internal states of the program are supposed to correspond to.

3. The data structures in AI models are extraordinarily useful for learning about the computational constraints on potential cognitive processes. For example, AI models of the process of analogical mapping (e.g., Falkenhainer, Forbus, & Gentner, 1989; Hummel & Holyoak, 1997; Keane, Ledgeway, & Duff, 1994) are useful for suggesting how a process of finding correspondences between pairs of data structures can be carried out tractably. These models make assumptions about the form of mental representations (e.g., that they are predicate structures consisting of entities, attributes, and relations), but the specific data structures given to the program to operate on are not generated by any process that maintains a registration between the content of those structures and some environment external to the program. Thus, such models have been helpful for understanding complex aspects of cognitive processing, despite the fact that they do not use actual mediating states (Gentner & Markman, 1997). All this is to say that it is possible to learn from models that do not use actual mediating states, though we would likely learn even more about representation from cognitive models that actually represent (see Bickhard & Terveen, 1995, for a defense of this point).

4. Clapin's second point is that our definition of mediating states is too strong, because it requires mediating states to carry veridical content. As he points out, many theorists have noted that people may be mistaken about elements in the world. If a representation carried its content purely through some sort of covariation with objects in the world, then these kinds of mistakes could not occur.

5. We are quite explicit in our target article that it is an open question how representations come to have their content. All that mediating states require is some mechanism for keeping internal states in registration with external states (see also our earlier reply, Markman & Dietrich, 1998a, to the commentary by Terrier, 1998). A mediating state may fail to correspond to the external state to which it is supposed to be connected when the processes that provide representations with their content go awry. Mediating states can thus be in error.

6. It is worth saying a bit more about the processes that complex cognitive systems (like humans) can use to keep mediating states in synchrony with states external to the system. A cognitive system has no direct access to the outside world. Instead, it has a set of sense organs that take in stimuli. These stimuli are either consistent or inconsistent with prior inputs. Consistent inputs occur when the system has previously associated the new input with its current state, or with its current actions in the environment (see Bickhard, 1998, and Morrison, 1998, for attempts to develop a theory of representational content along these lines). Inconsistent inputs are expectation failures, which have often been suggested to drive learning (e.g., Schank, 1982).
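
To make this concrete, here is a minimal sketch in Python of the consistency check we have in mind. The names and the prediction table are our own hypothetical illustrations, not part of any model cited here; the point is only that an inconsistent input (an expectation failure) triggers revision of the stored association.

    # Hypothetical sketch of expectation checking. The names and the
    # prediction table are illustrative assumptions, not any cited model.
    predictions = {"flip_switch": "lit_room"}  # prior input-state associations

    def observe(current_state, new_input):
        expected = predictions.get(current_state)
        if expected == new_input:
            return "consistent"                 # input matches prior association
        predictions[current_state] = new_input  # expectation failure drives revision
        return "inconsistent"

    print(observe("flip_switch", "lit_room"))   # consistent
    print(observe("flip_switch", "dark_room"))  # inconsistent: expectation failure
    print(observe("flip_switch", "dark_room"))  # consistent again: association revised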

7. Mediating states are kept in coordination with the external world by attempting to maintain consistency among different mediating states that ostensibly refer to the same external state. For example, if one wakes up in the middle of the night, one may look across the room and see a tangled mess that might be interpreted as a dog. At that moment, there is some mediating state in the system that carries information that there is a dog on the floor. When a light is turned on, it may become evident that this "dog" was actually some rumpled clothes on the floor. The error in the earlier mediating state is corrected because the mediating states providing new information about the visual world are brought into coordination with the mediating states carrying information that a dog was on the floor. Thus, any mediating state in a cognitive system can only be coordinated with external states by maintaining consistency with other mediating states of the system.
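
The dog example can be restated as a toy consistency check among mediating states. The sketch below is again a hypothetical Python illustration, not a model we have implemented: it stores one state per external referent and lets a later state about the same referent, grounded in better evidence, override an earlier, mistaken one.

    # Hypothetical sketch of coordinating mediating states: a later state
    # about the same referent, grounded in better evidence, overrides an
    # earlier, mistaken one. The numbers are arbitrary illustrations.
    states = {}  # external referent -> (content, evidence quality)

    def update(referent, content, evidence):
        prior = states.get(referent)
        if prior is None or evidence >= prior[1]:
            states[referent] = (content, evidence)

    update("shape_on_floor", "dog", 0.3)      # dim light: looks like a dog
    update("shape_on_floor", "clothes", 0.9)  # light on: rumpled clothes
    print(states["shape_on_floor"][0])        # clothes: the error is corrected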

8. Finally, Clapin argues that van Gelder's steam engine governor still poses a problem for our definition of a mediating state. Bechtel (1998) provides an impressive set of arguments that the steam engine governor is not a good example of a system without representations. We will not recapitulate his argument, though we will make one point here. Clapin points out that van Gelder (1995) notes that there is no simple correlation between arm angle and steam engine speed. While that may be true, if there were no relationship between the pressure in the steam pipe and the speed of the governor, the governor would not work. It may be that there are subtle variations in the speed with which the governor spins as a function of pressure, but overall the higher the pressure in the engine, the faster the governor spins. This relationship carries information about the relative pressure in the steam pipe and is exploited by the connection between the spinning rod and the valve to keep the engine from exploding. The combination of a covariance with the environment and the ability to use that information gives the steam engine governor a (rudimentary) mediating state. Thus, while van Gelder asserts that the steam engine governor has no representations, it is not at all clear that a convincing argument can be made that the governor has no mediating states.
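
The informational point can be illustrated with a crude simulation. The loop below is a cartoon of a governor written by us in Python with arbitrary constants; it is not van Gelder's dynamical model. Spin speed covaries with pressure, and the valve exploits that covariation to regulate the engine.

    # Cartoon governor loop; the constants are arbitrary illustrations,
    # not measured values. Spin speed covaries with pressure, so it
    # carries information about pressure; the valve exploits that
    # information to regulate the engine.
    pressure = 10.0
    for step in range(20):
        spin_speed = 2.0 * pressure                    # covaries with pressure
        valve = max(0.0, 1.0 - spin_speed / 40.0)      # arms rise, valve closes
        pressure += 1.5 * valve - 0.05 * spin_speed    # steam admitted minus vented
        print(f"step {step:2d}: pressure={pressure:.2f}, speed={spin_speed:.2f}")

Run for a few steps, the pressure settles at the point where steam admitted through the valve balances steam used: the covariation does regulatory work, which is all our definition asks of a rudimentary mediating state.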

9. In summary, we think the definition of mediating states in our target article provides a basis for talking about how cognitive systems store and use information in their internal processing. Clapin raises some interesting issues about our definition (particularly concerning the ways mediating states may fall out of synchrony with the environment). Nonetheless, we believe that the core definition of a mediating state remains intact.

REFERENCES

Bechtel, W. (1998). Representations and cognitive explanations: Assessing the dynamicists' challenge in cognitive science. Cognitive Science, 22(3), 295-318.

Bickhard, M. (1998). Levels of representationality. Journal of Experimental and Theoretical Artificial Intelligence, 10, 179-215.

Bickhard, M. and Terveen, L. (1995). Foundational Issues in Artificial Intelligence and Cognitive Science: Impasse and Solution. Amsterdam: Elsevier.

Clapin, H. (1998). Information is not representation. PSYCOLOQUY 9(64). ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.64.representation-mediation.8.clapin

Falkenhainer, B., Forbus, K. D., & Gentner, D. (1989). The structure-mapping engine: Algorithm and examples. Artificial Intelligence, 41, 1-63.

Gentner, D., & Markman, A. B. (1997). Structural alignment in analogy and similarity. American Psychologist, 52, 45-56.

Hummel, J. E., & Holyoak, K. J. (1997). Distributed representations of structure: A theory of analogical access and mapping. Psychological Review, 104, 427-466.

Keane, M. T., Ledgeway, T., & Duff, S. (1994). Constraints on analogical mapping: A comparison of three models. Cognitive Science, 18, 387-438.

Markman, A. B., & Dietrich, E. (1998). In defense of representation as mediation. PSYCOLOQUY 9(48). ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.48.representation-mediation.1.markman

Markman, A. B., & Dietrich, E. (1998a). Content and form are not the same. PSYCOLOQUY 9(63). ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.63.representation-mediation.7.markman

Morrison, C. (1998). Situated representations: Toward a theory of emergent representational content. Doctoral dissertation, Department of Philosophy, Binghamton University, Binghamton, NY.

Schank, R. C. (1982). Dynamic memory. New York: Cambridge University Press.

van Gelder, T. (1995). What might cognition be, if not computation? Journal of Philosophy, 92(7), 345-381.

Terrier, P. (1998). Why is the question of content still open? PSYCOLOQUY 9(58) ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.58.representation-mediation.4.terrier

