Stevan Harnad (2001) Dynamical Systems, Evolution, Grounding and Meaning. Psycoloquy: 12(049) Symbolism Connectionism (16)

PSYCOLOQUY (ISSN 1055-0143) is sponsored by the American Psychological Association (APA).

DYNAMICAL SYSTEMS, EVOLUTION, GROUNDING AND MEANING
Reply to Honavar on Harnad on Symbolism-Connectionism

Stevan Harnad
Department of Electronics and Computer Science
University of Southampton
Highfield, Southampton
SO17 1BJ
United Kingdom
http://www.cogsci.soton.ac.uk/~harnad/

harnad@cogsci.soton.ac.uk

Abstract

Symbol systems (computation) are different from dynamical systems (physics), whether or not physics proves to be truly continuous. Symbols can be grounded by learning, by evolution, or by both; but Turing Test-passing is temporal, so it requires some learning. Sensorimotor grounding does not guarantee meaning, however; it guarantees only autonomy from external interpretation.

    REPRINT OF: Harnad, S. (1993). Harnad's response to Honavar. Think
    2: 12-78 (Special Issue on "Connectionism versus Symbolism" D.M.W.
    Powers & P.A. Flach, eds.).
    http://cwis.kub.nl/~fdl/research/ti/docs/think/2-1/index.stm

1. Honavar says little that I can disagree with. For me, analog structures and processes (dynamical systems) are those that are best described as obeying differential equations rather than as implementations of implementation-independent symbol manipulations (or, as MacLennan (2001) puts it, symbolic difference equations). The difference between a real planetary system and a computer-simulated planetary system captures the distinction quite nicely. It seems to me that the final chapter of quantum mechanics (concerning the ultimate continuity or discreteness of the physical world) has nothing to do with this dichotomy, no matter how it turns out.
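To make the contrast concrete: the real planetary system obeys its differential equation continuously, whereas a computer simulation of it advances discrete symbol tokens by a difference equation that merely approximates that law. A minimal illustrative sketch (in Python; the step size, units and variable names are assumptions of mine for illustration, not anything in the exchange):

    # Discrete (symbolic) approximation of a continuous planetary orbit.
    # The real system obeys d2r/dt2 = -GM*r/|r|^3 at every instant;
    # the simulation advances finite symbol tokens in finite steps.
    GM = 1.0            # gravitational parameter (illustrative units)
    h = 0.001           # time step: the discreteness the real orbit lacks
    x, y = 1.0, 0.0     # initial position
    vx, vy = 0.0, 1.0   # initial velocity (circular orbit in these units)
    for _ in range(10000):
        r3 = (x * x + y * y) ** 1.5
        ax, ay = -GM * x / r3, -GM * y / r3   # acceleration from the law
        vx, vy = vx + h * ax, vy + h * ay     # difference equation, not
        x, y = x + h * vx, y + h * vy         # differential equation
    print(x, y)         # symbols interpretable as an orbit, not an orbit

However small the step h is made, the simulation remains an implementation-independent symbol manipulation that is merely interpretable as an orbit; the planet's trajectory is not a symbol manipulation at all.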

2. Whether symbols are grounded by learning or evolution does not much matter to my theory; I happen to focus on learned categories, but the raw input we begin with is clearly already filtered and channelled considerably by evolution. It would be incorrect (and homuncular), however, to speak of a grounded system's 'interpreting' the shapes of its symbols. If the symbols are grounded, then they are connected to, and about, whatever they are about independently of any interpretations we (outsiders) project onto them, in virtue of the system's TTT (Total Turing Test) interactions and capacity. But, as Searle (2001) points out in his commentary (and I of course agree), there may still be nobody home in the system, no mind, hence no meaning, in which case the symbols would still not really be 'about' anything at all, just, at best, TTT-connected to certain objects, events and states of affairs in the world. Grounding does not equal meaning, any more than TTT-capacity guarantees mind (Harnad 2001a,b). And there is always the further possibility that symbol grounding is a red herring, because symbol systems are a red herring, and not much of whatever really underlies mentation is computational at all. The TTT would still survive if this were the case, but 'grounding' would then just reduce to robotic 'embeddedness' and 'situatedness.'

REFERENCES

Harnad, S. (2001) Grounding symbols in the analog world with neural nets -- A hybrid model. PSYCOLOQUY 12(034) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.034

Harnad, S. (2001a) Minds, Machines, and Turing: The Indistinguishability of Indistinguishables. Journal of Logic, Language, and Information 9(4): 425-445. (special issue on "Alan Turing and Artificial Intelligence") http://cogprints.soton.ac.uk/documents/disk0/00/00/16/16/index.html

Harnad, S. (2001b) No Easy Way Out. The Sciences 41(2): 36-42. http://cogprints.soton.ac.uk/documents/disk0/00/00/16/24/index.html

Honavar, V. (2001) Continuity, learning, symbol-grounding and meaning. PSYCOLOQUY 12(048) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.048

MacLennan, B.J. (2001) Grounding analog computers. PSYCOLOQUY 12(052) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.052

Searle, J.R. (2001) The Failures of Computationalism. PSYCOLOQUY 12(060) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.060

