Stevan Harnad (2001) Transduction, Yes. Degrees of Grounding, No. Psycoloquy: 12(037) Symbolism Connectionism (4)

PSYCOLOQUY (ISSN 1055-0143) is sponsored by the American Psychological Association (APA).
Psycoloquy 12(037): Transduction, Yes. Degrees of Grounding, No.

TRANSDUCTION, YES. DEGREES OF GROUNDING, NO.
Reply to Boyle on Harnad on Symbolism-Connectionism

Stevan Harnad
Department of Electronics and Computer Science
University of Southampton
Highfield, Southampton
SO17 1BJ
United Kingdom
http://www.cogsci.soton.ac.uk/~harnad/

harnad@cogsci.soton.ac.uk

Abstract

Boyle (2001) is right that transduction is a critical component of symbol grounding. But symbols are not grounded by degrees: Either a robot can identify and interact with the objects its symbols are interpretable as referring to, or it cannot. The rest is just about the reliability and extent of the identification and interaction.

    REPRINT OF: Harnad, S. (1993). Harnad's response to Boyle. Think 2:
    12-78 (Special Issue on "Connectionism versus Symbolism", D.M.W.
    Powers & P.A. Flach, eds.).
    http://cwis.kub.nl/~fdl/research/ti/docs/think/2-1/index.stm

1. Boyle's (2001) commentary is a friendly one, so there is no point dwelling on the minor differences between us: A system that can pass the Total Turing Test (TTT) is good enough for me. As a matter of logic, transduction will have to be part of its successful functioning. How much of its transducer activity will remain analog, how much will be discretized and filtered, how much will be processed syntactically (by 'pattern-matching') -- these are all empirical questions about how that future system will actually succeed in passing the TTT. I happen to have my own hypotheses (neural nets filtering out learned invariants in the analog projection, connecting them to arbitrary symbolic names, which are then manipulated compositionally, but inheriting the nonarbitrary constraint of the grounding), and Boyle may have his. The point, however, is that just as there are no a priori degrees of passing the TTT (that is what the 'Total' ensures), there are no a priori degrees of grounding (at least not in the sense I use the word). Ungrounded symbols mean what they mean only because (within their formal system) they can be systematically interpreted as meaning what they mean. In contrast, the meanings of grounded symbol systems are grounded in the system's capacity for robotic interactions with what the symbols are about.
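For concreteness, here is a minimal toy sketch (not part of the original text, and not a claim about how Harnad's hybrid model is actually implemented) of the kind of architecture the parenthesis above gestures at: a hypothetical invariant detector (here a hard-wired threshold standing in for a trained neural net) grounds each primitive symbol, and composite symbols formed compositionally (e.g. "zebra" = "horse" & "striped", Harnad's familiar grounding example) inherit that grounding. All names, thresholds, and data are illustrative assumptions.

    from dataclasses import dataclass
    from typing import Callable, List

    Analog = List[float]  # stand-in for an analog sensory projection

    @dataclass
    class GroundedSymbol:
        name: str                           # arbitrary symbolic name
        detector: Callable[[Analog], bool]  # learned invariant filter: the symbol's grounding

        def refers_to(self, projection: Analog) -> bool:
            # Grounded because the system itself can identify the referent in
            # its (simulated) sensory projection, not because an outside
            # interpreter says so.
            return self.detector(projection)

    # Hypothetical invariant detectors; in a real system these would be
    # learned by neural nets from analog input.
    horse = GroundedSymbol("horse", lambda p: p[0] > 0.5)
    striped = GroundedSymbol("striped", lambda p: p[1] > 0.5)

    def conjoin(parts: List[GroundedSymbol], new_name: str) -> GroundedSymbol:
        # Compositional symbol manipulation that inherits the nonarbitrary
        # constraint of the grounding: the composite is satisfied only by
        # projections that every part identifies.
        return GroundedSymbol(new_name, lambda p: all(s.refers_to(p) for s in parts))

    zebra = conjoin([horse, striped], "zebra")   # "zebra" = "horse" & "striped"
    print(zebra.refers_to([0.9, 0.8]))           # True: picks out a striped horse
    print(zebra.refers_to([0.9, 0.1]))           # False: an unstriped horse

The point the sketch illustrates is the one made in the paragraph above: either the system can identify and interact with what its symbols refer to (the detectors succeed or fail), or it cannot; there are no degrees of grounding, only degrees of reliability and extent.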

2. Neither immunity to Searle's (1980) Chinese Room Argument nor TTT-groundedness can guarantee that there's somebody home in such a robot, but I happen to think they're the best we can ever hope to do, methodologically speaking. If Boyle's 'structure preserving superposition' can do a better job, all power to it. But at this point, it seems to amount to what Searle would call 'speculative neurophysiology,' whereas transduction and TTT-power have face validity.

REFERENCES

Boyle, C.F. (2001) Transduction and degree of grounding. PSYCOLOQUY 12(036) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.036

Harnad, S. (2001) Grounding symbols in the analog world with neural nets -- A hybrid model. PSYCOLOQUY 12(034) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.034

Searle, J. R. (1980) Minds, brains and programs. Behavioral and Brain Sciences 3: 417-424. http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.searle2.html http://www.bbsonline.org/documents/a/00/00/04/84/index.html

