Stevan Harnad (2001) Software Can't Reconfigure a Computer Into a Red Herring. Psycoloquy: 12(055) Symbolism Connectionism (22)

PSYCOLOQUY (ISSN 1055-0143) is sponsored by the American Psychological Association (APA).
Psycoloquy 12(055): Software Can't Reconfigure a Computer Into a Red Herring

SOFTWARE CAN'T RECONFIGURE A COMPUTER INTO A RED HERRING
Reply to McDermott on Harnad on Symbolism-Connectionism

Stevan Harnad
Department of Electronics and Computer Science
University of Southampton
Highfield, Southampton
SO17 1BJ
United Kingdom
http://www.cogsci.soton.ac.uk/~harnad/

harnad@cogsci.soton.ac.uk

Abstract

Some physical systems are computers, some are not. Those that are can be reconfigured by their software to simulate (i.e., to be systematically interpretable as) any other physical system and its causal properties and connections. But simulated causality is not real causality. A simulated plane cannot fly in a real sky. By the same token, simulated "thoughts" are not causally connected with the things they are interpretable (by external interpreters) as being about.

    REPRINT OF: Harnad, S. (1993). Harnad's response to McDermott.
    Think 2: 12-78 (Special Issue on "Connectionism versus Symbolism"
    D.M.W. Powers & P.A. Flach, eds.).
    http://cwis.kub.nl/~fdl/research/ti/docs/think/2-1/index.stm

1. McDermott (2001) says transducers and neural nets are just two kinds of computational system. I agree about neural nets (room two, SIM), but I would be interested to know how McDermott would reconfigure his Sun to make it implement an optical transducer (as opposed to a virtual optical transducer). Connecting it to an optical transducer begs the question, of course, because that way I could 'reconfigure' it into a furnace or an airplane too, just by connecting them. The reason you can't do it otherwise is that optical transduction, heating and flight are not implementation-independent formal properties. There's more than one way to 'implement' them, to be sure, but none of the ways is computational (for they involve 'reconfiguring' matter, not just a digital computer's states).
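
To make the distinction concrete, here is a minimal Python sketch (the function, its name, and its numbers are purely illustrative assumptions, not anything from McDermott's text or mine) of what a VIRTUAL optical transducer amounts to: a mapping from one number, interpretable as photon flux, to another number, interpretable as a voltage. Nothing in it absorbs a photon or moves a charge.

    # Illustrative sketch only: a "virtual optical transducer".
    # The input is just a float that an external interpreter can read as
    # photon flux; the output is another float readable as a detector voltage.
    # No light is transduced: the transduction is simulated, not implemented.

    def virtual_photodetector(photon_flux: float, responsivity: float = 0.5) -> float:
        """Map a number interpretable as photon flux (arbitrary units)
        to a number interpretable as output voltage (arbitrary units)."""
        return responsivity * photon_flux

    # The very same symbol manipulation is equally interpretable as, say,
    # converting rainfall to runoff; the interpretation is supplied by us.
    print(virtual_photodetector(100.0))   # 50.0 -- "volts", if you so interpret it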

2. A flip-flop in a digital computer is indeed describable by a differential equation, as surely as any other analog system is (all implementational hardware is of course analog), but the computation it is performing is not. To know what that is, you need to look at the level of what the flip-flop patterns are encoding. That's implementation-independence.
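
A toy illustration of implementation-independence (the encoding conventions below are hypothetical, chosen only for the example): the same formal computation, XOR, can be read off a pair of analog voltage levels under one encoding, or off a lookup table with no voltages in it at all. The differential equations governing the flip-flop fix neither; the encoding does.

    # Illustrative sketch: the same formal computation (XOR) under two
    # different "realizations". What is being computed is fixed by the
    # encoding of states as symbols, not by the analog dynamics.

    def decode(voltage: float) -> int:
        """Hypothetical encoding convention: above 2.5 V counts as 1."""
        return 1 if voltage > 2.5 else 0

    def xor_from_voltages(v1: float, v2: float) -> int:
        # "Hardware" reading: analog levels, then the encoding is applied.
        return decode(v1) ^ decode(v2)

    XOR_TABLE = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

    def xor_from_table(b1: int, b2: int) -> int:
        # A second realization: pure lookup, no voltages anywhere.
        return XOR_TABLE[(b1, b2)]

    assert xor_from_voltages(5.0, 0.0) == xor_from_table(1, 0) == 1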

3. McDermott suggests that I am holding 'computers' and 'computation' to distinctions that are either irrelevant or untenable. If this is meant to endorse ecumenism about computation, I would refer him to my response to Dietrich (Harnad, 2001a): If computation is allowed to become sufficiently broad, 'X is/is-not Computation' becomes vacuous (including 'Cognition is Computation'). McDermott doesn't like my own candidate (interpretable symbols/manipulations) because sometimes you can't specify the symbols. Fine, let it be interpretable code then (is anyone interested in uninterpretable code?). Code that 'refers' only to its own physical implementation sounds circular. Causal connections between the code and computer-external things that it is interpretable as referring to, on the other hand, are unexceptionable (that's what my own TTT calls for), but surely that's too strong for all the virtual things a computer can do and be! (When you reconfigure a digital computer to simulate all others -- say, when you go from a virtual robot to a virtual planetary system -- are you reconfiguring the (relevant) 'causal connections' too? But surely those are wider than just the computer itself; virtual causal connections to a virtual world are not causal connections at all -- see the Cheshire cat response to Dyer: Harnad 2001b).
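
As a purely illustrative sketch (the 'robot', 'wall', and orbital reading below are my hypothetical labels, not anything in the programs under discussion), here is why virtual causal connections are not causal connections: the 'world' the virtual robot interacts with is just another variable in the same process, and relabelling the run as a planetary system changes only the external interpretation.

    # Illustrative sketch: "virtual causal connections". The "robot", its
    # "sensor reading", and the "wall" are all values inside one program;
    # the "collision" is a numeric comparison. Read the same numbers as a
    # body reversing course at an orbital boundary and nothing physical
    # changes; only the interpretation does.

    state = {"position": 0.0, "velocity": 1.0}
    WALL = 10.0   # interpretable as a wall, or as any boundary you like

    def step(s: dict, dt: float = 1.0) -> dict:
        new_pos = s["position"] + s["velocity"] * dt
        if new_pos >= WALL:               # the "collision": just a comparison
            return {"position": WALL, "velocity": -s["velocity"]}
        return {"position": new_pos, "velocity": s["velocity"]}

    for _ in range(12):
        state = step(state)
    print(state)   # nothing outside the computer was touched under either reading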

4. One can agree (as I do) that nothing essential is missing in a simulated rainstorm, but the question is: Nothing essential to what? I would say: to predicting and explaining a rainstorm, but certainly not to watering a parched field. So let's get to the point. We're not interested in rainstorms but in brainstorms: Is anything essential missing in a simulated mind? Perhaps nothing essential to predicting and explaining a mind, but certainly something, in fact everything, essential to actually BEING or HAVING a mind. Let's not just shrug this off as (something interpretable as) 'self-modeling capacity.' Perhaps the meanings of McDermott's thoughts are just something relative to an external observer, but I can assure you that mine aren't!
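
One last illustrative sketch (the rainfall figures are invented for the example): a simulated rainstorm delivers a number that is fine for prediction, and the only thing it can 'water' is another variable inside the same program.

    # Illustrative sketch: a simulated rainstorm. Good for predicting a
    # rainfall total; the "field" it waters is just another number.

    hourly_rain_mm = [2.0, 5.5, 8.0, 3.0, 0.5]       # hypothetical hourly rates

    predicted_total_mm = sum(hourly_rain_mm)          # prediction: 19.0 mm
    virtual_soil_moisture = 0.10 + 0.005 * predicted_total_mm   # a "watered" field

    print(predicted_total_mm, virtual_soil_moisture)
    # A real parched field, of course, stays exactly as dry as it was.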

REFERENCES

Harnad, S. (2001) Grounding symbols in the analog world with neural nets -- A hybrid model. PSYCOLOQUY 12(034) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.034

Harnad, S. (2001a) The ubiquity of physics (and the physics-independence of computation). PSYCOLOQUY 12(040) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.040

Harnad, S. (2001b) Computation and minds: Analog is otherwise. PSYCOLOQUY 12(043) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.043

McDermott, D. (2001) Digital computers as red herrings. PSYCOLOQUY 12(054) http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?12.054

