These four commentaries have not questioned the basic assumptions underlying the explanation for the origin of modern human cognition and culture put forth in the target article. I outline levels at which refutation of this hypothesis can be directed, and discuss the commentators' suggestions, one of which can be extended into an argument for selection operating at the group level.
(i) Culture does not evolve (Gould 1991).
(ii) Some cultural information evolves (e.g. scientific theories, or fashion designs), but not all memories, concepts, attitudes, etc., take part (Blackmore 1999).
(iii) The transformation from pre-cultural to cultural society did not involve a genetic mutation.
(iv) The genetic mutation affected the imitation phase of the evolutionary process, rather than the variation-generating phase. In other words, the ability to imitate, rather than the ability to create, was the bottleneck. (See Blackmore 1999, and the foreword by Dawkins; this view is also supported by Dennett, pers. comm.)
(v) The genetic mutation did not affect the neuron activation threshold; it affected the variation or selection phase in some other way. That is, the creativity-as-bottleneck perspective is correct, but the autocatalytic mind version of it is not.
(vi) Complexity theory (e.g. the concept of autocatalysis) is not a useful tool for gaining insight into these issues (see Dennett 1995).
The higher up on this list the refutation lies, the greater its potential to unravel the picture I have painted. I invite those who do not accept these foundational assumptions to speak up now!
4. As was pointed out in Gabora (1999), supporters of refutation (iv) have much to explain. First, as Blackmore (1999) correctly notes, the onset of culture was associated with an increased variety of tools. Yet this actually contradicts her thesis: if imitative capacity were the bottleneck to culture, then prior to the origin of culture there would have been variation everywhere, and the onset of imitation would have funnelled this variation in the most useful directions. Variety would have decreased. The archeological evidence she cites actually supports the alternative thesis -- that creativity was the bottleneck to culture -- and the autocatalytic-mind hypothesis in my target article tries to explicate exactly how this might have come about. Second, since imitation is found in at least some animals (see Byrne & Russon (1998) and the accompanying commentary for an illuminating review), what prevents them from evolving culture? The lack of culture in animals is, on the other hand, not a problem for the creativity-as-bottleneck hypothesis, because imitative capacity remains latent until there is variation for it to work on. For example, in "Meme and Variations," a computer model of cultural evolution, when agents' ability to imitate is set to 1 and their ability to invent to 0, nothing happens (Gabora 1995). There has to be something worth imitating for imitative ability to manifest itself.
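The logic of that result can be illustrated with a minimal sketch. This is not the actual "Meme and Variations" code; the bit-string memes, toy fitness function, and global-imitation rule here are simplifying assumptions made for illustration:

```python
import random

MEME_LEN = 8  # a meme is a string of 8 possible actions (assumed encoding)

def fitness(meme):
    # toy fitness: count of "useful" actions; the real model is richer
    return sum(meme)

def run(n_agents=20, steps=50, p_invent=0.0, seed=0):
    rng = random.Random(seed)
    # every agent starts with the same empty meme: nothing worth imitating
    memes = [[0] * MEME_LEN for _ in range(n_agents)]
    for _ in range(steps):
        for i in range(n_agents):
            if rng.random() < p_invent:
                memes[i][rng.randrange(MEME_LEN)] ^= 1  # invent: flip one action
            else:
                best = max(range(n_agents), key=lambda k: fitness(memes[k]))
                if fitness(memes[best]) > fitness(memes[i]):
                    memes[i] = memes[best][:]  # imitate a strictly fitter meme
    # report (number of distinct meme variants, best fitness attained)
    return len({tuple(m) for m in memes}), max(fitness(m) for m in memes)

# full imitation, zero invention: nothing ever happens
print(run(p_invent=0.0))  # -> (1, 0): one meme variant, zero fitness
# a little invention gives imitation something to spread
print(run(p_invent=0.1))
```

With invention switched off, the imitation machinery sits idle forever; the moment even a small invention rate is introduced, fitter memes appear and imitation propagates them.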
5. That said, I will now respond more directly to the commentaries. Edmonds's (1999) suggestion that selection operates not on single concepts but on conceptual pathways is insightful, and his suggestions for how variation could arise in a conceptual pathway are intriguing. It would be nice if he could provide some concrete examples. Does the set of motor instructions that results in the execution of, say, a basketball move count as a conceptual pathway?
6. The problem Dewitte (1999) describes in paragraph 5, wherein the stream of thought becomes "frozen" or locked into a single thought, would not happen in the scenario I describe, because of habituation. That is, if exactly the same neurons are stimulated repeatedly, they all become refractory, and their impact on the stream of thought diminishes or disappears completely. However, habituation cannot prevent the mind from getting locked into one of Edmonds's looping conceptual pathways, since the neurons stimulated at each step of the loop have time to recover before the loop returns to them. Dewitte's point that behavior is the final arbiter of fitness, and her bucket-brigade-like proposal for how behavioral fitness feeds back on the conceptual network, fit nicely into the overall picture.
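The habituation argument can be made concrete with a small sketch; the fixed base activations, fatigue term, and decay rate below are assumptions for illustration, not claims about actual neural parameters:

```python
# Each concept has a fixed base activation; retrieving a concept adds
# "fatigue" (a stand-in for neurons becoming refractory) that decays over
# time, so the same thought cannot dominate awareness indefinitely.

def stream_of_thought(base, steps=10, fatigue_gain=1.0, decay=0.5):
    fatigue = [0.0] * len(base)
    retrieved = []
    for _ in range(steps):
        # the next thought is the concept with the highest effective activation
        winner = max(range(len(base)), key=lambda i: base[i] - fatigue[i])
        retrieved.append(winner)
        fatigue = [f * decay for f in fatigue]  # habituation wears off
        fatigue[winner] += fatigue_gain         # the retrieved concept tires
    return retrieved

# concept 0 has by far the strongest base activation, yet the stream moves on
print(stream_of_thought([1.0, 0.6, 0.5]))
```

Even though concept 0 would win every round on base activation alone, fatigue makes it refractory after each retrieval, so the stream keeps moving; a loop through several concepts, by contrast, lets each one recover before its turn comes again.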
7. Orsucci's (1999) suggestion that the scenario needs to be expanded to account for the emergence of icons and symbols is an excellent one. (He is probably better-equipped to undertake this than I.)
8. Orsucci also notes that a hypercube-based model of conceptual space may be too simplistic to exhibit complex phenomena such as fractal structure and self-organized criticality (SOC). This is not the case, because the hypercube I describe has some unusual properties. First, it is extremely sparse; not every state, or point on the hypercube, stores a memory. Second, it has a combinatorial structure; each state can itself comprise an information space, so that complex information is built up from simple information. This is how I see SOC entering the picture. Every time a thought evokes a concept from memory, the relationships amongst all the concepts close to it in Hamming space shift very slightly. That is, the "fabric" of conceptual space is perturbed by the act of retrieving the memory in response to certain associations. This perturbation in turn determines which concepts contribute to the content of the next memory pulled into awareness. Most such retrieval events affect the topology of the conceptual space so little that their implications are not worth considering. In those cases, memory contributes little to the next instant of awareness, and external stimuli or drives take over. Once in a while, however, the implications of a thought percolate so deeply into the conceptual network that the retrieval process continues iterating recursively for a long time. This system is reminiscent of other systems that display self-organized criticality, and one might therefore expect to find a power-law relationship between the frequency and size of recursive retrievals, or "conceptual avalanches".
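This expectation can be probed with a toy simulation. Everything here is an assumption for illustration: memories as random points in a sparse Boolean hypercube, a retrieval perturbing each of its Hamming neighbours with a fixed probability p, and p chosen so the branching ratio is near 1, the regime where SOC-like behavior is expected:

```python
import random

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def avalanche(memories, p, radius, rng):
    """Retrieve one random memory and let the perturbation spread."""
    start = rng.randrange(len(memories))
    active, seen = [start], {start}
    size = 0
    while active:
        i = active.pop()
        size += 1
        for j in range(len(memories)):
            # retrieval perturbs memories close by in Hamming space...
            if j not in seen and hamming(memories[i], memories[j]) <= radius:
                # ...each of which is itself retrieved with probability p
                if rng.random() < p:
                    seen.add(j)
                    active.append(j)
    return size

rng = random.Random(1)
n_bits, n_memories = 12, 200  # sparse: 200 memories out of 2**12 = 4096 states
memories = [tuple(rng.randrange(2) for _ in range(n_bits))
            for _ in range(n_memories)]
# p ~ 1 / (mean number of neighbours) puts the branching ratio near 1
sizes = [avalanche(memories, p=0.026, radius=4, rng=rng) for _ in range(500)]
print(max(sizes), sizes.count(1))  # mostly tiny cascades, occasional large ones
```

Plotting a histogram of `sizes` on log-log axes would be the natural next step: a roughly straight line over some range would be the power-law signature of "conceptual avalanches".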
9. Preti (1999) makes a provocative point in paragraph 13: in a social group it is sufficient for a small fraction of the individuals to carry the mutation that lowers their neuron activation threshold, because the ideas and artifacts resulting from their enhanced creativity can be transmitted, through imitation, to others. That way, few individuals have to suffer the mood swings and hallucinations that tend to accompany creativity, and everyone can enjoy the fruits of their memetic labor. (Of course, all individuals must have a threshold that is at least low enough to achieve worldview closure.) Assuming that jobs vary in the extent to which creativity enhances performance and emotional instability disrupts it, a variable neuronal activation threshold might result in a better match between the jobs to be done and the individuals who do them. The resulting society, with its self-organized division of labor, might outcompete other societies in which little variation in the neuron activation threshold exists. This may not be an argument for group selection, but it does suggest some dampening of selective pressure at the level of the individual.
Byrne, R. W. and Russon, A. (1998) Learning by imitation: a hierarchical approach. Behavioral and Brain Sciences, 21, 667-721. http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.byrne.html
Blackmore, S. (1999) The Meme Machine. Oxford University Press.
Dennett, D.C. (1995) Darwin's Dangerous Idea: Evolution and the Meanings of Life. Simon and Schuster.
Dewitte, S. (1999) What is Selected and Where is it Selected? PSYCOLOQUY 10(010). Origin Culture (3). ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1999.volume.10/psyc.99.10.010.origin-culture.3.dewitte http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?10.010
Edmonds, B. (1999) Joining the Dots: Extending the Autocatalytic Picture. PSYCOLOQUY 10(007). Origin Culture (2). ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1999.volume.10/psyc.99.10.007.origin-culture.2.emonds http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?10.007
Gabora, L. (1995) Meme and variations: A computer model of cultural evolution. In L. Nadel & D. Stein (Eds.), 1993 Lectures in Complex Systems. Addison-Wesley. http://www.vub.ac.be/CLEA/liane/MAV/mav.htm
Gabora, L. (1998) Autocatalytic Closure in a Cognitive System. PSYCOLOQUY 9(67). ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1998.volume.9/psyc.98.9.67.origin-culture.1.gabora http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?9.67
Gabora, L. (1999) To imitate is human: A review of "The Meme Machine" by Susan Blackmore. Journal of Artificial Societies and Social Simulation, 2(2). http://www.soc.surrey.ac.uk/JASSS/JASSS.html
Gould, S. J. (1991) Bully for Brontosaurus: Reflections in Natural History. W. W. Norton & Company, pp. 63-66.
Orsucci (1999) Origins of the Origins: Evolution in the Semiotic Universe. PSYCOLOQUY 10(013). ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1999.volume.10/psyc.99.10.013.origin-culture.4.orsucci http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?10.013
Preti, A. (1999) Creativity, Genetics and Mental Illness. PSYCOLOQUY 10(016). ftp://ftp.princeton.edu/pub/harnad/Psycoloquy/1999.volume.10/psyc.99.10.016.origin-culture.6.preti http://www.cogsci.soton.ac.uk/cgi/psyc/newpsy?10.016