Fred L. Bookstein (1993) Geometry as Cognition in the Natural Sciences. Psycoloquy: 4(65) Scientific Cognition (2)

PSYCOLOQUY (ISSN 1055-0143) is sponsored by the American Psychological Association (APA).

GEOMETRY AS COGNITION IN THE NATURAL SCIENCES
Commentary on Giere on Science-Cognition

Fred L. Bookstein
Center for Human Growth and Development
University of Michigan
Ann Arbor, Michigan 48109
(313) 764-2443

fred@brainmap.med.umich.edu

Abstract

The cognitive models of science surveyed in the Giere (1992) volume appear to ignore apperception of "geometrical" data -- locations and displacements in the real space of gauges or photographs. Disregard of this channel entails a serious underweighting of abductive reasoning and of the force of quantitative anomalies or surprisingly accurate predictions in accounts of the rhetoric of the natural sciences.

Keywords

Cognitive science, philosophy of science, cognitive models, artificial intelligence, computer science, cognitive neuroscience.

1. Because I am a biometrician, my customary daily tasks involve processing the records of other scientists' cognitive processes. Typically, I am sent numerical representations of real-world geometry: readings of digital or analogue gauges, or, better, interesting points or regions that my colleagues have located or traced in medical scenes. Regardless of the stereotype of the statistician, my main job as scientific collaborator is to retain the rhetoric and the geometry according to which the data were originally gathered. Right through the figures ultimately published, I exploit every trick I know, for pen or computer, to render my desktop as much like a real geometrical landscape as I can: in Latour's word, to revert to the format of the original inscriptions.

2. The cognitive processes associated with my sort of quantitative data analysis, then, are almost purely those of real-world geometry: the ordinary assessment of peaks, lines, displacements, directions, bends, or black spots scattered over a ground. There are rules, of course. Quantitative arguments carried by these specific cognitive features must be traceable back to the original inscriptions, and must be accompanied by calibrations of their precision in the same semantics. Still, the usual rhetoric of statisticians practicing in the natural sciences is dominated by the geometry of real space as that geometry was already recognized by the natural scientists gathering the data.

3. Because this analysis of the geometry of data is so routine -- because (re)cognition of real space underlies all of quantitative natural philosophy, from Clifford (1885) on through this morning's chatter about "scientific visualization" -- I expected to find some discussion of these issues in this volume (Giere 1992, 1993). That expectation was wholly frustrated. According to the index, there are no discussions of "geometry," "statistics," "quantification," "diagrams," "instruments," "gauges," "vision," "precision," "uncertainty," or "error." This can't be right. To be of any use to scientists, the cognitive modeling of science must incorporate some discussion about where numbers come from; but the cognitive modes discussed here completely omit the very ones on which I most rely. So parochial a reduction of scope cannot have been these authors' intention.

4. For example, Richard Grandy's chapter overlooks all these geometric concerns in spite of its promising title ("Information, Observation, and Measurement from the Viewpoint of a Cognitive Philosophy of Science"). "Observation and measurement are both processes by which we wrest information about nature from the world," Grandy begins, but then he quickly loses both threads, of "nature" and of the "world." Instead, he adopts the Shannon-Weaver definition of information -- "information of event Ai about event Bj" -- for which knowledge is represented in terms of probabilities rather than locations on a gauge or in a landscape. In turning to the formalisms of "communication," he leaves behind the origins of the data. What matters for natural science is not the accuracy of an instrument's reading to n binary digits, but the fact that that reading is a transformed location or displacement: that it arose from the geometry of real space. The discrepancy between this point of view and Grandy's is clear in his first example, the weather in Houston. While the report that the probability of rain is 1/8 looks like meteorological science, it is not. The forecasters actually rely on gauges of temperature, humidity, and wind, all deriving from physical continua, not discrete "events," and all ultimately verified by comparison of predictions to later observations of the same gauges.
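
Grandy's chosen formalism can be written in a single line. A standard textbook rendering of the Shannon-Weaver quantity he invokes -- the information that an event Ai carries about an event Bj -- is the log-ratio of a conditional probability to a prior one (the notation below is generic and illustrative, not a quotation from his chapter):

    $$ I(A_i ; B_j) \;=\; \log_2 \frac{P(B_j \mid A_i)}{P(B_j)} $$

Every term is the probability of a discrete event; nowhere does a location, a displacement, or a gauge reading appear.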

5. Another example of this disregard for the real roots of quantitative natural science can be found in Lindley Darden's chapter "Strategies for Anomaly Resolution." The juiciest anomalies are quantitative; since Sewall Wright, even geneticists have been checking for ever more exact agreement of theory with data. But from this discussion of lethal recessive genotypes, one might conclude that "anomalies" are merely qualitative -- the nonappearance of a thing you expected, or the converse. Examples like these systematically conceal the glorious mystery of natural science by which, once in a very great while, advances in metrology combine with strenuous control of conditions of observation to arrive at true constraints of our explanations by data. The thought-experiments of Einstein that Nersessian reviews in her own chapter actually rest on the most exquisitely quantitative null findings of the nineteenth century. It is this coercion by nature at which we most marvel, not the routine production of "representations" or any such wallpaper. Earlier in her essay, Nersessian describes Maxwell sketching lines and circles and writing equations. But Maxwell was not merely doodling, pursuing "imagistic reasoning as a species of analogical reasoning"; rather, he was contemplating points and lines in real space, real geometric images whose reliable production in the laboratory perforce constrains any symbolism of "equations." Had the shapes of Faraday's reproducible patterns of filings (Nersessian's Figure 1) not been circular, Maxwell would have made little progress with his equations. It is the geometric relevance of those equations -- the Cartesian formalism of analytic geometry -- that is the artifact, not the geometry per se.

6. Every author in the volume seems to share this refusal to acknowledge the interplay among real space, instrument readings, and differential equations in producing the crowning achievements of science. For instance, the chapters by Nowak & Thagard and by Freedman, all of them proponents of "coherence," supply two long listings of "Input to Echo" (pp. 302-307, 333-336) that include no estimates whatsoever of the precisions of observations. In the second of these examples, about "latent learning," there is no quantification suggested at all. Surely no such system of reasoning could relate to competent cognition in the contemporary natural sciences, where most of the crucial facts are geometrical and are accompanied by hard-won estimates of standard error. As the strongest form of "coherence" is that between point predictions and their measurements, the absence of any semantics of observational accuracy from a system proposed to simulate "explanatory coherence" supports Glymour's wry observation that "all the hard questions have been begged." The dominance of the quantitative natural sciences rests on their occasional ability to make surprising quantitative forecasts accurate well beyond the limits of instrumentation at the time the forecasts were made. It is this geometric notion of "accuracy" that got Voyager to Neptune; it wasn't guided there by propositional calculus.

7. This is not a new concern. Eugene Wigner (1960) meditates upon the uncanny power with which an "unreasonably effective" point prediction constrains subsequent quantitative theory, and John Platt (1964) echoes the claim in his notion of "strong inference." Precise quantitative predictions are remarkably more persuasive than any other form of comparison between theory and data, including all the logical processes reviewed in this volume. Before Kuhn wrote of anomalies, even before the Vienna positivists, there was Charles Sanders Peirce, whose omission from this volume is, in my view, its most serious flaw. Peirce's great essays of the 1870s dissect out all the main applications of logic in science. Of these, "abduction" is the most relevant:

    The surprising fact, C, is observed; 
    But if A were true, C would be a matter of course, 
    Hence, there is reason to suspect that A is true. 

8. Disciplinary boundaries have shifted in the 120 years since the recognition of this important mode of scientific reasoning. What Peirce called "logic" is now squarely at the core of the cognitive paradigm, the ostensible theme of this volume. The Peircean notion of abduction corresponds closely to the cognitive process with which most of these chapters are concerned: not the arguing, nor the testing, but suspicion followed by discovery, from Maxwell in his garden through the Wright brothers to real Purkinje cells or segregating genes. In this simple syllogism, Peirce has captured the core of the empirical program of quantitative natural science: when faced with a careful measurement that is surprising on the basis of previous knowledge, find an explanation that makes it less surprising, and test that explanation as strenuously as you can.

9. As a geodesist, Peirce meant "surprising" in its irreducibly quantitative form. Permit me a minor anachronism: "surprise" is calibrated by the number of standard errors separating your careful observation from your thoughtful prediction according to a hypothesis you might actually consider holding. What drives the logic of science (now the cognitive aspects of science), in this view, is true incompatibility of strong (precise, quantitative) data with strong (previously reliable) theory. It is this sort of anomaly, Kuhn (1961) pointed out, that keeps us up at night: measurements that come out wrong. And this notion of "wrong" is metrological, i.e., spatial: it is represented by deviations of observed displacements from theoretical displacements, in one, two, or many spatial dimensions, by too many standard errors. Point the telescope where Mercury should be, and it isn't there; measure the change in separation between two stars during the eclipse of 1919, and it is what Einstein predicted. No account of cognition in science can be valid without acknowledging the logic underlying these singularly coercive observations.
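
The anachronism can be made explicit in one line. Writing x_obs for the careful observation, x_pred for the prediction under a hypothesis one might actually consider holding, and SE for the standard error of their difference (the notation is a minimal sketch of my own, not Peirce's or Kuhn's),

    $$ z \;=\; \frac{x_{\mathrm{obs}} - x_{\mathrm{pred}}}{\mathrm{SE}} $$

a |z| of many units, from a reliable instrument, is the quantitative anomaly that keeps us up at night; a |z| near zero, for a prediction that could easily have failed, is the surprisingly accurate forecast that coerces belief.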

10. Among the examples in this volume are several drawn from psychology. None of them involve sufficient coherence between observations and theory to test alternate accounts of where belief in an explanation comes from. But there exists at least one good example of strong inference in psychology: Stanley Milgram's notorious "Obedience experiments," which demonstrated the social determination of obedience in a manner so forceful that his apparatus is on display at the Smithsonian Institution. While the understanding of this experiment among the general educated public is restricted to one stunning fact (that 26 out of the 40 subjects went up to the lethal level, in one particular simulation of a learning experiment involving punishment), the actual mass of evidence (Miller, 1986) involves the accumulation of many more of these empirical percentages, ranging from nearly zero compliance to nearly total compliance, as Milgram systematically varied the putative strength of the features of situational pressure ("scientist" in white coat, "victim" in room, "confederate" pushes the button, etc.). The match between a prior theoretically grounded rank-ordering of the "power" of the situation and the empirical trace of obedience supports Milgram's abduction with a strength typically reserved for quantitative natural science. The best presentation of this pooled finding is as a single diagram, a curve running diagonally across a rectangle of obedience by situational "pressure" (itself a metaphor for a gauge reading).
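
The logic of that match is simple enough to sketch in a few lines of code. The listing below is illustrative only: a generic check of rank agreement between a theoretical ordering of situational pressure and observed compliance, using made-up placeholder numbers rather than Milgram's percentages or Miller's tabulation.

    # A minimal sketch, not Milgram's analysis: does compliance fall
    # monotonically as the theoretically ranked "pressure" of the
    # situation is weakened?  All numbers are hypothetical placeholders.

    def spearman_rho(xs, ys):
        """Spearman rank correlation for sequences without ties."""
        def ranks(vs):
            order = sorted(range(len(vs)), key=lambda i: vs[i])
            r = [0] * len(vs)
            for rank, i in enumerate(order, start=1):
                r[i] = rank
            return r
        rx, ry = ranks(xs), ranks(ys)
        n = len(xs)
        d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
        return 1.0 - 6.0 * d2 / (n * (n * n - 1))

    # Conditions ranked a priori from strongest (1) to weakest (5)
    # situational pressure, paired with invented compliance proportions.
    pressure_rank = [1, 2, 3, 4, 5]
    compliance = [0.65, 0.48, 0.40, 0.21, 0.05]

    # rho near -1: compliance falls as pressure weakens, so the ordinal
    # prediction -- and with it the abduction -- survives the test.
    print(spearman_rho(pressure_rank, compliance))

Milgram's actual evidence is, of course, far richer than five placeholder numbers; the point is only that the force of the argument lies in an ordinal, quantitative match of this kind.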

11. Perhaps, then, the problem with this volume is a philosophical one after all: the book has described what scientists do well when they are not doing science well. Essays like these seem to miss the point of science; they reduce it to the aspects of cognitive change their authors feel competent to discuss, instead of looking to see what processes (like the reading of gauges or of photographs) natural scientists actually exploit. My own field, statistics, arose from the possibility of accurate quantitative science (Stigler, 1986), so effectively that nowadays statistics is part of what we mean by "instrumentation." In biomedicine as in the physical sciences, the greatest strides of natural science are inseparable from advances in quantitative instrumentation, and particularly from the habit such advances have of forcing experts to acknowledge anomalies as precision is sharpened.

12. I would not claim that this is all that matters in the cognitive philosophy of science. But a "model of science" in which there is no role for the geometry of instrumental readings, for the rhetorical force of getting a surprising answer from an instrument, cannot account for the most important theme of science itself, the very core of the discipline. I cannot expect any program for bridging cognitive science and philosophy of science to succeed unless it incorporates the crucial role of real geometry (lines, heights, gray blobs, the celestial sphere) in embodying what we understand and in forcing us to acknowledge what we do not understand.

REFERENCES

Clifford, W. K. (1885). The Common Sense of the Exact Sciences. New York: Appleton.

Giere, R.N. (1993). Precis of: Cognitive Models of Science. PSYCOLOQUY 4(56) scientific-cognition.1.giere.

Giere, R.N. (1992). Cognitive Models of Science. Minnesota Studies in the Philosophy of Science, volume 15. Minneapolis: University of Minnesota Press.

Kuhn, T. S. (1961). The Function of Measurement in Modern Physical Science. In: Woolf, H. (ed.) Quantification. Indianapolis: Bobbs-Merrill, pp. 31-63.

Miller, A. (1986). The Obedience Experiments. New York: Praeger.

Platt, J. (1964). Strong Inference. Science 146:347-353.

Stigler, S. M. (1986). The History of Statistics: The Measurement of Uncertainty Before 1900. Cambridge, MA: Harvard University Press.

Wigner, E. P. (1960). The Unreasonable Effectiveness of Mathematics in the Natural Sciences. Communications on Pure and Applied Mathematics 13(1): 1-14. Reprinted in Symmetries and Reflections. Bloomington: Indiana University Press, 1967, pp. 222-237.

