
B.F. SKINNER, works and life

Skinner: "Why I am Not a Cognitive Psychologist"

EXCELLENT—Skinner explains the fundamental flaw in cognitive psychology. This is the best summation of why behaviorism is superior to cognitive psychology.

 

The only special status I have concerning my own behavior is a greater knowledge of what I have done in my past.—jk

 

The negative status I have concerning my own behavior is that I have been conditioned to respond to questions concerning it; and these verbal responses are often quite misleading—jk

 

Behaviorism removes us from the god-like pedestal and places us with our ancestors, the animals--jk

 

For significant insight into human behavior, use the psychology that applies to our brothers, the animals—jk.

 

Man is a complex chicken—BF Skinner.

 

Cognitive psychology, poppycock on stilts—jk. (A paraphrase of Jeremy Bentham on natural rights.)

 

Cognitive language ought to be considered as a shorthand for the complex reinforcement contingencies—and nothing more--jk

 

Thoughts are silent whispers—BF Skinner

 

Imaging of the brain has shown that what we call “thoughts” are merely epiphenomena.

 

 

Why I Am Not a Cognitive Psychologist

 

 

The variables of which human behavior is a function lie in the environment. We distinguish between (1) the selective action of that environment during the evolution of the species, (2) its effect in shaping and maintaining the repertoire of behavior which converts each member of the species into a person, and (3) its role as the occasion upon which behavior occurs. Cognitive psychologists study these relations between organism and environment, but they seldom deal with them directly. Instead they invent internal surrogates which become the subject matter of their science.[1]

 

Take, for example, the so-called process of association. In Pavlov's experiment a hungry dog hears a bell and is then fed. If this happens many times, the dog begins to salivate when it hears the bell. The standard mentalistic explanation is that the dog "associates" the bell with the food. But it was Pavlov who associated them! "Associate" means to join or unite. The dog merely begins to salivate upon hearing the bell. We have no evidence that it does so because of an internal surrogate of the contingencies.
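
The following lines are an editorial illustration, not part of Skinner's text: a minimal Python sketch of the pairing procedure just described, in which the experimenter arranges bell-then-food trials and salivation to the bell alone grows across pairings. The incremental learning rule and all numbers are assumptions chosen only to make the arranged contingency concrete.

    # Illustrative sketch (not Pavlov's data): the experimenter arranges the
    # bell-food pairings; the dog's conditioned salivation to the bell grows.
    PAIRINGS = 30
    LEARNING_RATE = 0.15            # assumed increment per pairing
    salivation_to_bell = 0.0        # conditioned response strength, 0 to 1

    for trial in range(1, PAIRINGS + 1):
        # the arranged contingency: the bell sounds, then food is presented
        salivation_to_bell += LEARNING_RATE * (1.0 - salivation_to_bell)
        if trial % 10 == 0:
            print(f"after {trial} pairings: salivation to bell alone = {salivation_to_bell:.2f}")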

 

In the "association of ideas" the ideas are internal replicas of stimuli to which I shall return. If we have eaten lemons, we may taste lemon upon seeing a lemon or see a lemon upon tasting lemon juice, but we do not do this because we associate the flavor with the appearance. They are associated in the lemon. "Word associ­ations" are at least correctly named. If we say "home" when some­one says "house," it is not because we associate the two words but because they are associated in daily English usage. Cognitive asso­ciation is an invention. Even if it were real, it would go no further toward an explanation than the external contingencies upon which it is modeled.

 

Another example is abstraction. Consider a simple experiment. A hungry pigeon can peck any one of a number of panels bearing the names of colors—"white," "red," "blue," and so on—and the pecks are reinforced with small amounts of food. Any one of a number of objects—blocks, books, flowers, toy animals, and so on—can be seen in an adjacent space. The following contingencies are then arranged: whenever the object is white, no matter what its shape or size, pecking only the panel marked "white" is reinforced; whenever the object is red, pecking only the panel marked "red" is reinforced; and so on. Under these conditions the pigeon eventually pecks the panel marked "white" when the object is white, the panel marked "red" when the object is red, and so on. Children are taught to name colors with similar contingencies, and we all possess comparable repertoires sustained by the reinforcing practices of our verbal environments.
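
Again as an editorial illustration (not Skinner's own material), here is a minimal Python sketch of the contingencies just described: pecks on the panel whose label matches the displayed object's color are reinforced, and reinforced pecking becomes more probable in the presence of that color. The learning rule and all numbers are assumptions for illustration only.

    # Illustrative sketch of the color-abstraction contingencies: matching pecks
    # are reinforced, so each color comes to control pecking on its own panel.
    import random

    COLORS = ["white", "red", "blue"]
    # strength[color][panel]: tendency to peck a panel when an object of a given
    # color is displayed; starts flat, i.e., no stimulus control yet
    strength = {c: {p: 1.0 for p in COLORS} for c in COLORS}

    def peck(object_color):
        """Choose a panel in proportion to current response strengths."""
        panels, weights = zip(*strength[object_color].items())
        return random.choices(panels, weights=weights)[0]

    for trial in range(5000):
        object_color = random.choice(COLORS)        # any object, of any color
        panel = peck(object_color)
        if panel == object_color:                   # the arranged contingency:
            strength[object_color][panel] += 0.05   # matching pecks reinforced

    # after exposure to the contingencies, each color controls its own panel
    for c in COLORS:
        print(c, "->", max(strength[c], key=strength[c].get))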

 

But what is said to be going on in the mind? Karl Popper[i] has put a classical issue this way: "We can say either that (1) the universal term 'white' is a label attached to a set of things, or that (2) we collect the set because they share an intrinsic property of 'whiteness.'" Popper says the distinction is important; natural scientists may take the first position but social scientists must take the second. Must we say, then, that the pigeon is either attaching a universal term to a set of things or collecting a set of things because they share an intrinsic property? Clearly, it is the experimenter not the pigeon who "attaches" the white key to the white objects displayed and who collects the set of objects on which a single reinforcing event is made contingent. Should we not simply attribute the behavior to the experimental contingencies? And if so, why not for children or ourselves?[2] Behavior comes under the control of stimuli under certain contingencies of reinforcement. Special contingencies maintained by verbal communities produce "abstractions." We do attach physical labels to physical things and we collect physical objects according to labeled properties, but comparable cognitive processes are inventions which, even if real, would be no closer to an explanation than the external contingencies.

 

Another cognitive account of the same data would assert that a person, if not a pigeon, forms an abstract idea or develops a concept of color. The development of concepts is an especially popular cognitive field. (The horticultural metaphor minimizes contributions from the environment. We may hasten the growth of the mind but we are no more responsible for its final character than farmers for the character of the fruits and vegetables they so carefully nourish.) Color vision is part of the genetic endowment of most people, and it develops or grows in a physiological sense, possibly to some extent after birth. Nevertheless, most stimuli acquire control because of their place in contingencies of reinforcement. As the contingencies become more complex, they shape and maintain more complex behavior. It is the environment that develops, not a mental or cognitive possession.

 

A passage from a recent discussion of the development of sexual identity in a child might be translated as follows: "The child forms a concept based upon what it has observed and been told of what it means to be a boy or girl." (A child's behavior is affected by what it has observed and been told about being a boy or girl.) "This concept is oversimplified, exaggerated, and stereotyped." (The contingencies affecting the behavior are simplified and exaggerated and involve stereotyped behavior on the part of parents and others.) "As the child develops cognitively, its concepts, and consequently its activities, become more sophisticated and realistic." (As the child grows older, the contingencies become more subtle and more closely related to the actual sex of the child.) Children do not go around forming concepts of their sexual identity and "consequently" behaving in special ways; they slowly change their behavior as people change the ways in which they treat them because of their sex. Behavior changes because the contingencies change, not because a mental entity called a concept develops.

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

Many mentalistic or cognitive terms refer not only to contingencies but to the behavior they generate. Terms like "mind," "will," and "thought" are often simply synonyms of "behavior." A historian writes: "What may be called a stagnation of thought prevailed, as though the mind, exhausted after building up the spiritual fabric of the Middle Ages, had sunk into inertia." Exhaustion is a plausible metaphor when a quiet period follows an active one, but it was behavior that became stagnant and inert, presumably because the contingencies changed. Certain social conditions ("the spiritual fabric of the Middle Ages") made people active. A second set of conditions, possibly produced by the very behavior generated by the first, made them much less so. To understand what actually happened we should have to discover why the contingencies changed, not why thought became stagnant or inert.

 

Behavior is internalized as mental life when it is too slight to be observed by others—when, as we say, it is covert. A writer has pointed out that "the conductor of an orchestra maintains a certain even beat according to an internal rhythm, and he can divide that beat in half again and again with an accuracy rivaling any mechanical instrument." But is there an internal rhythm? Beating time is behavior. Parts of the body often serve as pendulums useful in determining speed, as when the amateur musician beats time with a foot or the rock player with the whole body, but other well-timed behavior must be learned. The conductor beats time steadily because he has learned to do so under rather exacting contingencies of reinforcement. The behavior may be reduced in scale until it is no longer visible to others. It is still sensed by the conductor, but it is a sense of behavior not of time. The history of "man's development of a sense of time" over the centuries is not a matter of cognitive growth but of the invention of clocks, calendars, and ways of keeping records—in other words, of an environment that "keeps time."

 

When a historian reports that in a given period "a wealthy, brilliant, and traditional governing class lost its will," he is reporting simply that it stopped acting like a wealthy, brilliant, and traditional governing class. Deeper changes are suggested by the term "will" but they are not identified. They could not have been changes in particular people, since the period lasted more than one lifetime. What changed were presumably the conditions affecting the behavior of members of the class. Perhaps they lost their money; perhaps competing classes became more powerful.

 

Feelings, or the bodily conditions we feel, are commonly taken as the causes of behavior. We go for a walk "because we feel like going." It is surprising how often the futility of such an explanation is recognized. A distinguished biologist, C. H. Waddington,[ii] reviewing a book by Tinbergen, writes as follows:

 

It is not clear how far he [Tinbergen] would go along with the argument of one of the most perceptive critical discussions of ethology by Suzanne Langer, who argues that each step in a complex structure of behavior is controlled, not by a hierarchical set of neural centers, but by the immediate feelings of the animal. The animal, she claims, does the next thing in the sequence, not to bring about a useful goal, or even as a move toward an enjoyable consummation, but because it actually feels like doing it at the moment.

 

Evidently Waddington himself goes along partway with this "perceptive view."

 

But suppose Langer is right. Suppose animals simply do what they feel like doing? What is the next step in explaining their behavior? Clearly, a science of animal behavior must be replaced or supplemented by a science of animal feelings. It would be as extensive as the science of behavior because there would presumably be a feeling for each act. But feelings are harder to identify and describe than the behavior attributed to them, and we should have abandoned an objective subject matter in favor of one of dubious status, accessible only through necessarily defective channels of introspection. The contingencies would be the same. The feelings and the behavior would have the same causes.

 

A British statesman recently asserted that the key to crime in the streets was "frustration." Young people mug and rob because they feel frustrated. But why do they feel frustrated? One reason may be that many of them are unemployed, either because they do not have the education needed to get jobs or because jobs are not available. To solve the problem of street crime, therefore, we must change the schools and the economy. But what role is played in all this by frustration? Is it the case that when one cannot get a job one feels frustrated and that when one feels frustrated one mugs and robs, or is it simply the case that when one cannot earn money, one is more likely to steal it—and possibly to experience a bodily condition called frustration?

Since many of the events which must be taken into account in explaining behavior are associated with bodily states that can be felt, what is felt may serve as a clue to the contingencies. But the feelings are not the contingencies and cannot replace them as causes.

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

By its very nature operant behavior encourages the invention of mental or cognitive processes said to initiate action. In a reflex, conditioned or unconditioned, there is a conspicuous prior cause. Something triggers the response. But behavior that has been positively reinforced occurs upon occasions which, though predisposing, are never compelling. The behavior seems to start up suddenly, without advance notice, as if spontaneously generated. Hence the invention of such cognitive entities as intention, purpose, or will. The same issues were debated with respect to the theory of evolution and for the same reason: selection is a special causal mode not easily observed. Because controlling circumstances which lie in an organism's history of reinforcement are obscure, the mental surrogate gets its chance. Under positive reinforcement we do, as we say, what we are free to do; hence the notion of free will as an initiating condition. (I think it was Jonathan Edwards who said that we believe in free will because we know about our behavior but not about its causes.)

 

When we do not know why people do one thing rather than another, we say that they "choose" or "make decisions." Choosing originally meant examining, scrutinizing, or testing. Etymologically, deciding means cutting off other possibilities, moving in a direction from which there is no return. Choosing and deciding are thus conspicuous forms of behavior, but cognitive psychologists have nevertheless invented internal surrogates. Anatole Rapaport[iii] puts it this way: "A subject in a psychological experiment is offered a choice among alternatives and selects one alternative over others." When this happens, he says, "common sense suggests that he is guided by a preference." Common sense does indeed suggest it, and so do cognitive psychologists, but where and what is a preference? Is it anything more than a tendency to do one thing rather than another? When we cannot tell whence the wind cometh and whither it goeth, we say that it "bloweth where it listeth," and common sense, if not cognitive psychology, thus credits it with a preference. (List, by the way, is an example of a term with a physical referent used to refer to a mental process. It means, of course, to lean—as in the list of a ship. And since things usually fall in the direction in which they are leaning, we say that people lean toward a candidate in an election as a rough way of predicting how they will vote. The same metaphor is found in "inclination"; we are "inclined" to vote for X. But it does not follow that we have internal leanings and inclinations which affect our behavior.)

 

"Intention" is a rather similar term which once meant stretch­ing. The cognitive version is a critical issue in current linguistics. Must the intention of the speaker be taken into account? In an operant analysis verbal behavior is determined by the consequences which follow in a given verbal environment, and consequences are what cognitive psychologists are really talking about when they speak of intentions. All operant behavior "stretches toward" a future even though the only consequences responsible for its strength have already occurred. I go to a drinking fountain "with the intention of getting a drink of water" in the sense that I go because in the past I have got a drink when I have done so. (I may go for the first time, following directions, but that is not an excep­tion; it is an example of rule-governed behavior, of which more later.)

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

So much for the cognitive internalization of contingencies of reinforcement and the invention of cognitive causes of behavior. Far more damaging to an effective analysis is the internalization of the environment. The Greeks invented the mind to explain how the real world could be known. For them, to know meant to be acquainted with, to be intimate with. The term cognition itself is related to coitus, as in the biblical sense in which a man is said to know a woman. Having no adequate physics of light and sound nor any chemistry of taste and odor, the Greeks could not understand how a world outside the body, possibly some distance away, could be known. There must be internal copies. Hence cognitive surrogates of the real world.

 

The distinction between reality and conscious experience has been made so often that it now seems self-evident. Fred Attneave[iv] has recently written that "the statement that the world as we know it is a representation is, I think, a truism—there is really no way in which it can be wrong." But there are at least two ways, depending upon the meaning. If the statement means that we can know only representations of the outside world, it is a "truism" only if we are not our bodies but inhabitants located somewhere inside. Our bodies are in contact with the real world and can respond to it directly, but if we are tucked away up in the head, we must be content with representations.

 

Another possible meaning is that knowing is the very process of constructing mental copies of real things, but if that is the case how do we know the copies? Do we make copies of them? And is that regress infinite? Some cognitive psychologists recognize that knowing is action but try to make the point by appealing to another mental surrogate. Knowledge is said to be "a system of propositions." According to one writer, "when we use the word 'see' we refer to a bridge between a pattern of sensory stimulation and knowledge which is propositional." But "propositional" is simply a laundered version of "behavioral," and the "bridge" is between stimuli and behavior and was built when the stimuli were part of the contingencies.

 

Representational theories of knowledge are modeled on practical behavior. We do make copies of things. We construct representational works of art, because looking at them is reinforced in much the same way as looking at what they represent. We make maps, because our behavior in following them is reinforced when we arrive at our destination in the mapped territory. But are there internal surrogates? When we daydream, do we first construct copies of reinforcing episodes which we then watch, or do we simply see things once again? And when we learn to get about in a given territory, do we construct cognitive maps which we then follow or do we follow the territory? If we follow a cognitive map, must we learn to do so, and will that require a map of the map? There is no evidence of the mental construction of images to be looked at or maps to be followed. The body responds to the world, at the point of contact; making copies would be a waste of time.

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

Knowledge is a key term in cognitive theory, and it covers a good deal of ground. It is often contrasted with perception. We are said to be able to see that there are three dots on a card but only to know that there are thirteen after counting them, even though counting is a form of behavior. After noting that one spiral can be seen to be continuous but that another can be discovered to be so only by tracing, Bela Julesz[v] has said that "any visual task that cannot be performed spontaneously, without effort or deliberation, can be regarded as a cognitive task rather than as a perceptual one," though all the steps in that example are also clearly behavioral.

 

"Knowing how to do something" is an internal surrogate of behavior in its relation to contingencies. A child learns to ride a bicycle and is then said to possess knowledge of how to ride. The child's behavior has been changed by the contingencies of rein­forcement maintained by bicycles; the child has not taken possession of the contingencies.

 

To speak of knowing about things is also to construct an internal surrogate of contingencies. We watch a football game and are then said to possess knowledge of what happened. We read a book and are said to know what it is about. The game and the book are somehow "represented" in our minds: we are "in possession of certain facts." But the evidence is simply that we can describe what happened at the game and report what the book was about. Our behavior has been changed, but there is no evidence that we have acquired knowledge. To be "in possession of the facts" is not to contain the facts within ourselves but to have been affected by them.

 

Possession of knowledge implies storage, a field in which cognitive psychologists have constructed a great many mental surrogates of behavior. The organism is said to take in and store the environment, possibly in some processed form. Let us suppose that a young girl saw a picture yesterday and, when asked to describe it today, does so. What has happened? A traditional answer would run something like this: when she saw the picture yesterday the girl formed a copy in her mind (which, in fact, was really all she saw). She encoded it in a suitable form and stored it in her memory, where it remained until today. When asked to describe the picture today, she searched her memory, retrieved the encoded copy, and converted it into something like the original picture, which she then looked at and described. The account is modeled on the physical storage of memoranda. We make copies and other records, and respond to them. But do we do anything of the sort in our minds?

 

If anything is "stored," it is behavior. We speak of the "acquisition" of behavior, but in what form is it possessed? Where is behavior when an organism is not behaving? Where at the present moment, and in what form, is the behavior I exhibit when I am listening to music, eating my dinner, talking with a friend, taking an early morning walk, or scratching an itch? A cognitive psychologist has said that verbal behavior is stored as "lexical memories." Verbal behavior often leaves public records which can be stored in files and libraries, and the metaphor of storage is therefore particularly plausible. But is the expression any more helpful than saying that my behavior in eating my dinner is stored as prandial memories, or scratching an itch as a prurient memory? The observed facts are simple enough: I have acquired a repertoire of behavior, parts of which I display upon appropriate occasions. The metaphor of storage and retrieval goes well beyond those facts.

 

The computer, together with information theory as designed to deal with physical systems, has made the metaphor of input-storage-retrieval-output fashionable. The struggle to make machines that think like people has had the effect of supporting theories in which people think like machines. Mind has recently been defined as "the system of organizations and structures ascribed to an individual that processes inputs . . . and provides output to the various subsystems and the world." But organizations and structures of what? (The metaphor gains power from the way in which it disposes of troublesome problems. By speaking of input one can forget all the travail of sensory psychology and physiology; by speaking of output one can forget all the problems of reporting and analyzing action; and by speaking of the storage and retrieval of information one can avoid all the difficult problems of how organisms are indeed changed by contact with their environments and how those changes survive.)

 

Sensory data are often said to be stored as images, much like the images said to represent the real world. Once inside, they are moved about for cognitive purposes. There is a familiar experiment on color generalization in which a pigeon pecks at a disk of, say, green light, the behavior being reinforced on a variable interval schedule. When a stable rate of responding develops, no further reinforcements are given, and the color of the disk is changed. The pigeon responds to another color at a rate which depends upon how much it differs from the original; rather similar colors evoke fairly high rates, very different colors low rates. A cognitive psychologist might explain the matter in this way: the pigeon takes in a new color (as "input"), retrieves the original color from memory, where it has been stored in some processed form, puts the two colored images side by side so that they may be easily compared, and after evaluating the difference, responds at the appropriate rate. But what advantage is gained by moving from a pigeon that responds to different colors on a disk to an inner pigeon that responds to colored images in its mind? The simple fact is that because of a known history of reinforcement, different colors control different rates.
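
As one more editorial illustration, the behavioral description Skinner prefers can be written down directly: after training on one color, each test color simply controls its own rate of responding, with no retrieval or comparison of stored images. The Gaussian-shaped gradient and the numbers below are assumptions for illustration, not data from the experiment described.

    # Illustrative sketch: response rate as a function of the test color itself.
    import math

    TRAINED_NM = 550          # wavelength of the training (green) disk
    BASE_RATE = 60.0          # assumed pecks per minute to the training color
    GRADIENT_WIDTH = 25.0     # assumed spread of stimulus control, in nm

    def response_rate(test_nm):
        # rate controlled by the test color; no inner pigeon comparing images
        diff = test_nm - TRAINED_NM
        return BASE_RATE * math.exp(-(diff ** 2) / (2 * GRADIENT_WIDTH ** 2))

    for nm in (550, 560, 580, 600, 640):
        print(f"{nm} nm: {response_rate(nm):5.1f} pecks per minute")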

 

The cognitive metaphor is based upon behavior in the real world. We store samples of material and retrieve and compare them with other samples. We compare them in the literal sense of putting them side by side to make differences more obvious. And we respond to different things in different ways. But that is all. The whole field of the processing of information can be reformulated as changes in the control exerted by stimuli.

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

The storage of practical knowledge raises another problem. When I learn, say, to take apart the rings of a puzzle, it seems unlikely that I store my knowledge of how to do so as a copy of the puzzle or of the contingencies the puzzle maintains for those trying to solve it. Instead cognitive theory holds that I store a rule. Rules are widely used as mental surrogates of behavior, in part because they can be memorized and hence "possessed," but there is an important difference between rules and the contingencies they describe. Rules can be internalized in the sense that we can say them to ourselves, but in doing so we do not internalize the contingencies.

 

I may learn to solve the puzzle in either of two ways. I may move the rings about until I hit upon a response that separates them. The behavior will be strengthened, and if I do the same thing a number of times, I will eventually be able to take the rings apart quickly. My behavior has been shaped and maintained by its effects on the rings. I may, on the other hand, simply follow printed directions supplied with the puzzle. The directions describe behavior that separates the rings, and if I have already learned to follow directions, I can avoid the possibly long process of having my behavior shaped by the contingencies.

 

Directions are rules. Like advice, warnings, maxims, proverbs, and governmental and scientific laws, they are extremely important parts of a culture, enabling people to profit from the experience of others. Those who have acquired behavior through exposure to contingencies describe the contingencies, and others then circumvent exposure by behaving in the ways described. But cognitive psychologists contend that something of the same sort happens internally when people learn directly from the contingencies. They are said to discover rules which they themselves then follow. But rules are not in the contingencies, nor must they be "known" by those who acquire behavior under exposure to them. (We are lucky that this should be so, since rules are verbal products which arose very late in the evolution of the species.)

 

The distinction between rules and contingencies is currently important in the field of verbal behavior. Children learn to speak through contact with verbal communities, possibly without instruction. Some verbal responses are effective and others not, and over a period of time more and more effective behavior is shaped and maintained. The contingencies having this effect can be analyzed. A verbal response "means" something in the sense that the speaker is under the control of particular circumstances; a verbal stimulus "means" something in the sense that the listener responds to it in particular ways. The verbal community maintains contingencies of such a nature that responses made upon particular occasions serve as useful stimuli to listeners who then behave appropriately to the occasions.

 

More complex relations among the behaviors of speaker and listener fall within the fields of syntax and grammar. Until the time of the Greeks, no one seems to have known that there were rules of grammar, although people spoke grammatically in the sense that they behaved effectively under the contingencies maintained by verbal communities, as children today learn to talk without being given rules to follow. But cognitive psychologists insist that speakers and listeners must discover rules for themselves. One authority, indeed, has defined speaking as "engaging in a rule-governed form of intentional behavior." But there is no evidence that rules play any part in the behavior of the ordinary speaker. By using a dictionary and a grammar we may compose acceptable sentences in a language we do not otherwise speak, and we may occasionally consult a dictionary or a grammar in speaking our own language, but even so we seldom speak by applying rules. We speak because our behavior is shaped and maintained by the practices of a verbal community.

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

Having moved the environment inside the head in the form of conscious experience and behavior in the form of intention, will, and choice, and having stored the effects of contingencies of reinforcement as knowledge and rules, cognitive psychologists put them all together to compose an internal simulacrum of the organism, a kind of doppelganger, not unlike the classical homunculus, whose behavior is the subject of what Piaget and others have called "subjective behaviorism." The mental apparatus studied by cognitive psychology is simply a rather crude version of contingencies of reinforcement and their effects.

 

Every so-called cognitive process has a physical model. We associate things by putting them together. We store memoranda and retrieve them for later use. We compare things by putting them side by side to emphasize differences. We discriminate things one from another by separating them and treating them in different ways. We identify objects by isolating them from confusing surroundings. We abstract sets of items from complex arrays. We describe contingencies of reinforcement in rules. These are the actions of real persons. It is only in the fanciful world of an inner person that they become mental processes.

 

The very speed with which cognitive processes are invented to explain behavior should arouse our suspicions. Moliere made a joke of a medical example more than three hundred years ago: "I am asked by the learned doctors for the cause and reason why opium puts one to sleep, to which I reply that there is in it a soporific virtue the nature of which is to lull the senses." Moliere's candidate could have cited evidence from introspection, invoking a collateral effect of the drug, by saying: "To which I reply that opium makes one feel sleepy." But the soporific virtue itself is sheer invention, and it is not without current parallels.

 

A conference was recently held in Europe on the subject of scientific creativity. A report published in Science[vi] begins by pointing out that more than ninety percent of scientific innovation has been accomplished by fewer than ten percent of all scientists. The next sentence might be paraphrased in this way: "I am asked by the learned doctors for the cause and reason why this should be so, to which I reply that it is because only a few scientists possess creativity." Similarly, "I am asked by the learned doctors for the cause and reason why children learn to talk with great speed, to which I reply that it is because they possess linguistic competence." Moliere's audiences laughed.

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

Cognitive psychologists have two answers to the charge that the mental apparatus is a metaphor or construct. One is that cognitive processes are known through introspection. Do not all thinking persons know that they think? And if behaviorists say they do not, are they not either confessing a low order of mentality or acting in bad faith for the sake of their position? No one doubts that behavior involves internal processes; the question is how well they can be known through introspection. As I have argued elsewhere, self-knowledge, consciousness, or awareness became possible only when the species acquired verbal behavior, and that was very late in its history. The only nervous systems then available had evolved for other purposes and did not make contact with the more important physiological activities. Those who see themselves thinking see little more than their perceptual and motor behavior, overt and covert. They could be said to observe the results of "cognitive processes" but not the processes themselves—a "stream of consciousness" but not what causes the streaming, the "image of a lemon" but not the act of associating appearance with flavor, their use of an abstract term but not the process of abstraction, a name recalled but not its retrieval from memory, and so on. We do not, through introspection, observe the physiological processes through which behavior is shaped and maintained by contingencies of reinforcement.

 

But physiologists observe them, and cognitive psychologists point to resemblances which suggest that they and the physiologists are talking about the same things. The very fact that cognitive processes are going on inside the organism suggests that the cognitive account is closer to physiology than the contingencies of reinforcement studied by those who analyze behavior. But if cognitive processes are simply modeled upon the environmental contingencies, the fact that they are assigned to space inside the skin does not bring them closer to a physiological account. On the contrary, the fascination with an imagined inner life has led to a neglect of the observed facts. The cognitive constructs give physiologists a misleading account of what they will find inside.

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

In summary, then, I am not a cognitive psychologist for several reasons. I see no evidence of an inner world of mental life relative either to an analysis of behavior as a function of environmental forces or to the physiology of the nervous system. The respective sciences of behavior and physiology will move forward most rapidly if their domains are correctly defined and analyzed.

 

I am equally concerned with practical consequences. The appeal to cognitive states and processes is a diversion which could well be responsible for much of our failure to solve our problems. We need to change our behavior and we can do so only by changing our physical and social environments. We choose the wrong path at the very start when we suppose that our goal is to change the "minds and hearts of men and women" rather than the world in which they live.

 

 


 



[1] Comment by JK: Compared to other mammals, the ease with which man adopts new behavior is explained by the greater role of operant conditioning and, conversely, the reduced role of respondent conditioning. This result of evolution is reflected in the structure of the brain.

[2] This observation is at the heart of Skinner's saying that "man is a complex chicken," viz., that we differ not in kind but in complexity. This is true because both man and chickens learn by both respondent and operant conditioning.



[i] Popper, K., The Poverty of Historicism. London, 1957.

[ii] Waddington, C. H., New York Review, February 3, 1974.

[iii] Rapaport, A., Experimental Games and Their Uses in Psychology. General Learning Press, 1973.

[iv] Attneave, F., American Psychologist, July 1974.

[v] Julesz, B., Scientific American, April 1975.

[vi] Science, June 21, 1974.

 
