Functionalism is also a much less rigid theory than behaviorism. Behaviorism is the theory that all states of mind can be explained by observing a person's behavior, and that a person's mind is just a set of behavioral dispositions. The problem with behaviorism is demonstrated by Rodin's Le Penseur: if there is no behavior to observe, then the mental states cannot be described.
Materialism is defined by William Lyons as the view that "…everything mental is totally inside the head". Materialism is similar to functionalism, although materialism says nothing about the relationship between the mind and the body.
Searle did not support the idea that the brain could be represented by a digital computer, or that "The implementation of an appropriate computer program is sufficient for thinking". Searle based his argument against strong A.I. on the claim that computers can only manipulate symbols and cannot understand them; computer instructions are therefore only syntactic, not semantic. The computer program operates like this:
Input (possibly from a keyboard) → software program → central processing unit (CPU) → machine language (binary) → output (possibly a screen). The CPU processes the input using logical conditions; for example, if there is a 1 in memory location a, then put a 0 in memory location b. This means that the computer has no understanding of what it is doing, and is just manipulating symbols in a brute-force manner. Searle illustrates this point when he says: "There is more to having a mind than having a formal or syntactical process."
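The purely syntactic rule described above can be sketched in a few lines of code. This is an illustration only, not anything from Searle's essay: the two-cell "memory" and the function name are invented for the example.

```python
# A hypothetical two-cell memory; the machine has no grasp of what
# "a" or "b" mean -- it only matches patterns and writes values.
memory = {"a": 1, "b": None}

def step(mem):
    # Rule from the essay: if there is a 1 in memory location a,
    # then put a 0 in memory location b.
    if mem["a"] == 1:
        mem["b"] = 0
    return mem

print(step(memory))  # {'a': 1, 'b': 0}
```

The point of the sketch is that nothing here involves meaning: the condition and the assignment are pure symbol manipulation.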
Searle uses a thought experiment called the Chinese room to illustrate his point. First he supposes that a computer program has been designed to simulate an understanding of Chinese. He then poses the question: does the computer really understand Chinese? To show that it does not, he develops the Chinese room. Searle, who has no knowledge of Chinese, is locked in a room. Inside the room are baskets filled with cards on which Chinese characters are printed. When a Chinese person slides a character under the door, Searle's job is to reply by sliding one of the symbols from one of the baskets back under the door. Searle has a list of instructions which say: if this squiggle1 comes in, then push squiggle2 out under the door. Searle has no understanding of the squiggles; he is just following the instructions. By analogy, the computer, which uses a similar type of logic, also has no understanding of what the inputs mean. Therefore, the computer is not thinking.
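The rule book in the Chinese room is, in effect, a lookup table. A minimal sketch, with "squiggle1" and "squiggle2" standing in for Chinese characters (the names and the function are illustrative, not from Searle):

```python
# The rule book: "if squiggle1 comes in, push squiggle2 out under the door."
rule_book = {"squiggle1": "squiggle2"}

def searle_in_room(incoming_card):
    # Follow the instructions mechanically; no understanding is involved.
    return rule_book.get(incoming_card, "no rule")

print(searle_in_room("squiggle1"))  # squiggle2
```

From the outside the replies may look fluent, but inside there is only table lookup, which is exactly the analogy Searle draws with the CPU.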
Computers simulate mental ability; the computer is not intelligent, even though it may appear to be doing intelligent things. One example of this was the defeat of the chess grandmaster Garry Kasparov by the computer Deep Blue. For human beings, chess requires a high level of thought. The person has to visualize and remember patterns through conscious thought, and even use psychology to try to figure out the opponent's strategy. The computer plays chess in a totally different way. The computer computes a board-position "tree" for the white and black pieces about 20 moves into the future, which means the computer's tree holds about 3 million possible moves. Then the computer uses an algorithm based on simple logic to compare the "goodness" of each move, and makes its move based on this algorithm. Therefore, just as in the Chinese room, the computer is comparing specific inputs (moves) with logical conditions.
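The essay does not name the algorithm Deep Blue used to compare the "goodness" of moves; a standard technique for this kind of tree comparison is minimax search, sketched below on a made-up two-ply toy tree (the tree and its scores are invented for illustration):

```python
def minimax(node, maximizing):
    # Leaf nodes are plain "goodness" scores; internal nodes are
    # lists of the replies available from that position.
    if isinstance(node, (int, float)):
        return node
    scores = [minimax(child, not maximizing) for child in node]
    # The computer simply compares numbers: no plans, no psychology.
    return max(scores) if maximizing else min(scores)

# Toy tree: our two candidate moves, each with two possible replies.
tree = [[3, 5], [2, 9]]
print(minimax(tree, True))  # 3 -- the best score we can guarantee
```

Like the Chinese room, the procedure is exhaustively mechanical: it matches positions against numeric conditions rather than "seeing" the board.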
The second argument Searle makes in his essay is that only biological entities can produce mental states capable of understanding or intentionality. Searle believes that brains cause minds, that is, "mental processes are caused by behavior elements in the brain". He also says that minds have semantic content: what we think, our beliefs, desires, etc., are about something. Secondly, a computer program is completely defined by its syntax, and syntax is not sufficient for semantics. This means there is no program implemented on a digital computer that will give the system the capability of having mental processes. It follows that the brain is not a digital computer. Searle then says, "as a matter of biology, brain processes cause mental states." That is, through the biological make-up of the brain, mental states are realized. Mental states such as "consciousness, intentionality, subjectivity and mental causality are all part of our biological life history". Therefore, an entity capable of mental states would have to have causal powers equivalent to those of a human brain, and so it is necessary for an entity capable of mental states to have a biological make-up.
In Margaret Boden's essay "Escaping from the Chinese Room", she argues against Searle's two main points: that programming is purely syntactic, and that non-biological machines are not capable of understanding or causality.
Boden argued that programming must have a "toe-hold" in semantics. Boden says the programmer has intentionality and is relaying meaningful information into the program. This point is shown when Boden describes programmers creating a program:
“They [programmers] can do this [program] only if they can externally specify a mapping between the formalism and matters of interest to them”.
Therefore, according to Boden, the computer must "read" the important information. Boden says that because the program is able to perform a task it must have some semantics, just as Searle understands English, which is how he is able to understand the instructions for the cards. Boden remarks:
“The inherent procedural consequences of any computer give it a toe hold in semantics”
But the compiler or interpreter program (Searle in the Chinese room) that interprets the program is again based on simple logical conditions, and can only read the specific set of instructions for which it was designed. The thinking Searle does in the room is again very different from what the CPU is doing. Boden's point is weak, and would rely on a unified programming language that both the programmer and the computer could understand.
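The reply above can be made concrete with a toy interpreter. It "understands" only a fixed, pre-defined set of opcodes and dispatches on them by simple conditional matching; the opcode names and the function are invented for this sketch, not taken from any real machine:

```python
def run(program, mem):
    # Dispatch on a fixed instruction set -- anything outside it is
    # simply unreadable, the way Searle can only follow his rule list.
    for op, *args in program:
        if op == "SET":        # SET location value
            mem[args[0]] = args[1]
        elif op == "COPY":     # COPY source destination
            mem[args[1]] = mem[args[0]]
        else:
            raise ValueError(f"unknown instruction: {op}")
    return mem

print(run([("SET", "a", 1), ("COPY", "a", "b")], {}))  # {'a': 1, 'b': 1}
```

Whatever meaning the programmer attached to SET and COPY, the interpreter itself only matches strings and moves values; the "toe-hold" in semantics stays on the programmer's side.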
Boden also replied to Searle's claim that only biological entities are capable of the causal powers needed to understand or possess intentionality. Boden uses a different definition of intentionality than Searle does:
“…an intentional system is one whose behavior we can explain predict and control only by ascribing beliefs, goals and rationality to it.”
She makes the claim that Searle's restriction that only biological machines can have intentionality is based on intuition. She says that Searle offers no more evidence that sodium pumps can produce intentionality than that beer cans can.
Boden also replied to Searle's example of a walking, talking robot. Searle said that the brain of the robot is exactly like Searle in the Chinese room, and therefore the robot does not have meaningful input and does not have the causal powers needed for understanding. Boden's point is that the brain of the robot (Searle) cannot be credited with the understanding of the robot: "computationalists do not ascribe intentionality to the brain." Boden tries to show that these intentional states are the product of the robot as a whole, including all its solenoids, photodiodes, etc. This is shown when Boden writes, "…intentional states such as these are products of people not brains." She tries to demonstrate that the robot and the human brain are the same, in that both are made up of small, very stupid parts which build into a collective brain that is capable of understanding:
“The fact that a certain light sensitive cell can respond to intensity gradients and that one neuron can inhibit the firing of another”
The problem with this explanation is that now, instead of input from a keyboard, the robot receives its information from its sensors. The input will still be processed by the programming in the robot's CPU.
Boden did not disprove Searle's point that the syntax of a program is not sufficient for understanding in a digital computer system. Boden could not show that a non-biological entity did have understanding, but she did show that Searle's argument on this point was somewhat weak because it relied on intuition. The functionalist theory was shown to be an attractive theory as long as it was not taken too literally. Consequently, this essay has shown that, due to the syntactical nature of computer programs and computers' lack of the causal powers needed for mental states, a machine cannot think.