From Key Terms in Philosophy of Mind (Continuum, 2010):
Chinese room, an argument, due to John Searle, against FUNCTIONALISM as well as certain conceptions of ARTIFICIAL INTELLIGENCE. The argument has, as a main component, the following THOUGHT EXPERIMENT: A computer program alleged by functionalists to allow a computer to conduct a conversation in Chinese is rewritten as a set of instructions in English that can be followed by John Searle even though he understands no Chinese. Searle is imagined to sit in a room in which cards with Chinese symbols emerge from one of two slots in the wall. Searle examines each incoming card and, though comprehending no Chinese, consults instructions concerning which appropriate response card should be selected and sent out of the second of the two wall slots. The essence of the Chinese room argument against functionalism is that since Searle can follow the program without understanding Chinese, functionalism is mistaken in its contention that intelligent processes such as understanding Chinese are constituted by program-following.
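To make the "program-following" picture vivid, here is a minimal sketch (my own illustration, not anything from the entry) of what a purely syntactic rule book amounts to: incoming symbol strings are matched to outgoing symbol strings by shape alone, and nothing in the procedure depends on what the symbols mean. The names RULE_BOOK and chinese_room and the sample entries are hypothetical.

    # A deliberately crude illustration of purely syntactic rule-following.
    # The "rule book" maps incoming symbol strings to outgoing symbol strings;
    # the procedure never consults what the symbols mean.

    RULE_BOOK = {
        "你好吗？": "我很好，谢谢。",        # hypothetical entries; the operator
        "你叫什么名字？": "我叫王先生。",    # only matches their shapes
    }

    def chinese_room(incoming_card: str) -> str:
        """Return the response card the instructions dictate for this input.

        The lookup proceeds by string matching alone, which is the sense in
        which the operator follows the program without understanding Chinese.
        """
        # Default reply ("Sorry, I don't understand.") for unrecognized cards.
        return RULE_BOOK.get(incoming_card, "对不起，我不明白。")

    if __name__ == "__main__":
        print(chinese_room("你好吗？"))

Of course, a literal finite table like this could never sustain a real conversation; the sketch only makes concrete what "following the program" is supposed to look like from the inside.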
One noteworthy functionalist response to the Chinese room argument has come to be known as the systems response. According to the systems response, it is not John Searle who is running the program, but a larger system, of which he is a mere proper part, that runs the program. This larger system includes, in addition to John Searle, the cards coming in and out of the slots, and the book that Searle consults when each new card comes in. According to the systems response, no threat is posed to functionalism by the possibility that John Searle can play his part without understanding Chinese. It is the whole system that runs the program and thus, according to the functionalist, the whole system is what understands Chinese.
Searle has countered against the systems response that the cards and the book are irrelevant and that it is possible, at least in theory, for John Searle to memorize the contents of the book (or its functional equivalent) and replace the cards with heard and spoken Chinese utterances. In this imagined scenario, John Searle hears a Chinese question and then, though he doesn’t understand Chinese, consults his memory of the rule book, which describes different sounds in terms of their purely auditory, nonsemantic characteristics, and Searle then produces an appropriate sound with his mouth. Now the whole system running the program does not have John Searle as a mere proper part.
Another functionalist response to the Chinese room argument is the robot response. According to the robot response, the system comprising the Chinese room does not adequately satisfy the conditions for SYMBOL GROUNDING and thus no state of the system exhibits the appropriate INTENTIONALITY for understanding Chinese. If, instead, the system comprised by the whole Chinese room and its contents were embedded in a large robot so that it could act as the robot’s brain, the states of the room-system could acquire intentionality in virtue of their relations to the rest of the robot and the robot’s relations to its environment. Such a response emphasizes the importance of embodiment for cognition. See EMBODIED COGNITION.
I think another response is that one could not produce proper answers to questions simply by referencing a table, and that the rules actually needed to generate believable responses would be complex enough that the ability to use them effectively would amount to something very much like understanding Chinese. As for Searle's second version: if he can distinguish spoken Chinese and produce appropriate Chinese responses, then at that point, no matter how he does it, it seems silly to say that he can't speak Chinese.