(What follows is the text of my midterm for my philosophy of mind class.)
In his article “Troubles with Functionalism”, Ned Block outlines a thought experiment that he takes to be a counterexample to functionalism, the thesis that if two or more beings are identical in how they function, they have (or are capable of having) the same mental states. In what follows I’ll argue, using a different thought experiment, that if Block’s argument refutes functionalism, it also refutes materialism for precisely the same reasons.
Block invites us to consider the population of China, which has been ordered to simulate a human brain’s “program.” (Philosophy of Mind, p. 96) Granted, this simulation—which we can call the Chinese Nation—consists of people rather than neurons, and they communicate by radio rather than by neurotransmitters, but it is functionally identical to a human brain. Nevertheless, Block thinks that your brain is phenomenally conscious—there’s something it’s like to be it—while the Chinese Nation is not. (Philosophy of Mind, p. 97) Since your brain and the Chinese Nation are functionally the same, and since Block thinks it would be absurd to say that the Chinese Nation has mental states, it cannot be true that functionally identical systems have the same mental states, and so functionalism must be false.
Suppose, now, that the Chinese government orders the population to do something different: one by one, the Chinese citizens are to be replaced with neurons. The neurons are suspended in nutrient baths and housed in small containers. Each container has a supply of the neurotransmitters that its kind of neuron normally responds to. One part of the container takes in electrical signals and releases the neurotransmitters when it receives the appropriate input. Another part takes in the neurotransmitters released by the neuron’s synaptic vesicles and generates an appropriate electrical signal as output. Suppose further that the electrical input and output signals are coded in such a way that, when connected to the radios being used by the Chinese citizens, the signals are indistinguishable from those generated by the Chinese citizens. As the citizens are gradually replaced by the neurons, there should be no change in the activity of the system. After all, in Block’s original thought experiment we supposed that the citizens mimicked the activity of individual neurons, and collectively the citizens were organized so as to mimic the behavior of the brain as a whole. There is thus no reason for the activity of the system to change in any relevant way as the citizens are replaced by the neurons, and the system should continue to implement the same program throughout the transition.
Once this process is complete, we will have a system which functions in the same way as the Chinese Nation and a normal human brain. Let’s call this system the scattered brain, for the only important difference between it and a normal brain is that its neurons are scattered throughout a larger region of space. Block has no doubt that human brains are phenomenally conscious, but he does doubt that the Chinese Nation is phenomenally conscious. (Philosophy of Mind, p. 97) But what about the scattered brain? Is it phenomenally conscious or not?
I think Block faces a dilemma. If he says that the scattered brain is phenomenally conscious, he will have to confront the difficult task of explaining why it enjoys this kind of consciousness while the Chinese Nation does not. If the idea is that one can see a priori that an entity like the Chinese Nation is just the wrong sort of thing to be phenomenally conscious, I would ask those sympathetic to this idea to consider that the Chinese Nation differs by a very small amount from the system that results from replacing one of the citizens with a neuron, which in turn differs by a very small amount from the system that results from replacing another citizen with another neuron, which in turn… until, at long last, we have the scattered brain. Now, if one can see a priori that the Chinese Nation is not phenomenally conscious, one can surely also see a priori that the system that results from replacing one of the citizens with a neuron is not phenomenally conscious, in which case one can surely also see a priori that the system that results from replacing another citizen with another neuron is not phenomenally conscious… until, at long last, one can see a priori that the scattered brain is not phenomenally conscious, contra our original supposition.
If Block still thinks that the scattered brain enjoys phenomenal consciousness, he must either accept the idea that the Chinese Nation enjoys phenomenal consciousness after all, or else accept that at some point in the series one can suddenly no longer see a priori that the system in question is not phenomenally conscious. I doubt that Block would accept the first alternative. The second alternative would be problematic for any physicalist, for then there would be two nearly identical systems that are very different mentally—one system could not be phenomenally conscious while the other one could—even though the physical difference between them is very slight.
Could the difference between the scattered brain and the Chinese Nation consist in the fact that the Chinese citizens are conscious while the individual neurons are not? But then why can a phenomenally conscious being be composed of unconscious parts, but not of conscious parts? In fact, we have reason to think that it can be composed of conscious parts, for both the right and the left hemispheres of one’s brain are plausibly phenomenally conscious, and yet one’s whole brain is phenomenally conscious too. In any case, someone who raises this objection needs to give an account of why this should be a relevant difference. And if anyone is sympathetic to the objection, I’d ask them to imagine that their brain is the scattered brain, and that the transition process is run in reverse, so that their neurons are gradually replaced by Chinese citizens. How plausible is it that one would gradually lose one’s phenomenal consciousness as this transition proceeds? If one is inclined to think that it’s not very plausible, one should also be inclined to think that whether a being has phenomenally conscious parts has no bearing on whether that being is itself phenomenally conscious.
What, though, if Block thinks that the scattered brain is not phenomenally conscious? In that case I would ask what relevant difference there could be between it and a normal brain. The scattered brain is composed of neurons, just like a normal brain. Furthermore, these neurons communicate with each other in a way that is by hypothesis functionally equivalent to the way the neurons of a normal brain communicate. The only real difference seems to be that the neurons of the scattered brain are farther away from each other than those of a normal brain. So if Block wants to raise this objection, he owes us an explanation as to why a greater spatial separation should make any difference as far as phenomenal consciousness is concerned.
I think there is one way to avoid the above dilemma, but it comes at a price. The problem is that we have taken it for granted that a normal human brain is phenomenally conscious while neither the Chinese Nation nor the scattered brain is, even though all three of them are functionally the same. Something has to give. Block thought it was functionalism, for it seemed to him that the Chinese Nation could not be phenomenally conscious. I hope to have shown that this is untenable, because if the scattered brain is phenomenally conscious, then the Chinese Nation should be too, and if a normal brain is phenomenally conscious, then there is no plausible reason why the scattered brain could not be.
The only way out that I can see, then—if one doesn’t want to accept functionalism—is to hold that neither the human brain, nor the Chinese Nation, nor the scattered brain is phenomenally conscious. Instead, one would have to accept some form of dualism and hold that each of these three systems could be said to be phenomenally conscious in the sense that they could be associated with a phenomenally conscious mind, but strictly speaking none of them could be said to be phenomenally conscious in and of itself. So while I do think it is possible to reject functionalism, I think one can only do so plausibly if one is prepared to reject materialism as well. My prediction is that, for many philosophers of mind, that is far too high a price to pay.
Block, Ned. “Troubles with Functionalism”. Excerpted from C. W. Savage, ed., Perception and Cognition (Minneapolis: University of Minnesota Press, 1978), pp. 261–325. Reprinted in Philosophy of Mind: Classical and Contemporary Readings, pp. 94–98.
Chalmers, David J., ed. Philosophy of Mind: Classical and Contemporary Readings. New York: Oxford University Press, 2002.