Excerpt: "The only reality we can perceive is the one our brain allows us to. This thought has all sorts of implications to the nature of reality, and to how we define what is real..."
After a week on the origin of life and another on the origin of the universe, we now turn to the third installment of this series, a digression on the origin(s) of mind. The plural expresses the many ways in which we can think of mind and its origin. I shall touch on some of these without any hope or intention of being either exhaustive or coherent. For when it comes to mind, I confess my perplexity. And I am sure I am not alone.
First some definitions, just to start the controversy. Since I am not a cognitive psychologist or a philosopher of mind, I hope my co-bloggers Tania Lombrozo and Alva Noë will come to the rescue in due course. We play with three words, "brain," "mind" and "consciousness," and possibly a fourth, "intelligence."
Brain is easy. All vertebrate animals have one in their skulls; it's the central organ of the nervous system. An interesting but tangential question is which animal has the simplest brain. Jellyfish, for example, have diffuse nerve nets but no central nervous system. The winners are worms, which have small bundles of neurons arranged as nerve cords running along the length of their bodies.
Mind, consciousness and intelligence are hard. From a scientific perspective, all three are products of the brain. There is matter and nothing else. The question then is to figure out how the brain does it: how we can ask profound questions and write essays about them while dogs and chimps can't, even though they are arguably intelligent. There are levels of intelligence, levels of consciousness and levels of mind. So, one of the questions about the origin of mind is how it evolved to the level we see today.
In this connection, I note the recent New York Review of Books essay by John Searle on Christof Koch's book, Consciousness: Confessions of a Romantic Reductionist. Searle criticizes Koch for his attempt to use information theory indiscriminately to explain consciousness, starting by attributing it to devices such as thermostats or cellphones. Koch claims that even a photodiode is conscious: It turns on when the light is on and off when the light is off. So, its consciousness has two states (on and off) and carries only minimal information.
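To put a rough number on that claim, assuming Koch means information in Shannon's sense: a device with only two possible states carries at most log2(2) = 1 bit of information, which is the sense in which the photodiode's "consciousness" would be minimal.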
Perhaps Koch is paving the road to conscious machines, but I must agree with Searle that unless there is some level of subjective understanding of the action that is undertaken, there is no consciousness to speak of. When cellphones start chatting with one another, we should be duly impressed.
Consciousness needs a conscious observer. And that's the rub.
To facilitate things, let's say that mind is a faculty that conscious, intelligent beings have, the ability to think, feel and reflect about the world and the subjective experiences it presents. It is then legitimate to ask whether other animals have minds or whether machines can one day have them too. This is a key aspect of the debate, since the mind-body problem has traditionally split into two camps: Either mind is a property of brains that reach a certain level of cognitive complexity, and hence a state of matter; or mind is not matter, something that can't be reduced to how the brain works.
Of course, this kind of mind-matter dualism dates back at least to Descartes, and nowadays it is mostly not taken seriously, at least by cognitive neuroscientists. (See, for example, the very heated debate surrounding philosopher Thomas Nagel's latest book, Mind and Cosmos. For a good review with many references, see the contribution by Jennifer Schuessler to the New York Times. See also the powerful essay on Nagel's book by Adam Frank in these pages. Nagel goes against scientific reductionism and proposes that mind is a property of the universe, something beyond the merely quantifiable. He is not alone, even among scientists.)
Attributing some sort of teleology to the universe in order to explain mind is merely an updated version of the biblical notion that we are special creatures because we were created with a purpose. Instead, I would argue that we can be special without having been created, that what makes us special is precisely the opposite: the fact that we evolved in a universe that is pretty hostile to life, especially complex multicellular life. If you don't believe life is rare, take a look at our planetary neighbors. (Yes, there should be life elsewhere in the universe; but no, it doesn't follow that this life would have evolved to the level of intelligence we see here.)
The wonder lies not in some sort of unknowable property of the cosmos but in the fact that we do have a mind to ponder such things. The answer is within our heads, and the challenge is to find it without being able to step outside and take an objective look.
Recently, Brazilian neuroscientist Suzana Herculano-Houzel, from the Federal University of Rio de Janeiro, showed that the often-quoted figure that the human brain has about 100 billion neurons is off by some 14 billion. The proper number is around 86 billion, connected through trillions of synapses, all packed in about 2.8 pounds. How perplexing that this little bundle of nerve cells can do what it can! Noninvasive probes such as fMRI provide amazing activity maps of what goes on where as different stimuli are presented. Without getting lost in the immense maze of cognitive experiments being performed today, it is clear that the brain integrates sensory stimuli from the outside and re-creates our sense of reality from within.
The only reality we can perceive is the one our brain allows us to. This thought has all sorts of implications for the nature of reality, and for how we define what is real, something Adam touched on in his essay and that I hope to come back to sometime in the near future.
What we call the world happens inside our brains, teased from the outside or from the inside. (Dreams are worlds within, with arbitrary physical laws and narrative rules.) A key question to be answered is whether consciousness needs organic matter to sustain it or whether it can exist merely through electronic circuits. Of course, we all like to think that circuits will do it, that it is a matter of time before we build an intelligent, conscious machine. But we don't really know whether that's even possible, do we?
NOTE: I'd like to add one more book to the reading list from the first week: