The Illusory Soul
A follow-on effect of functionalism that many find disturbing is that the “I” inside us is just a byproduct of the brain reacting instinctively to its environment, no differently from the way we assume it does in an earthworm or a flea.
On the subject, cognitive scientists Marvin Minsky and Steven Pinker have both argued that consciousness is an illusion constructed by the brain’s subcomponents, and as such there is no real “I” inside our heads making independent decisions. Computer scientist Anthony Simola explains in his book, “The Roving Mind: A Modern Approach to Cognitive Enhancement”: “Indeed, one of the longstanding tenets of neuroscience and philosophy is that minds are what brains do — in other words, our consciousness arises from electrical signals and there is no soul or ghost in the machine, so to speak. Consciousness, then, is a mere hallucination — we only feel like we are in charge and make free decisions, whereas in reality our decisions are dictated to us by the laws of physics and the motion of particles that were put in place 13.7 billion years ago at the birth of our universe.”
Along these lines, Simola explains that if functionalism works, it’s not because we can replace bits of your brain with technology and expect “you” to remain inside it; it’s because there’s no “you” to begin with. “What does exist — and only as an abstraction — is a pattern of continuation that can be stored somewhere as information, including your DNA, your memories, and the infinite number of iterations of your persona that followed each other.”
But the paradox posed by functionalism is that the being created when we somehow transpose a mind will actually think and feel, at least as we understand those terms. Whether or not the above is true, we’re still machines whose purpose is to think and feel, reacting to outside stimuli and deceived by the illusion of free will. Granted, this implies that the facsimile will act and feel the same.
Joel Richeimer, co-chairman of the neuroscience department at Kenyon College in Ohio, says that if we gradually replaced the necessary parts with machinery, the result wouldn’t be a human, because of the unique confluence of factors that makes something human in the first place.
“A human is a member of a species that has a specific evolutionary history,” he says. “A robot won’t be human, but maybe that’s not the question.” But he agrees that — according to functionalism — if we created a functional equivalent to a human being, placed it in a human environment and gave it the proper inputs, it would “think” and “feel” inasmuch as we know what such qualities mean.
“What is pain? It’s the function to warn us of possible tissue damage. If that’s the correct analysis of pain, then it’s theoretically possible to build a machine that experiences pain,” he says.
The Nuance Of Flesh
Which brings us to the next hurdle. Those scientists excited by the prospect of mind transplantation, artificial intelligence, and every other technology that would be enabled by the brain’s alleged similarity to the computer were dead wrong: the two are nothing alike.
Far from the neuron being a chemical-based equivalent of the electromagnetic computer bit, Greenfield reminds us that brains run on analogue gradients rather than the binary on/off states of a machine.
“An action potential is the universal signature of a neuron,” she says, “so it made it easy for people to draw parallels with on/off switches, but we know there’s a lot more going on than just the generation of an action potential.”
That means when we scan our hypothetical brain state, we have to do much more than just record whether each of the 100 billion neurons is switched on or off and which of its roughly 7,000 synaptic connections it has committed to at that instant. To move or digitize a complete consciousness, we’d have to account for every possible chemical reaction and behavior, no matter how small.
The Mind In Situ
Given the advancement of computing power, maybe we can get past the massive memory space needed to capture an individual brain state. Nevertheless, a whole new problem awaits us. The brain doesn’t exist in isolation. It both receives and sends a constant stream of communication to and from every other system in the body. One view of anatomy might be that the brain is merely a clearinghouse for disparate bits of information, with consciousness a mere foil for driving the body to meet its needs (one of the views of functionalism).
You’re hungry or sexually aroused in response to needs communicated by the body, but those urges are felt as brain states — some would say emotions — that are as much a part of your sense of self as your political views or career ambitions.
It works the other way too. Study after study confirms that married people live longer, and that religious people are more satisfied, not because of their piety, but because they feel they belong to a supportive community.
“Think of the placebo effect,” Greenfield says. “We know your mental state can change things like your immune system physically. If you’re depressed you get more ill and so on.”
Perhaps all this means is that to take a complete copy of the brain, we’d also have to scan and capture the complete state of everything connected to it (the central nervous system, reproductive system, digestive system, and so on).
If you transposed the mind state onto another body without all that, might you end up with a catastrophic system crash, because every individual body is so completely different? And not just at the microscopic cellular level: put the confident mind of a tall, strapping man into the short, pudgy body of a boy, and who knows what kind of psychosis might result?
The memories and the sense of self forming your brain state right now have been inextricably informed and constructed by everything from your upbringing to the foibles of your immune system, all of it a roiling cauldron of instincts and intents particular to your anatomical nexus.
Richeimer agrees, saying that the “meaning” of brain events depends on a working body interacting with the world. “Without it, a brain would be a bit like your desktop computer,” he says. “I can ask Google a question, but the answer would mean nothing to the computer, even if it is correct.”
All Or Nothing
All of which brings us to new territory entirely. In trying to transfer a mind and a sense of self successfully, it seems we have to take everything — every neural impulse, heartbeat, muscle tension, and skin temperature reading down to the molecular level — and make a complete copy of it, either digitally or anatomically.
It could be said that once we reach that stage of technology we’re not transplanting a mind at all but simply cloning an organism. Even if we manage to make a perfect duplicate of you with your particular experiences, memories, and sense of self, it would only be for an instant.
“Let’s just say every last molecule of you at this very moment was somehow translocated to another place,” Greenfield suggests. “As soon as that copy is in a different environment and a different place it would be a different person because the brain reacts to the environment. It will start to have a different experience from the real you.”
But is that finally a glimmer of hope in our endeavor? Does Greenfield therefore admit the copy would be living, feeling, and conscious? “By definition if it’s a simulacrum of the brain rather than a model and identical in every regard then of course it would be, there’d be no difference between the two,” she says.
This article was originally published in Brain World Magazine’s Fall 2015 issue.