Summary
Despite the seeming confidence with which I wrote "Dissolving Confusion about Consciousness" and other essays on subjective experience, I still feel confused about consciousness at an emotional level. Rationally I think my reductionist viewpoint is likely to be right, but occasionally I still get a weird feeling when I ask myself the "hard problem": Why do I feel the feels of experience? What can those feels even be, anyway? And what else feels this way? This piece presents a dialogue between two parts of my brain—one arguing for the logical-seeming reductionist answers and the other raising objections on the grounds of intuitive resistance. It wends through many ideas, but it ultimately concludes with the realization that consciousness can be both an emergent physical process with fuzzy conceptual boundaries and an immanent subjective experience. Those two don't have to be in contradiction. Indeed, I don't see why consciousness would make any more sense if it were ontologically fundamental.
Introduction
I think there is something harder about the hard problem of consciousness than most related philosophical confusions. With most problems, we can realize that we feel an intuition that conflicts with a more logical understanding of the situation. But with consciousness, we're asking what "feeling an intuition" even is.
Sam Harris has noted that the hard problem of consciousness seems comparable to the hard problem of why the universe exists. In both cases, we're probing to ask "Why is this thing I apprehend (either qualia or physics) here?"
One approach is to say that the existence of these things is a brute fact. In the course "Scientific Approaches to Consciousness", Spring 2009, Lecture 6, John F. Kihlstrom compared the hard problem of consciousness with the "hard problem of air": Why is there air? There just is.[a] Likewise, we could ask: Why do neural firing patterns of certain sorts give rise to these raw feelings I experience? They just do.
This approach is fine for scientists. It's analogous to the "shut up and calculate" interpretation of quantum mechanics. But when it comes to ethics, we need more. We need to know what other things have raw feelings. Ignoring the question is thus not an option, because the set of things we care about morally may vary from very small (maybe awake non-infant biological humans) to very large (all physical operations in the universe to some degree) depending on our take on the hard problem.
So, I repeatedly query myself: What is this "subjective experience" thing that I have? I sometimes lie awake in bed asking it to myself over and over. It feels similar to bumping my head against an insoluble math problem. While I feel like I do know the "answer"—it's the reductive physicalist answer I discuss in other essays—the answer doesn't quite feel intuitive when looked at from a certain perspective.
The rest of this essay consists of a dialogue between the intuition-based side of me and the reductionist side of me. It traces some of the ideas I bat back and forth in my head when trying to make sense of this topic.
Dialogue
Reductionism: It's obvious that Cartesian dualism is wrong. Occam's razor penalizes it, and in any case, it's not more intuitive than a monist approach because a dualist soul doesn't explain qualia any better than patterns of neural firing do. We'd still have to ask: Why does this soul feel the way it does? A soul would have its own "hard problem". Using souls to explain consciousness is like using God to explain the origin of the universe. In each case, you just push the problem back a step to some other mysterious thing. I don't expect to "solve" the problem of why the universe exists. I don't even know what a solution would look like. Could I take a similar stance for consciousness? Just accept Kihlstrom's view and say that these feels are what my brain does, and that's that?
Intuition: But which parts of my brain's operations do those "feels" come from? Is it very sophisticated algorithms that only higher animals have? Is it lower-level neural operations? Is it from the nature of the underlying physics itself, in which case I should adopt a more panpsychist ethical outlook?
Reductionism: You're looking at it all wrong. The "feels" don't exist "out there". You conceptualize and attribute them. You can understand this more clearly if you picture the world as a bunch of particles moving around and yourself as a robot performing computations to classify some particle movements into categories. When I think of myself as being a robot in a physical world, I seem to dissolve most of my confusions.
Intuition: Fine. But why do these conceptualizations and attributions feel like something? You could say they're thoughts in my head, but why are the thoughts conscious? How does a conscious thought arise at a neural level? That seems just as hard as explaining a conscious feeling.
Reductionism: "Consciousness" is one of a number of constructs that your brain has a vague conception of. When you hear the word or reflect on it, you feel like you know what it means, and it triggers changes in your brain. But that doesn't make it an explicit, precise idea. Likewise, a lot of people think they know what moral realism or libertarian free will mean, and these ideas trigger certain reactions, but that doesn't mean they're well thought out. They're like stances or feelings.
Intuition: Aye, but there's the rub! You can explain moral realism or libertarian free will as feelings, but how do you explain feelings themselves? Saying that feelings are feelings doesn't help.
Reductionism: Think of it this way. There's some network in your brain that represents you, and one that represents the idea of consciousness. When you reflect on yourself, you're associating the "you" concept with the "idea of consciousness" concept. And the same can be done for other minds—associating them with the concept of consciousness. Thus, whether other minds are conscious is a sentiment that you compute in your brain. Anything can be conscious if you decide it is. That's what empathy is about. And the easiest thing to attribute consciousness to is yourself; empathy begins at home.
Intuition: Ok, but there's something more—something it's like to have experiences. I want to know whether there's something it's like to be a fly, or a blade of grass, or a rock.
Reductionism: You are what you are—a brain receiving inputs and taking actions. A rock is what it is—a collection of atoms mostly held together, sometimes tumbling down a hill. These are just different physical processes. Sure, you can try to imagine yourself as being a rock, but you are in fact not the rock, and if you fantasize about yourself in the rock's place, you'll introduce all kinds of algorithms that are specific to your brain that will muddy the thought process. You can't really imagine being a rock because you and it are just different sorts of things. Trying to simulate a rock in your head is like trying to open an Excel spreadsheet using a PacMan.exe game file. They're just not compatible. It's slightly easier to imagine yourself as other animal minds, because their mental operations are more similar to yours, but even then you're using some amount of fiction, because you are in fact not them. Even if you imagine an apple as being an orange, an apple is in fact not an orange; it's just an apple. Empathy is always an act of imagination. I'm not saying it's not morally important; it's just that it is a fiction of sorts.
Intuition: If I'm conscious because of the operations in my brain, what are the boundaries? Is it just my neurons that are conscious rather than my skin or body? Why do I only have one consciousness in my head? Or do I have many, and I just can't tell because I'm not the other ones? But it seems like I'm the most powerful one, since I have control over my body's behavior.
Reductionism: There are no "boundaries". What you call consciousness is part of physical feedback processes that just flow through physics. The parts of your brain that speak and write text like this are getting their ideas from global information and computation networks that win control of your brain. There are smaller networks in your brain doing other things, including losing elections for control, but they don't get to tell the language centers what to think or say explicitly. The text you write is dictated by the winning neural coalition. The winning coalition talks about itself as if it's a unitary consciousness because it's in control. The other activation signals can't type sentences or voluntarily lift your arms or whatever, though they still can do other important things.
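(Aside: a toy Python sketch of the "winning coalition" picture, purely illustrative and not a claim about how real brains implement this; the coalition names and activation numbers are invented for the example.)

```python
# Toy model: several hypothetical neural "coalitions" compete, and the one with
# the highest activation wins control of speech and voluntary action.
# Names and numbers are invented for illustration only.
coalitions = {
    "verbal-reflection": 0.9,
    "hunger-signals": 0.4,
    "background-motor-adjustments": 0.6,
}

def winning_coalition(activations):
    """Return the coalition with the highest activation (winner-take-all)."""
    return max(activations, key=activations.get)

winner = winning_coalition(coalitions)
print(f"'{winner}' wins and gets to dictate what the language centers say.")
# The losing coalitions keep running and doing useful work; they just can't
# type sentences or voluntarily lift the arms.
```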
Intuition: Are those lower-level components conscious too? Do they matter morally?
Reductionism: It depends on your moral intuitions. If the action-controlling parts of your brain extend sympathy to the lower-level parts, then sure, you'll begin to act in accordance with caring about them and making linguistic attributions of consciousness to them. The choice is for you to decide, though.
Intuition: You're still missing the point. All this talk about algorithms and physical operations misses the bizarreness of qualia. Like infinity or the meaning of "existence", qualia are almost incomprehensible.
Reductionism: Sure, the brain can throw itself into loops of confusion and weirdness. There are some things that, when I contemplate them, feel kind of like looking down from a 100-story building. But this doesn't mean there's something more substantive underneath. In any case, certain mental states or moods can also turn off your feelings of consciousness as being something inexplicable and open you up to being one with the underlying physical universe. Feeling weird about the hard problem can be dialed up or down in our minds.
Intuition: Fine, but you still haven't explained what the feeling in "feeling weird" is. Anyway, we still need a handle on "feelings" to make ethical assessments. If we just looked at a cold, physical universe, how would we decide what to care about? Nihilism would become overwhelming.
Reductionism: Yes, I see that. Of course, there are ethical approaches that don't require attributions of sentience to make valuations.
Intuition: Yeah, but I don't like those ethical approaches! I need to help the suffering creatures in the world, not make myself pleased with some elegant abstract valuation approach. The sentient animals and machines out there need our help!
Reductionism: That makes sense. And you feel that this sentience-based approach to ethics is the most overridingly important one? (I wonder how much your feeling that way is contingent on your development and culture.)
Intuition: Yes, I feel that helping suffering creatures is the most important—indeed, the only—thing that matters.
Reductionism: Well, then you do need to sort out what physical processes you're going to consider sentient. Good luck with that. Cheers!
Intuition: But wait. We haven't solved the problem. We're back where we started... Don't go.
Reductionism: What more is there to say?
Intuition: I still feel unsettled about the whole situation and don't know which entities to care about.
Reductionism: Look, maybe this will help. Apparently the raw-feel-ness of raw feels is something that exists—where that statement is sort of poetry written in a confused language by beings with limited powers of expression and comprehension. It's sort of like a pocket calculator trying to tell another pocket calculator that an integral in calculus can be conceptualized as pressing the "+" button a bunch of times. Whatever we're describing when we talk about qualia—if we're describing anything meaningful at all—is some fundamental property of the way physics is. It's a constant across the universe. So, it can cancel out of your ethical considerations, and you can focus your ethics on describing criteria for regarding particular parts of physics as more sentient than others.
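(Aside: to make the calculator analogy concrete, here is a minimal Python sketch of approximating an integral by "pressing the + button a bunch of times", i.e., a left Riemann sum; the example function and step count are arbitrary choices.)

```python
def integral_by_repeated_addition(f, a, b, steps=100000):
    """Approximate the integral of f from a to b by summing many small
    rectangles, i.e., by "pressing the + button" over and over."""
    dx = (b - a) / steps
    total = 0.0
    for i in range(steps):
        total += f(a + i * dx) * dx  # each loop iteration is one "+" press
    return total

# Example: the integral of x^2 from 0 to 1 is exactly 1/3.
print(integral_by_repeated_addition(lambda x: x * x, 0.0, 1.0))  # ~0.3333
```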
Intuition: Umm...
Reductionism: The world ultimately reduces to physics—quarks, leptons, etc., or probably something else more fundamental. "Consciousness" is not one of those physical primitives. Thus, consciousness must be an emergent phenomenon—a process that involves many parts working together in various ways. But if consciousness is emergent, then it must have fuzzy boundaries just like any concept does. There's no "true essence" underlying an emergent phenomenon. It just is what its components are, and we decide the boundaries of labeling which emergent phenomena belong to which categories that we make up. Consciousness must be like this. So your ethics project must be of the form of deciding for yourself what boundaries or weighting schemes you want to pick. This is not just my idea; let me quote from Susan Blackmore's review of David Papineau's Thinking about Consciousness:
our phenomenal concepts are too vague. And if the term "vague" seems vague, I can perhaps best convey his idea by using his favourite analogy - that it is similarly vague whether he, David Papineau, is really bald or not. This vagueness about consciousness, he says, condemns the inflationary materialist to "perpetual ignorance about when it is present in non-human beings." (p 197). So if we ask whether Moggy the cat, or silicon doppelgangers, are conscious, or whether octopuses really feel pain, we cannot decide, and nor could an omniscient God. His claim is not that it is vague for Moggy or for the octopus, but that our concept of consciousness is not precise enough to decide whether Moggy falls into it or not. If you remain convinced that the "spark" of consciousness must either be present or not, then you are lapsing back into dualism.
Intuition: But what if consciousness is not emergent? What if it's a true ontological primitive alongside string theory or whatever?
Reductionism: Well, then you're back to the problems with dualism. In any case, why is it less mysterious for consciousness to be an ontological primitive than an emergent property? The real weirdness is that consciousness is anything at all. But apparently it does exist. No need to keep dividing by zero in your brain, unable to comprehend it.
Intuition: What about idealist monism? Maybe consciousness is the only ontological primitive.
Reductionism: Ok, I guess that would solve the mystery of why the universe exists, because then the only problem would be why consciousness exists, since there is no external universe. Also, it seems like you'd end up reproducing a lot of the theories of physicalist monism (quantum mechanics, chemistry, biology, etc.) in your idealist framework, because the predictions you'd make about future experiences would be the same as physicalists make. So in practice your world view would look pretty similar.
Intuition: But the advantage would be that I could now say that my consciousness and other consciousnesses really exist in some ontologically absolute sense. Seeing consciousness in other minds is then no longer just my interpretation.
Reductionism: Hmm, but under idealism, do other minds "actually exist", or are they just percepts within your conscious experience?
Intuition: Let's assume a form of idealism where there are really many minds in the world.
Reductionism: Why do your conscious percepts of those minds correspond to what the minds are actually doing? What's the causal link from a mind thinking stuff to you seeing a picture of that mind in a body doing stuff?
Intuition: The laws of idealist physics specify those correlations.
Reductionism: And you can't be sure whether a given image you perceive corresponds to a conscious mind or is just an image, but you can form probability estimates based on some prior assumptions, such as that more complex minds in your visual perception are more likely to actually be minds?
Intuition: Yep.
Reductionism: Ok. I'll have to think about this more. Certainly this picture of the universe is counterintuitive to modern Western minds, but I guess so is my own reductive view. I personally still find reductionism more intuitive. Intuitively it feels confused to "reify" subjective experience the way idealism does. But it's just a different ontology. I don't know if there are major problems with it because I haven't explored it in depth. Also, I should clarify that I'm not speaking about George Berkeley's version of idealism, which involved elaborate appeals to God. Kane B gives a solid refutation of Berkeley.
Intuition: Yep. You know, there are times when I don't find physicalist monism on consciousness unintuitive. It's sort of a frame of mind one adopts. Sometimes I can just really see myself as being part of the physics of the world. I am one more of the many buzzings, hissings, and whirrings of matter in my surroundings. Meditating on this stance makes consciousness seem less weird. It also has the byproduct of making lots of the universe seem ethically relevant, because I can see the ways in which my brain operations are just some of the many goings-on of moving particles. I'm not sure what I think about this. I do feel there's a danger in extending one's sympathies too far.
Reductionism: Yeah. Though the stance we adopt is ultimately up to us, some stances feel much more compatible with the way the universe works than others. A clear example is rejecting speciesism: It's obvious that the (fuzzy) group of organisms with Homo sapiens DNA is arbitrary and doesn't carve nature at its joints. But other examples, like the one you raise, are less clear.
Intuition: It would be a problem if going too "far out" in my concern led me to neglect clear agony by animals.
Reductionism: I see.
Intuition: I wonder how culturally dependent these confusions about consciousness are. Do some other cultures not have them because they conceive of mind and body in a different way? This question seems like a fruitful intersection between anthropology and philosophy of mind.
Reductionism: I guess some cultures were/are animists, seeing souls in trees and rocks in addition to people and animals. But maybe those cultures still thought of souls as immaterial rather than part of physics? The Wikipedia article claims that "Animism encompasses the belief that there is no separation between the spiritual and physical (or material) world [...]. Animism thus rejects Cartesian dualism." If so, this seems like a relatively enlightened viewpoint, and the fact that people held it naturally suggests that maybe hard-problem confusions are not biologically inevitable?[b]
Intuition: I don't know. That's something to explore further. Anyway, thanks for the chat!
The two go to sleep. Then, in the middle of the night....
Intuition: Hey, Reductionism.
Reductionism: What? (yawn)
Intuition: I think I have better insight into our disagreement.
Reductionism: Oh?
Intuition: Yeah. You see, what had previously been hanging us up was this: You said consciousness had to be emergent and therefore it had to have conceptual boundaries, like other higher-level features of the world. I felt that, no, whether I'm conscious is something more fundamental, and it doesn't depend on some arbitrary categorization we make as to whether a given physical process is conscious or not. Now I see that we were both kind of right at the same time: Consciousness is emergent and does have fuzzy boundaries, but that doesn't have to make its subjective texture less immanent to the experiencer.
Reductionism: Interesting.
Intuition: Yeah, it was an insight that flashed into my head all of a sudden. I was feeling myself as being physics, and then I realized: "Hey, this experience could feel just as real if it were emergent as if it were somehow ontologically privileged."
Reductionism: You've stumbled on an old idea. If we remove the hairs one by one from a hairy person until he's bald, it's hard to say at what point he stopped being hairy and started being bald, but that doesn't mean we can't tell the difference between a full head of hair and being completely bald. Likewise, the transition from ice to liquid water is a continuous trajectory—water molecules may break off piece by piece, maybe some ice crystals remain icy within an emerging soup of water, etc.—but that doesn't mean we can't tell the difference between really frozen ice and warm water. The consciousness debate could be basically the same. There is something special going on with my consciousness, even though exactly where we pin it down is going to involve fuzzy judgment calls.
Intuition: Exactly. Just like water having clearly different properties from ice, my brain's operations have some clearly different dynamics from those of a rock. That's enough to account for the difference in subjective texture of the two. We don't need any fancy machinery from another dimension. In a certain poetic sense, there is something it's like to be the rock also, but it's just very different from what it's like to be me. Even to make statements like that can be confusing, because the two are so very different.
Reductionism: In a similar way, there is something "liquid-like" about ice. A lot of things, actually. They both are made of water molecules that are influencing each other by certain kinds of electrical interactions. They both can absorb heat, resist immediate impact, and contain air. They both are pulled by gravity and are used widely by humans and other animals. But there is also something different about liquid water from ice—at least, different in a fuzzy kind of way, because no one can say exactly where the dividing line is between them. We set it at 0°C, but there may still be some watery molecules and some ice crystals there. Even looking at the situation closer up, the difference between a completely free liquid water molecule and a completely bound water molecule in an ice crystal is a gradual one, because the molecule may be almost bound to the crystal but still moving around a bit.
Intuition: And of course, even ice itself has molecular vibrations at greater or lesser speed based on temperature. In a similar way, even so-called non-sentient physics may exhibit some remote properties analogous to sentience to greater or lesser degrees. (By the way, this analogy of consciousness with states of matter is not a new idea.)
Reductionism: So it seems this debate has been two blind men arguing over what the elephant is. On the one hand are those, like you, who insist that consciousness does feel like something special. On the other hand are those like me who say consciousness must have fuzzy boundaries because it's an emergent property of physics. But the two needn't be in conflict. When we see ourselves as part of physics, it becomes more intuitive that distinctions that are not ontologically absolutely different but just different-within-physics can still contain different sorts of subjective experiences. Indeed, we already can see this with the different types of subjective experiences we have depending on the states of our brains. Why should it be any more weird that we can feel conscious or not than that we can feel anger or joy? We don't insist that the difference between anger and joy needs to have an ontologically absolute boundary. But the fact that these categories are fuzzy doesn't stop anger and joy from being clearly distinguishable.
Intuition: But where does this leave your frequently criticized claim that "consciousness is whatever we define it to be"?
Reductionism: Ah, good question. I think when people talk about "consciousness" they're referring to a really complicated collection of textures. In mind-space as a whole, there may be many kinds of things that we could legitimately call consciousness, even if—to speak poetically—they feel somewhat different. We only experience a few of these varieties ourselves. I think there are some central features that people are pointing to when they insist that conscious experience is what matters, but there remains a lot of room for variation around the edges.
Intuition: I see. Here's an example to illustrate. It's clear that contemporary Iceland is more of a free democracy than Stalinist Russia. But where to draw the line between when something is a free democracy and when it's not is fuzzy, and the boundary differs between observers. Maybe some place more emphasis on voting, others on an active citizenry between elections, others on freedom of speech, and others on low levels of state-sponsored violence. There's legitimate room for debate about the relative importance of various criteria, so in a sense, a "free democracy" can mean whatever we define it as. But that doesn't stop there from being some real underlying features that we're trying to point at when we use these words.
Reductionism: Right. And this analogy suggests further why consciousness should more reasonably be seen in gradations, just as democracy should. We could imagine slowly deforming Iceland until it became Stalinist Russia. At what point did its "free democracy"-ness suddenly switch off? Nowhere in particular; it was more of a gradual change. So too it can be with consciousness.
Intuition: You would say that everything is at least vanishingly conscious, and by analogy, every country is at least vanishingly a free democracy—for instance, because there may be some moments of non-persecution and some concern by the leaders about popular sentiment, if only to avert rebellion. But other people feel this is a misuse of language: that calling Stalinist Russia "at least vanishingly" a free democracy is to misunderstand what free democracy really is. In that sense, some of the debate reflects differences in linguistic style.
Reductionism: Do you feel better now about this whole thing?
Intuition: Yes, I think I see where different camps have been divided. Those who claim that consciousness can't be just for us to decide are pointing out that they have some intuitions—maybe hard to externalize and not even fully understood—that consciousness should mean some certain kinds of things and that it wouldn't capture what they're trying to refer to if we define consciousness too differently from that. I agree with this. A lot of the problem with consciousness is that we don't have good language or concepts for elaborating what it is we're trying to point to. But that's not an issue that requires metaphysical doubt; it's just a practical limitation. It's like a dog trying to express that it has acid reflux by barking at its owner, even though the dog doesn't know what "acid" is or how to describe what's happening to it more clearly than by saying "Woof".
Reductionism: And maybe for some questions we really don't have well refined intuitions. We can point to the difference between our consciousness and our non-consciousness, as whole discrete things, just like we can distinguish modern Iceland from Stalinist Russia. But there's a vast multi-dimensional space of other ways minds can be arranged, and maybe our intuitions don't extrapolate well to those regions. For instance, consider a political system in which citizens vote for their leaders, but 100% of the population always chooses the same person, and there is no speaking or journalism. Is this a free democracy? It's just sort of different from the kinds of contrasts we see between Iceland and Stalin. Is it less of a free democracy than Iceland? But everyone gets everything he wants, so how can that be? And so on.
Intuition: Hmm, often the objection at this point is that whether it feels like something to be a mind is more fundamental and more absolute than whether a given political system is a free democracy.
Reductionism: Maybe. Perhaps the difference between consciousness and not is more closely analogous to the difference between liquid and solid. But consciousness involves a hugely complicated mix of operations, which makes me think it's probably more like the "free democracy" question. Even when we can't say we're conscious, we might still be minimally conscious in certain ways. Brains have lots of parts, not all of which are completely working or completely not working. We don't encounter these shades of gray a lot in ordinary life, so we don't have well refined intuitions about them.
Intuition: Maybe one way to picture the situation in our minds is to think of mind-space like colors of the rainbow. What we experience is white. When we're unconscious (e.g., under deep general anaesthesia), that's black. When we're very groggy, that's some shade of gray. But there can be lots of colors outside of this unidimensional spectrum from black to white. Maybe robots are orange and insects are pink. Maybe other minds are ultraviolet. And elementary physics is an extremely long-wavelength radio signal. Of course, this is all poetry—trying to convey a gist or mental attitude for how to feel about the situation rather than to describe anything literal.
Reductionism: This reminds me of the Parable of the Apple Eater. It goes as follows. A boy grew up alone on a deserted island. He survived only by eating the apples on various trees. He became facile with distinguishing "apple" from "non-apple" (e.g., rocks, sticks, seawater). Then one day, he came across an orange. He could tell it definitely wasn't an apple—after all, it was the wrong color, had a more bumpy skin, and was made of many slices on the inside. Since he only cared about apples, he decided the orange was valueless. He threw it away and then licked his fingers to clean off the juice. At that point he realized the orange tasted good. It wasn't an apple, but it did have apple-like characteristics. The boy went back to retrieve the orange and began eating oranges along with apples thereafter.
Intuition: By analogy, we're only familiar with our own kinds of minds vs. unconsciousness, but there can be a vast array of textures of consciousness in mind-space. For instance, a computer that makes all its decisions on the basis of expected-utility calculations operates very differently from a human, yet if that computer is highly intelligent, it seems plausible in a poetic sense to say it has its own kind of subjective texture, especially if it has self-monitoring systems and can make non-trivial statements about its own cognition.
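(Aside: a minimal Python sketch of what such an expected-utility machine might be doing; the actions, outcome probabilities, and utilities are made up purely for illustration.)

```python
# A minimal sketch of an agent that chooses actions by expected-utility
# calculations. The actions, outcome probabilities, and utilities below are
# invented for illustration only.
actions = {
    "take umbrella": [(0.3, 5.0), (0.7, 4.0)],    # (probability, utility) pairs
    "leave umbrella": [(0.3, -10.0), (0.7, 6.0)],
}

def expected_utility(outcomes):
    """Sum of probability-weighted utilities over the possible outcomes."""
    return sum(p * u for p, u in outcomes)

best_action = max(actions, key=lambda a: expected_utility(actions[a]))
print(best_action)  # "take umbrella": EU = 4.3 vs. 1.2 for leaving it behind
```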
Reductionism: Right. Of course, there's not a strict literal meaning for a "different texture of subjective experience", but that phrase evokes the right kind of image in our minds for how we want to mentally frame the situation. Human thought paradigms are actually always like this—we always have kind of general wisps of mental meaning that point in certain directions, enough to help us successfully navigate the world.
Intuition: It feels somewhat spiritual to think about this topic. It's no wonder that mind/body distinctions seem to be a main source of spiritual intuition. Or in the case of animism, mysticism, and Eastern religions, mind/body unifications.
Reductionism: Yes. Now, let's review: Why does subjective experience have the distinctive texture that it does?
Intuition: Well, in some sense Kihlstrom was right: That's just the way things are. Physics takes on many forms, and I am one of them. When I look at it in a certain way, that just seems to make sense. When I look at it in another way, my mind goes into a paradox mode: Wait, what do you mean this is just physics? Probably the paradox feeling results from a conflict of physical versus phenomenal stances. This is an interesting failure mode that my brain goes into. Though I shouldn't call it just a failure mode, because the phenomenal stance is a crucially important part of my moral values. Also, I feel like this resolution of physics-vs.-phenomenality reveals something about physics. Without trying to sound too woo-woo, I feel like physics is more magical, in a sense, than I had thought. Now, this is all poetry, but then again, the hard problem was always poetry. Here I am, this buzzing hive of particles, this whirlpool in physics. I do have a distinct texture, but so too water has a distinct nature from ice, and apples have a distinct nature from rocks. Why is that strange? It's not really, until our phenomenal-stance brain processes try to disturb the peace with their loud protests. Rather than protesting, the phenomenal and physical stances should be working together to write great works of literature about the nature of experience by different types of beings.
Reductionism: And what do you do with the moral question of what parts of physics to care about and how much?
Intuition: The truth is some sort of mix between "consciousness can be anything we pick" versus "there is a definite answer to what is conscious". I see it as analogous to non-realist moral uncertainty or reflective disequilibrium. There are some general, vague things we're trying to say when we talk about consciousness, in a similar way as we may have some vague ethical ideas that we start with. But our process of pointing at consciousness is crude and scientifically illiterate, and it needs to be augmented with greater understanding of how nature works. Similarly, our ethical intuitions need to be augmented with greater appreciation of the world and how other people see it. We engage in some sort of reflective-equilibrium process to come up with better, more coherent definitions of consciousness—ones that combine our naive self-pointing efforts with a fuller physical view of how the systems work. For that matter, this is always how definitions work. People start with crude examples and point at various features. Then some counterexamples come in, and people discover new objects that break down the existing categories, and so people refine their definitions.
Reductionism: Yes. Of course, the changes in definition don't alter the substance of what was being defined. But they do change how we conceptualize what was being defined, which includes both our sense of our own qualia and our attribution of them to others. These changing conceptualizations then change our moral attitudes.
Intuition: Is there a clear analogy that we can use to summarize this perspective? Something that is more transparent, so that when people get confused, they can refer back to the analogy and figure out the corresponding answer for consciousness?
Reductionism: Well, we brought one up earlier: We could compare "consciousness" with "fruitness". Animal consciousness could be like apples. Of course, apples themselves have many varieties, just like animals. And a single apple can change over the course of its life. Sophisticated digital consciousness could be like oranges. Other types of consciousness could be bananas, cherries, cranberries, and so on.
Intuition: How about tomatoes? Apple juice? Apple pie? Two apples that are fused together? A single apple seed? A stem? A fructose molecule?
Reductionism: There you go. You're getting into plenty of paradoxes that parallel those for consciousness.
Intuition: Hmm, it still feels like consciousness is more fundamental than fruitness. On the other hand, the part of my brain that generates that sentence and orders it to be typed has only a limited perspective. If it could internalize the broader diversity of mind-space, maybe it would begin to feel differently.
Reductionism: Yes. Thought experiments of slowly deforming one thing into another kind of thing can help. Chalmers's "fading qualia" argument is one example, but we can do the same for human consciousness into human non-consciousness, bunnies into insects, robots into laptops, or whatever else. We can see that the underlying nature of these things is continuous, and hence there should be some continuity in the phenomenology we attribute to them, although of course some deformations can be vastly more abrupt than others, such as when you pull the power cord out of a robot or pull the oxygen out of a human brain.
Intuition: Thanks for this conversation. It seems to have validated your previous reductionist approach and the conclusions you reached with it. At the same time, this topic feels more spiritually rich and satisfying than before, and it makes clear that understanding the empirical details of how minds work is important for making more refined pronouncements on these topics; we can't just decide from our armchairs what our intuitions about consciousness are pointing to.
Footnotes
[a] This isn't the best example, because the question of why air exists actually is answerable in a rather straightforward way. But we could keep tracing it back and eventually reach the question of why the universe exists, and this would be at least as hard as any hard problem of consciousness. Indeed, asking why anything exists seems like the most puzzling question of all. It appears that any explanation will only push the story back one step. I get chills thinking about how weird it is that things exist and are the way they are.
[b] Our thought experiments about the hard problem may be contaminated by our existing philosophical prejudices. From a humor article: "While the group was initially unable to confirm that water is H2O on Twin Earth, the results turned out to be due to contaminated research materials—one of the researchers’ minds had been contaminated by Chomskyan internalist semantics."