More or Less Conscious

I posted about Integrated Information Theory (IIT) and how it strongly violates some of our intuitions about which things are conscious, and whether we can try to formalize those intuitions. Another odd thing about IIT, not necessarily a criticism, is that it claims not only to tell us whether a system is conscious, but also to measure how much consciousness the system has.

Thinking about amounts of consciousness is an odd thing. I’m comfortable saying that I am more conscious in my normal waking state than when I am under general anesthetic. I’m comfortable saying that some animals (e.g. dogs, dolphins, and monkeys) are probably more conscious than others (e.g. spiders, crickets, and snails). But when we start talking about specific values of consciousness, as IIT does, I realize how slippery and strange these notions are.

I’m not the only one. I referred in the previous post to an article by John Horgan, and he has a follow-up article with replies from other scientists and philosophers.

Here is Adam Pautz:

Indeed, it implies that the amount of consciousness in such a system is *unbounded* – since its Phi level is unbounded. My worry about this is not Aaronson’s – namely, that such predictions are counterintuitive. Rather, my point is that it is not even clear what these predictions mean. What could it even mean to say that a 2D grid might have, say, “10 times the amount” of consciousness that you have when you are fully awake? In general, I don’t yet know what proponents of IIT mean by talk of the “amount” of consciousness – a supposedly unbounded dimension of our experiences (and indeed one that has a ratio scale, on IIT, since Phi has a ratio scale). This is not yet an objection, but a request for clarification. However, if proponents of IIT cannot clearly explain what unbounded dimension they have in mind, then it becomes an objection, because it means that IIT is a theory without a clear subject matter.

Suppose I am an embodied A.I. agent with a large phi value. Then suppose my architecture is changed in some way so that my phi value doubles. How has my experience changed? Is it something like increasing the resolution of an image, i.e., do I now have a higher-resolution consciousness of some kind? Or is it “more of” consciousness, and what would that even mean? IIT doesn’t answer these questions, and doesn’t really say anything about the hard problem of consciousness.

Here is Garrett Mindt:

On the note about explaining what it means for something to have a higher/lower *amount* of consciousness, perhaps IIT points to an issue in studying consciousness, and that’s whether consciousness is an all-or-nothing type of thing or something that comes in degrees. If one thinks consciousness is all-or-nothing, then IIT will look like it must be false since different things will have varying degrees of phi. But if one thinks having consciousness is a matter of falling somewhere on a spectrum, then IIT gives you a quantifiable framework of determining where on that spectrum a particular system falls. We seem to already talk in this way when we are trying to ascribe consciousness to non-human creatures. I am reluctant in certain circumstances to ascribe human-like consciousness to some creatures, but nonetheless wouldn’t say they lack consciousness completely. Just as well it seems perfectly conceivable that human-like consciousness isn’t the end-all-be-all of the consciousness scale and I would find it very odd that on this “pale blue dot” is where consciousness reaches its pinnacle in the cosmos. Unfortunately, we are trapped in our own particular degree of consciousness and so such a differing in spectrum doesn’t seem intuitively plausible. We would have no idea what it would be like to be a higher/lower degree of phi, just as we don’t know what it’s like to be a bat (a presumably lower system of phi according to IIT)!

That’s an interesting thought: perhaps consciousness does not come in varying degrees at all. If that were true, it would pose a problem for many theories of consciousness — not just IIT. It would not, however, be a problem for a unitary form of cosmopsychism on which there is only one consciousness.