You are viewing the historical archive of The Philosophy Forum.
For current discussions, visit the live forum.

Do Chalmers' Zombies beg the question?

Pantagruel October 24, 2021 at 13:22 11075 views 110 comments
I'm re-reading Chalmers' The Conscious Mind and, while I agree with his position, I find his arguments about logical supervenience lacking.

Chalmers contends that conscious experience does not supervene logically on the physical, so mind is not derivable from matter which leads to his position of naturalistic dualism. He relies heavily on the case of the phenomenal zombie, which is functionally and psychologically identical to him, but has no phenomenal experience.

To me, this seems a very strained example, and I think Chalmers highlights the limitations of this example himself when he says, "My zombie twin does not have any conscious experience, but he claims that he does" (p. 174).

If zombie-consciousness is devoid of phenomenality, what possible set of conditions could give rise to the zombie asserting phenomenality? Isn't this a petitio principii?

Comments (110)

bongo fury October 24, 2021 at 14:42 #611116
Quoting Pantagruel
If zombie-consciousness is devoid of phenomenality, what possible set of conditions could give rise to the zombie asserting phenomenality?


Are we to exclude deliberate deception? If so, how about innocent confusion?
Michael October 24, 2021 at 14:58 #611127
Quoting Pantagruel
If zombie-consciousness is devoid of phenomenality, what possible set of conditions could give rise to the zombie asserting phenomenality?


Brain activity that triggers the vocalization of the expression "I am conscious." There's nothing special about those words. "I am conscious" is no more an indicator of consciousness than "one plus one equals two." They're just sounds that can result from mechanical operations.
dimosthenis9 October 24, 2021 at 15:05 #611132
Quoting Pantagruel
He relies heavily on the case of the phenomenal zombie, which is functionally and psychologically identical to him, but has no phenomenal experience.


Well, as long as you say that he rests his case on that, I don't think there is much to discuss here. He starts his position from a totally wrong base, so his outcome can only be wrong too.

Functionally and psychologically identical presupposes human phenomenal experience. They come as a package; you can't have the one without the other. However convenient it might be for him to build his case on, it still can never be right.
Pantagruel October 24, 2021 at 15:10 #611135
Reply to Michael
Quoting dimosthenis9
However convenient it might be for him to build his case on, it still can never be right.


But it is still possible to come to the right conclusion for the wrong reason.

Quoting bongo fury
Are we to exclude deliberate deception? If so, how about innocent confusion?


I've been thinking along these lines. One difference must be that the actually conscious being can know that it is conscious (in the strong sense), while the zombie that believes it is conscious is wrong. So could you say that this is the exact dividing line between a reductionist and an emergent consciousness?
dimosthenis9 October 24, 2021 at 15:12 #611137
Quoting Pantagruel
But it is still possible to come to the right conclusion for the wrong reason.


By pure luck?
Pantagruel October 24, 2021 at 15:18 #611138
Reply to dimosthenis9 Any number of ways. Perhaps the zombie argument can yield the correct result if the conclusion-begging premise is better analyzed. This is the direction that Chalmers travels when he examines the 'paradox of phenomenal judgements'. I'm curious to see exactly how much consciousness he has to allot to the physical.

For example, if knowledge that one is conscious is the phenomenal judgement sine qua non, then when the zombie effects this judgement, it can either judge correctly "I am not conscious"; or it can judge incorrectly "I am conscious". What this says is that the judgement of/about consciousness (conscious judgement) does not supervene on the physical since, if it did, it would be self-contradictory. Which explicitly contradicts Chalmers' second premise.
Deleted User October 24, 2021 at 15:31 #611141
This user has been deleted and all their posts removed.
dimosthenis9 October 24, 2021 at 15:32 #611142
Quoting Pantagruel
Any number of ways. Perhaps the zombie argument can yield the correct result if the conclusion-begging premise is better analyzed.


Even if his conclusion is correct, it would never be accepted if the way he tries to prove it starts from this false base. Anyone could push it over with a fingertip and it would collapse.

To be taken seriously he would have to choose a different, more solid path.
We could discuss what the appropriate base would be to make his conclusion right, or at least seem more reliable.

I consider myself an idealist and believe that mind is non-material, but I would never choose such a weak premise to convince a materialist. I would be easily defeated.
Pantagruel October 24, 2021 at 15:50 #611147
Continuing on, Chalmers does pursue this fine line of division between the phenomenal and the psychological by way of refuting Dennett, who says that his materialistic theory can explain why things 'seem' the way they do. Chalmers says that Dennett is exploiting an ambiguity (equivocating) between two senses of seem, one of which (the phenomenal) is not captured by the other (why we say the things we do). Balancing on the "knife-edge between the phenomenal and the psychological realms" is how Chalmers puts it. I feel that is where this question lies.
noAxioms October 24, 2021 at 15:50 #611148
Quoting Pantagruel
If zombie-consciousness is devoid of phenomenality, what possible set of conditions could give rise to the zombie asserting phenomenality?

I've always claimed to be the zombie, without lying about it. I don't think I'm conscious, at least not by Chalmers' definition, so no, they don't necessarily lie about it. Sure, I can detect red, but so can the simple mechanical device.
Pantagruel October 24, 2021 at 16:27 #611166
Quoting noAxioms
I've always claimed to be the zombie, without lying about it. I don't think I'm conscious, at least not by Chalmers' definition, so no, they don't necessarily lie about it. Sure, I can detect red, but so can the simple mechanical device.


But anyone who takes this view will likely be an eliminativist (or a reductive functionalist) about consciousness from the start. If one accepts that our immediate evidence does not rule out the possibility that we are zombies, then one should embrace the conclusion that we are zombies: it leads to a much simpler view of the world, for a start. But the reason there is a problem about consciousness is that our immediate evidence does rule out that possibility.
Chalmers on the "Irreducibility of Consciousness"
bongo fury October 24, 2021 at 16:40 #611172
Quoting Pantagruel
One difference must be that the actually conscious being can know that it is conscious (in the strong sense);


How does this work? Acquaintance of the head with an immaterial picture inside it?
noAxioms October 24, 2021 at 16:57 #611186
Quoting Pantagruel
But anyone who takes this view will likely be an eliminativist
Well, I've pretty much eliminated the immaterial mind as described by Chalmers, but the Stanford page on eliminative materialism describes a 'radical position', which basic monism is not. I think my mental states supervene on physics, making me a materialist of sorts, hardly a radical position to take.
'Functionalist about consciousness' seems closer to the mark, but functionalists don't call themselves zombies, and neither do I except in the context of discussions such as this one.

If one accepts that our immediate evidence does not rule out the possibility that we are zombies, then one should embrace the conclusion that we are zombies
The OP says they're functionally identical, so by definition, immediate evidence does not rule out the possibility.

But the reason there is a problem about consciousness is that our immediate evidence does rule out that possibility.
How is that a problem? It simply leaves it open to interpretation (as does any position without empirical differences).

Pantagruel October 24, 2021 at 17:02 #611189
Quoting bongo fury
How does this work? Acquaintance of the head with an immaterial picture inside it?


Well, axiomatically, if the zombie thinks it is not a zombie it is wrong, if a conscious being thinks it is not a zombie it is correct.
Pantagruel October 24, 2021 at 17:05 #611190
Quoting noAxioms
I think my mental states supervene on physics, making me sort of materialist of sorts, hardly a radical position to take.


Chalmers does discuss this. One position he explores and admires involves ascribing phenomenal (or proto-phenomenal) states to physical entities. And also monism, not of the mental, but of an over-arching variety. I have strong affinities for both these directions also. Both of them are consistent with a systems-philosophical approach.
bongo fury October 24, 2021 at 18:37 #611230
Reply to Pantagruel

If a machine with no ghost thinks it has a ghost, it is wrong. Tick.

If a machine with a ghost thinks it is not a machine with no ghost, it is correct. Tick.

Pantagruel October 24, 2021 at 19:06 #611244
Quoting bongo fury
If a machine with no ghost thinks it has a ghost, it is wrong.


Does this mean you equate the 'experience of consciousness' with that of 'having a ghost'? That is interesting.....
noAxioms October 24, 2021 at 21:03 #611297
Quoting bongo fury
If a machine with a ghost thinks it is not a machine with no ghost, it is correct.
In that case, it's probably the ghost thinking it's not a machine with no ghost, and the ghost is correct. The opinion of the machine is not given.

NOS4A2 October 24, 2021 at 23:31 #611366
Reply to Pantagruel

Chalmers’ zombie twin is not “logically coherent”, to me. He can only assume, and not prove, that “conscious experience” is missing from the zombie. This is because he assumes, and never proves, that “conscious experience” is a fundamentally natural phenomenon. Of course he can imagine it missing from a zombie because he has long assumed it occurs elsewhere.

I think you’re right. He’s reasoning in a circle.
180 Proof October 24, 2021 at 23:59 #611379
Quoting Pantagruel
He [Chalmers] relies heavily on the case of the phenomenal zombie, which is functionally and psychologically identical to him, but has no phenomenal experience.

"P-zombie" incoherence.
frank October 25, 2021 at 00:12 #611385
Pantagruel October 25, 2021 at 00:32 #611392
Quoting NOS4A2
Chalmers’ zombie twin is not “logically coherent”, to me. He can only assume, and not prove, that “conscious experience” is missing from the zombie.


But if he assumes it then that is the case he is examining. A conscious zombie would contradict his example.
RogueAI October 25, 2021 at 01:13 #611409
Reply to Pantagruel Would Chalmers' P-zombie twin also have the same evolutionary history as Chalmers?
InPitzotl October 25, 2021 at 01:27 #611411
Quoting RogueAI
Would Chalmers' P-zombie twin also have the same evolutionary history as Chalmers?

Zombies are functionally equivalent to conscious entities. Numerically different entities have different evolutionary histories (because "you count to two when you count them"), but given the functional equivalence clause in the definition, any treatment of p-Chalmers as saying something Chalmers says is by definition fair game.
noAxioms October 25, 2021 at 01:52 #611415
Quoting InPitzotl
Numerically different entities have different evolutionary histories

But at some point, there'd be a common ancestor, which means that a non-conscious entity bred a conscious one with this new relationship with the external entity.
In other words, what had functioned just fine in the parent (a mind that was self-contained and fit despite the lack of relationship with the 2nd entity) gave birth to one where all those advantages were cast aside, letting the choices be made by the external entity rather than itself as has been done by its parent.

This is probably unfair, since it paints a picture of a very binary, abrupt transition from one way of doing things to a complete new one. The child simply wasn't evolved to take advantage of the external entity, at least not in full, and so I imagine the process to take many generations, with a slow ceding of control to the external choices which for some reason made the phenomenal being more fit.

To me, this sounds like slow possession by a demon, however benevolent.

This also leaves the price paid by the parent (the extra metabolism to support an exceptionally large brain) still being paid by the child, despite all the function of that expensive brain having been contracted out to a 3rd party. If the 3rd party is doing all the work, why still cart around a brain whose function can obviously be accomplished by one with a tenth the metabolism, as is done by other creatures our size?
Wayfarer October 25, 2021 at 08:54 #611496
Quoting Pantagruel
if zombie-consciousness is devoid of phenomenality, what possible set of conditions could give rise to the zombie asserting phenomenality? Isn't this a petitio principii?


Agree with you. How could a zombie answer the question 'how are you?' or 'how are you feeling?' I don't see how it could. It could be scripted to regurgitate an answer, but surely it couldn't be hard to fool it - 'What was the most embarrassing thing that ever happened to you?', for example - because they allegedly don't have any experience or inner life. They are, therefore, automatons.

Actually I'm reminded of a passage I often quote from Descartes, and bear in mind he wrote this in the 17th Century:

[quote=René Descartes]if there were such machines with the organs and shape of a monkey or of some other non-rational animal, we would have no way of discovering that they are not the same as these animals. But if there were machines that resembled our bodies and if they imitated our actions as much as is morally possible, we would always have two very certain means for recognizing that, none the less, they are not genuinely human. The first is that they would never be able to use speech, or other signs composed by themselves, as we do to express our thoughts to others. For one could easily conceive of a machine that is made in such a way that it utters words, and even that it would utter some words in response to physical actions that cause a change in its organs—for example, if someone touched it in a particular place, it would ask what one wishes to say to it, or if it were touched somewhere else, it would cry out that it was being hurt, and so on. But it could not arrange words in different ways to reply to the meaning of everything that is said in its presence, as even the most unintelligent human beings can do. The second means is that, even if they did many things as well as or, possibly, better than anyone of us, they would infallibly fail in others. Thus one would discover that they did not act on the basis of knowledge, but merely as a result of the disposition of their organs. For whereas reason is a universal instrument that can be used in all kinds of situations, these organs need a specific disposition for every particular action. [/quote]



Pantagruel October 25, 2021 at 09:39 #611517
Reply to 180 Proof
:up:

Reply to RogueAI Actually Chalmers touches on this: if his zombie-twin has an "inverted spectrum" of any conscious experience (e.g. sees blue where the other sees red), then there will necessarily be different "causal histories" of that type of experience, even if the experiences themselves are the same. So isomorphic mapping of history can be problematic.

Reply to Wayfarer
:up:

I like Chalmers' description of consciousness as something which determines the intension of its own concept.
RogueAI October 25, 2021 at 20:22 #611710
Quoting Pantagruel
Actually Chalmers touches on this: if his zombie-twin has an "inverted spectrum" of any conscious experience (e.g. sees blue where the other sees red), then there will necessarily be different "causal histories" of that type of experience, even if the experiences themselves are the same. So isomorphic mapping of history can be problematic.


I'm wondering how P-zombies could have a history that involves the development of words that refer to conscious experiences they don't have.
Michael October 25, 2021 at 21:28 #611749
Quoting RogueAI
I'm wondering how P-zombies could have a history that involves the development of words that refer to conscious experiences they don't have.


We have a word like “soul” despite the fact that we have no such thing.
Marchesk October 26, 2021 at 04:12 #611964
Quoting RogueAI
I'm wondering how P-zombies could have a history that involves the development of words that refer to conscious experiences they don't have.


If we're not conscious in the way that philosophers like Chalmers claim we are, then qualia would count as such a word in our universe. Idealism would be another. Platonism would be yet another. Not to conflate those three terms, but it demonstrates that if the world is physical, it doesn't prevent us from coming up with non-physical words.
Marchesk October 26, 2021 at 04:24 #611977
Quoting Pantagruel
If zombie-consciousness is devoid of phenomenality, what possible set of conditions could give rise to the zombie asserting phenomenality? Isn't this a petitio principii?


We could in the future have f-zombies via mind-uploading. The science-fiction book Permutation City explores the concept: brain scanning and computing have progressed, by the 2050s, to the point that accurate digital copies exist in simulated worlds. The question the physical main character wishes to explore is whether a digital copy is conscious, and how manipulating the simulation might distort that consciousness. Most of the copies commit suicide upon finding out they are uploads, but the last one is prevented from doing so and goes on to invent the dust theory of consciousness.

If digital uploads are not conscious, they could still be functionally equivalent and make the same claims about being conscious as we do. They would physically be different, which would mean that consciousness is not functional, and behavior is not a reliable indicator for being conscious.
GraveItty October 26, 2021 at 06:55 #612018
Quoting Pantagruel
. He relies heavily on the case of the phenomenal zombie, which is functionally and psychologically identical to him, but has no phenomenal experience.


Then he must be a phenomenal zombie with phenomenal experience.
TheMadFool October 26, 2021 at 07:16 #612024
Beetle In The Box

It's possible that there's no beetle.

Truly conscious people have something that they call "beetle", but p-zombies have nothing in their boxes and still call it "beetle". In essence p-zombies are using words (meaning is use), and if Wittgenstein is right, we're doing the same. :chin:
GraveItty October 26, 2021 at 08:02 #612037
Quoting TheMadFool
Truly conscious people


Are there non-truly conscious people (apart from my wife)?
TheMadFool October 26, 2021 at 09:23 #612062
Quoting GraveItty
Are there non-truly conscious people (apart from my wife)?


:lol: Don't say that I laughed at your joke. I really don't want to be in her bad books. :smile:
Pantagruel October 26, 2021 at 09:29 #612066
Quoting TheMadFool
In essence p-zombies are using words (meaning is use) and if Wittgenstein is right, we're also doing the same


Yes, as I mentioned earlier, I think this is the sense in which Chalmers suggests that consciousness determines the intension of its own concept.....
bert1 October 26, 2021 at 09:58 #612074
Quoting Pantagruel
If zombie-consciousness is devoid of phenomenality, what possible set of conditions could give rise to the zombie asserting phenomenality? Isn't this a petitio principii?


If the zombie is the clone of a liar, ill educated, or mad person.
bert1 October 26, 2021 at 10:02 #612077
Person and zombie-clone don't violate the law of the identity of indiscernibles. They are conceptually discernible - one is conscious and the other isn't. You just can't tell which is which from the outside. The whole point is that they are conceptually discernible, but physically indiscernible (whatever 'physically' means in this context). And they are actually discernible by the one which is conscious: he knows which one he is.
Pantagruel October 26, 2021 at 10:11 #612084
Quoting bert1
If the zombie is the clone of a liar, ill educated, or mad person


I'm more of the opinion that consciousness in this scenario constitutes a nescio quid, such that for a zombie to make a true qualia-claim it would be referring to something to which it in principle does not have access.
bert1 October 26, 2021 at 13:57 #612215
Quoting Pantagruel
I'm more of the opinion that consciousness in this scenario constitutes a nescio quid, such that for a zombie to make a true qualia-claim it would be referring to something to which it in principle does not have access.


Indeed. Any claim to having an experience must be false if expressed by a zombie, very much a nescio quid for the zombie. But for the human, who has noticed he is conscious, it's more of a, er, conscio quid, or something.
noAxioms October 26, 2021 at 18:43 #612348
Quoting bert1
a liar, ill educated, or mad person.
I am neither mad nor ill-educated.
Neither party is lying. For it to be a lie, each being (the zombie using nothing but physics, and the 'human', as y'all put it) would need to spend a moment in the other's shoes to compare: this is what the one is like, and this is the other. Given that, one could select which most closely matches his experience (or choose to lie about it and claim the other). Until then, there's no lie about it, since each has only one experience to compare, and each has learned the vocabulary to describe it from places like this forum. I certainly would never have used the word 'qualia', for instance, had I not heard it from others.
So in the interest of not lying, I assert that the experience that I have is just the result of doing it the same way as would any physical device with sensory input and an information processor to make sense of it. There's nothing seemingly inexplicable about it, and hence I conclude that I'm one of the zombies and that I'm missing out on the full inexplicable-by-physics experience.
Pantagruel October 27, 2021 at 09:44 #612725
Quoting noAxioms
hence I conclude that I'm one of the zombies and that I'm missing out on the full inexplicable-by-physics experience.


:up:
Marchesk October 27, 2021 at 10:17 #612731
Quoting noAxioms
Neither party is lying. For it to be a lie, each being (the zombie using nothing but physics, and the 'human', as y'all put it) need to spend a moment in each other's shoes to compare. This is what the one is like, and this is the other.


There's nothing it's like to be a zombie. So for us humans switching places is the same thing experientially as being unconscious. It would be like losing time once you switch back.
Pantagruel October 27, 2021 at 11:08 #612743
In keeping with Chalmers' approach, I'll offer some metaphysical speculations at this point.

Chalmers eventually examines how information can plausibly link the physical and the phenomenal, since it presents aspects of both (the well-known issue of the two entropies). What I would like to consider, extending Chalmers' approach of supervenience, is this: if consciousness, while not supervenient on the physical, is in fact supervenient on the informational, then consciousness could be translated from one medium to another, exactly as information can be. The only question is, when I am thinking this thought now, is that exhaustively represented by the informational content, or is there something more? Is the thing which is producing or creating information itself a form of information? I'm inclined to think it is... some form of globally coherent informational history, maybe. And so, yes, theoretically translatable between mediums.
Sam26 October 27, 2021 at 14:01 #612832
Quoting Pantagruel
If zombie-consciousness is devoid of phenomenality, what possible set of conditions could give rise to the zombie asserting phenomenality? Isn't this a petitio principii?


One possible answer is that the zombie is just programmed to say these kinds of things. If, for example, our reality is a kind of program, then it's quite possible that some being (what we refer to as a person) might just be part of the program. They act like us, they talk like us, but they lack the internal subjective experiences of a real self. It's certainly possible, but unless you were able to remove yourself from the program, it would be difficult if not impossible to tell the difference.

It's hard to see where he's committing a fallacy.
Marchesk October 27, 2021 at 15:25 #612870
Quoting Pantagruel
I'm inclined to think it is...some form of globally coherent informational history maybe. And so, yes, theoretically translatable between mediums.


I believe Chalmers would disagree, because he would say that consciousness is not reducible to information; it does not logically supervene. Rather, there's an additional law of nature that binds conscious experience (or causes it to emerge) whenever there is an informationally rich stream, or whatever the criterion is.

One might object to this new arbitrary law that adds something additional to nature, but I think the even deeper issue is the status of laws making nature be a certain way. If we can allow such laws on the microphysical level, then I don't see what stops them from happening elsewhere. Because laws of nature are deeply mysterious.
Pantagruel October 27, 2021 at 15:52 #612884
Reply to Marchesk Yes, this isn't in scope for Chalmers' theses, but is metaphysical speculation, as I said. I don't know that he disagrees specifically though - it is an extension of his dual-aspect approach but may suggest an overarching monism (of information). He is amenable to such notions.
Pantagruel October 27, 2021 at 15:54 #612886
Reply to Sam26 In which case the claims would be 'caused' by something familiar with the experience presumably....
noAxioms October 27, 2021 at 16:31 #612903
Quoting Marchesk
There's nothing it's like to be a zombie.
I sort of agree, but see below. I have no evidence that anything is being me. But that doesn't mean that the zombie cannot function, perceive, etc. like any other automaton.

So for us humans switching places
I didn't really have in mind 'switching places', since, lacking something being me, there's nothing to switch. Perhaps the zombie (Phil) can be possessed by something (Bob) being it for a short while. But this only lets Bob know what it's like to be Phil (who is for a short while not a zombie); Phil might not necessarily be aware of it.

the same thing experientially as being unconscious.
I think I follow this, but disagree. A self-driving car, with driver in it, has something 'being' it and the car is 'conscious'. The car is an extension (an avatar) of the driver. Not sure how you're using 'experience' here. A self-driving car is capable of being aware of its surroundings and function on its own. That's 'experience' in my book, as distinct from 'conscious' which is the experience and control of the driver. If you use the word differently, then I need one to describe what a mechanical device does to measure the world.

Point is, the car can cede control to a conscious entity (driver) and become a car/driver system, and if you ask the system if it has phenomenal experience, it would be the driver that answers 'yes'. Perhaps the car still has its own experience and notes that it would have done that lane change better, but it's not in control.
If the same car is driving itself with the same person now acting epiphenomenally as passenger, then there's still something 'being' it, but it's the car in control, and thus the car that answers when asked if it has phenomenal experience. The car is unaware of the passenger, so it truthfully answers yes since it is quite aware of the vehicles around it and such and has no driver-phenomenal experience with which to compare.
There may be no passenger at all, and thus nothing 'being' the system, and the experience of the car is the same. So as long as it is in control, the car is going to answer the same way. If it's under control of an external driver agent, then it is the agent that answers, not the car, and the car cannot convey what that experience is like.

I claim to be the zombie car, not the driver/car system, because I have no evidence to the contrary and it seems more plausible than the physics-defying system otherwise posited.

Marchesk October 27, 2021 at 16:45 #612910
Quoting noAxioms
I claim to be the zombie car, not the driver/car system because i have no evidence to the contrary and it seems more plausible than the physics-defying system otherwise posited.


So you see zombie colors, hear zombie sounds, think zombie thoughts, dream zombie dreams?
RogueAI October 27, 2021 at 18:02 #612929
Quoting Sam26
One possible answer is that the zombie is just programmed to say these kinds of things. If, for example, our reality is a kind of program, then it's quite possible that some being (what we refer to as a person) might just be part of the program. They act like us, they talk like us, but they lack the internal subjective experiences of a real self. It's certainly possible, but unless you were able to remove yourself from the program, it would be difficult if not impossible to tell the difference.


Except the zombie is supposed to be identical to me except for being conscious. I don't talk about my mental states due to any programming; when I talk about being in pain, say, it's because I want to inform someone about my mental state.
RogueAI October 27, 2021 at 18:04 #612932
Quoting Marchesk
If we're not conscious in the way that philosophers like Chalmers claim we are, then qualia would count as such a word in our universe. Idealism would be another. Platonism would be yet another. Not to conflate those three terms, but it demonstrates that if the world is physical, it doesn't prevent us from coming up with non-physical words.


Doesn't coming up with words for mere possibilities require imagination?
Marchesk October 27, 2021 at 18:06 #612933
Quoting RogueAI
Doesn't coming up with words for mere possibilities require imagination?


So ... p-zombie Chalmers is imagining the redness of red, and what it's not like for his zombie twin to lack that red sensation, and its implications for metaphysical possibility.

When p-zombie Mary leaves the black and white room and sees a red object for the first time, she learns a new fact that isn't a fact, because p-zombie Mary is mistaken about seeing red. In fact, her entire world is colored in combinations of p-red, p-yellow and p-blue. She learns about the p-redness of p-red. We can call that a p-fact. But she already knows all the p-facts. So she learns nothing.

My brain hurts now. I'll admit to having difficulties with the p-zombie argument when it comes time for the zombies to talk about consciousness.
Sam26 October 27, 2021 at 18:16 #612937
Quoting RogueAI
Except the zombie is supposed to be identical to me except for being conscious.


Yes, and this is why I said, "...they lack the internal subjective experiences of a real self," which was meant to mean they are not conscious. It's difficult to know if such a zombie would really act like a conscious being. It seems that you could in theory make them respond just like us. It would be like playing a game, say, World of Warcraft, and not knowing if you're talking with a real person or not.
RogueAI October 27, 2021 at 18:18 #612939
Quoting Sam26
Yes, and this is why I said, "...they lack the internal subjective experiences of a real self," which was meant to mean they are not conscious. It's difficult to know if such a zombie would really act like a conscious being. It seems that you could in theory make them respond just like us. It would be like playing a game, say, World of Warcraft, and not knowing if you're talking with a real person or not.


I don't think they could act entirely like a conscious being because conscious beings' actions are sometimes caused by their mental states.
Sam26 October 27, 2021 at 18:45 #612947
Quoting RogueAI
I don't think they could act entirely like a conscious being because conscious beings' actions are sometimes caused by their mental states.


The point of course would be, how could you tell the mental state apart from a programmed response? I don't think, in theory, you could.
RogueAI October 27, 2021 at 19:13 #612954
Quoting Sam26
The point of course would be, how could you tell the mental state apart from a programmed response? I don't think, in theory, you could.


Maybe not, but that's an epistemological point. It seems to me that P-zombies can exist iff there are no actions caused solely by mental states.
noAxioms October 27, 2021 at 20:29 #612994
Quoting Marchesk
So you see zombie colors, hear zombie sounds, think zombie thoughts, dream zombie dreams?

No, I see colors, hear sounds, think thoughts, and dream dreams, but I do it the zombie way without help from the outside, just like the self-driving car does. OK, the car probably doesn't dream, but it does the other things, however reluctant you might be to ascribe such terms to such a device.
RogueAI October 27, 2021 at 20:44 #613002
Reply to noAxioms If pain hurts, you're not a zombie.
Marchesk October 27, 2021 at 23:27 #613115
Quoting noAxioms
No, I see colors, hear sounds, think thoughts, and dream dreams, but I do it the zombie way without help from the outside,


What's that like?
noAxioms October 28, 2021 at 02:21 #613232
Quoting RogueAI
A self-driving car can't feel pain. I assume you can.

I don't feel pain. I'm a zombie, remember? I merely process the data received from my nerve endings and make the appropriate facial expressions and such.

Some cars don't have damage sensors. They're still awfully primitive. Ones that do have the sensors process the data, which can be interpreted as pain or not depending on your choice to characterize it with such language or not, a choice which doesn't alter what's actually going on one way or the other. But I assert there is no evidence of any fundamental difference between myself and the car. Our mutual refusal to use the word pain to describe the respective systems isn't evidence of anything.

Quoting Marchesk
What's that like?
To what? There isn't anything to which it is like something. That's the thing I deny. There's no 'I' (a thing with an identity say) that's being me.
InPitzotl October 28, 2021 at 23:46 #613761
Quoting Marchesk
My brain hurts now. I'll admit to having difficulties with the p-zombie argument when it comes time for the zombies to talk about consciousness.

Yeah, that's the real problem here. If qualia are epiphenomenal, how can we talk about them?
Pantagruel October 29, 2021 at 00:18 #613770
Quoting InPitzotl
Yeah, that's the real problem here. If qualia are epiphenomenal, how can we talk about them?


:up:
Marchesk October 29, 2021 at 07:13 #613856
Quoting noAxioms
To what? There isn't anything to which it is like something. That's the thing I deny. There's no 'I' (a thing with an identity say) that's being me.


So you don't feel pain?
InPitzotl October 29, 2021 at 11:58 #613924
Quoting noAxioms
There's no 'I' (a thing with an identity say) that's being me.

This phrase sounds suspicious. There's a me, but there's no I being me?

Also, there's definitely an "I" there. Something typed an entire grammatically correct, if not coherent, response in this thread with a unified theme conveying some particular form of skepticism to zombies.
RogueAI October 29, 2021 at 18:06 #614012
Quoting noAxioms
I don't feel pain. I'm a zombie, remember?


Are you pretending for the thread, or do you actually think you're a p-zombie?
noAxioms October 30, 2021 at 02:07 #614279
Thanks for all the feedback everyone. I actually appreciate it.

Quoting InPitzotl
My brain hurts now. I'll admit to having difficulties with the p-zombie argument when it comes time for the zombies to talk about consciousness.
— Marchesk
Yeah, that's the real problem here. If qualia are epiphenomenal, how can we talk about them?

Indeed, if one with qualia can talk about it, it isn't epiphenomenal. Those of us without the qualia might talk about it because we hear the rest of you talking about it and know no better.

Quoting RogueAI
Are you pretending for the thread, or do you actually think you're a p-zombie?

Well, not pretending anything. Chalmers claims a conscious experience that does not supervene logically on the physical. I don't have that since what I do isn't a logical contradiction like that. So I can only presume Chalmers (and the rest of you non-zombies) has a conscious experience that is fundamentally different than me just "receiving data that could be interpreted as pain", as it was put in T2. I might use the word pain, not because I (like the other zombies) am lying, but because we've been provided with no other vocabulary to describe it.

Quoting Marchesk
So you don't feel pain?
'Pain' seems to be a word reserved to describe the experience had by the experiencer of a human. It would be a lie to say that I feel pain, in the context of this topic, so lacking an experiencer, I cannot by definition feel pain any more than can a robot with damage sensors. Again, I may use the word in casual conversation (outside the context of this topic) not because I'm lying, but because I lack alternative vocabulary to describe what the pure physical automaton does, something which by your definition cannot feel pain since it lacks this experiencer of it.

Quoting InPitzotl
There's no 'I' (a thing with an identity say) that's being me.
— noAxioms
This phrase sounds suspicious. There's a me, but there's no I being me?

This is a better question. The 'me' is like the robot, the thermostat, the automaton. These things, in common language, have a sort of legal identity, but not an identity which holds up to close scrutiny such as Parfit demonstrates. The "I" on the other hand refers to the experiencer of a conscious thing, something which gives it a true identity that doesn't supervene on the physical. My 'me' doesn't appear to have that. It seems inconsistent that something with an identity can be paired with something without one. The bijection between the two doesn't work without a series of premises which I find totally implausible.

Also, there's definitely an "I" there. Something typed an entire grammatically correct, if not coherent, response in this thread with a unified theme conveying some particular form of skepticism to zombies.
No, that's the legal 'me' doing that. Any toaster has one of those. Any automaton can type a similar response in a thread such as this.
InPitzotl October 30, 2021 at 13:08 #614488
Quoting noAxioms
The "I" on the other hand refers to the experiencer of a conscious thing, something which gives it a true identity that doesn't supervene on the physical.

I cry foul here. Imagine a believer of the classical elements telling you that he just fetched a pail of water from the well. When you ask the guy what water is, he explains that it is the element that is cold and wet. Analogously, you object... there is no "water"; for "water" refers to an element that is cold and wet, and we don't have such things. The problem is, the guy did in fact fetch the stuff from the well. This I believe is your error.

Slightly more analytical, the guy has a bad theory of water. When asked to describe what water is, the guy would give you an intensional definition of water that is based on the bad theory. It's proper to correct the guy and to say that there is no such thing as he described in this case; however, the guy is also ostensively using the term... the stuff in the well is an example of what he means by water. His bad theory doesn't make the stuff in the well not exist. So the guy is in a sense wrong about what water is, but is not wrong to have the concept of water. The stuff the guy goes out to fetch from the well really is there.

You're objecting to an intensional definition of "I", which is simply based on a questionable theory of self... but you still have the extension to which "I" refers.
Quoting noAxioms
No, that's the legal 'me' doing that. Any toaster has one of those. Any automaton can type a similar response in a thread such as this.

I've no idea what you mean by legal me, but the ostensive I to which humans refer is not something a toaster has. I can't comment on the automaton... the term's too flexible.
RogueAI October 30, 2021 at 20:44 #614646
Quoting noAxioms
'Pain' seems to be a word reserved to describe the experience had by the experiencer of a human. It would be a lie to say that I feel pain, in the context of this topic, so lacking an experiencer, I cannot by definition feel pain any more than can a robot with damage sensors. Again, I may use the word in casual conversation (outside the context of this topic) not because I'm lying, but because I lack alternative vocabulary to describe what the pure physical automaton does, something which by your definition cannot feel pain since it lacks this experiencer of it.


How far does your skepticism go? Do you think there's a strong possibility you're the only mind in existence?
Marchesk October 31, 2021 at 03:33 #614872
Quoting noAxioms
'Pain' seems to be a word reserved to describe the experience had by the experiencer of a human. It would be a lie to say that I feel pain, in the context of this topic, so lacking an experiencer, I cannot by definition feel pain any more than can a robot with damage sensors. Again, I may use the word in casual conversation (outside the context of this topic) not because I'm lying, but because I lack alternative vocabulary to describe what the pure physical automaton does, something which by your definition cannot feel pain since it lacks this experiencer of it.


I can't tell which position you're actually arguing for or against. I assume it's a reductio?

In any case, I'm confident you do feel pain, and trying to argue that you don't via some objective comparison or description doesn't change the fact that you do in fact feel pain.
Janus October 31, 2021 at 04:25 #614899
Quoting Michael
Brain activity that triggers the vocalization of the expression "I am conscious." There's nothing special about those words. "I am conscious" is no more an indicator of consciousness than "one plus one equals two." They're just sounds that can result from mechanical operations.


This is exactly why a physicalist like Dennett, who thinks the idea of Zombies is incoherent, says that if they were possible, then we would all be zombies. Why would a zombie say it is conscious if it didn't think it was conscious? (Why would it say anything at all if it didn't think anything, for that matter?).

If the zombie can think it is conscious (which itself is an act of consciousness), and this thinking is the result of brain activity, then what reason could we have to think consciousness would not also be a result of brain activity?
Janus October 31, 2021 at 04:44 #614908
Quoting Sam26
It's hard to see where he's committing a fallacy.


You would need to posit a programmer, which is no part of Chalmers' ideas.
SophistiCat October 31, 2021 at 09:16 #614964
Quoting InPitzotl
Slightly more analytical, the guy has a bad theory of water. When asked to describe what water is, the guy would give you an intensional definition of water that is based on the bad theory. It's proper to correct the guy and to say that there is no such thing as he described in this case; however, the guy is also ostensively using the term... the stuff in the well is an example of what he means by water. His bad theory doesn't make the stuff in the well not exist. So the guy is in a sense wrong about what water is, but is not wrong to have the concept of water. The stuff the guy goes out to fetch from the well really is there.


An eliminativist about personal identity could hold the phlogiston as a counterexample. To be sure, the phlogiston, identity, water element have been posited not as idle fantasies, but in order to explain some manifest reality. But the preferred solution, at least in the case of the phlogiston, was not to come up with a better theory of the phlogiston, but to drop the stuff altogether as part of a better theory that accounts for the manifest reality of heat transfer.
InPitzotl October 31, 2021 at 13:00 #615028
Quoting SophistiCat
An eliminativist about personal identity could hold the phlogiston as a counterexample.

I am pretty sure you're at least one step behind, not ahead, of the post you just replied to.
Quoting SophistiCat
But the preferred solution, at least in the case of the phlogiston, was not to come up with a better theory of the phlogiston, but to drop the stuff altogether as part of a better theory that accounts for the manifest reality of heat transfer.

This is clumsily phrased. Phlogiston theory is a theory about combustion. It was replaced by oxidation theory, a better theory about combustion. We dropped the notion of phlogiston, but not the notion of combustion.
SophistiCat October 31, 2021 at 13:24 #615036
Quoting InPitzotl
I am pretty sure you're at least one step behind, not ahead, of the post you just replied to.


I am not sure how to take this. Is this just a generic putdown, or did you mean something more specific? What am I missing?

Quoting InPitzotl
This is clumsily phrased. Phlogiston theory is a theory about combustion. It was replaced by oxidation theory, a better theory about combustion. We dropped the notion of phlogiston, but not the notion of combustion.


Well, referring to the phlogiston theory as a theory of heat transfer was perhaps clumsy, but you have ignored the substance of my response in favor of capitalizing on this nitpick.
frank October 31, 2021 at 13:34 #615040
Quoting Janus
Why would a zombie say it is conscious if it didn't think it was conscious


The argument is just about conceivability. Your question shows you've gone beyond conceiving of the P-zombie to asking why it's like that.

That's all that's needed to drive the wedge in.
noAxioms October 31, 2021 at 15:06 #615075
Quoting InPitzotl
The "I" on the other hand refers to the experiencer of a conscious thing, something which gives it a true identity that doesn't supervene on the physical.
— noAxioms
I cry foul here. Imagine a believer of the classical elements telling you that he just fetched a pail of water from the well. When you ask the guy what water is, he explains that it is the element that is cold and wet. Analogously, you object... there is no "water"; for "water" refers to an element that is cold and wet, and we don't have such things. The problem is, the guy did in fact fetch the stuff from the well. This I believe is your error.
This comment would perhaps at least make sense to me had it been attached to a comment of mine about pain and "data which could be interpreted as pain", but you've chosen to reference a comment about different kinds of identity for two very different things (a car and its driver say).
The only way I can parse it, it is the followers of Chalmers that are making the error you point out, where a human is privileged in being allowed to call something water/cold/wet, but anything else (a sump pump moving the stuff) doing the exact same thing is not allowed to use such privileged language (the pump moves a substance which could be interpreted as water). A mechanical device with damage sensors to which it reacts lacks the privilege to say it feels pain. I didn't make those privilege rules, so cry foul to the ones that make those rules. I decline to use the word because I don't consider myself to be in the privileged class.

Slightly more analytical, the guy has a bad theory of water. When asked to describe what water is, the guy would give you an intensional definition of water that is based on the bad theory. It's proper to correct the guy and to say that there is no such thing as he described in this case; however, the guy is also ostensively using the term... the stuff in the well is an example of what he means by water.
Bad analogy. In the case in question, nobody is ostensively using a term. You can't point to your subjective feeling of warmth and assert the toaster with thermostat doesn't feel anything analogous. Sure, it's a different mechanism, but not demonstrably fundamentally different.
So the guy is in a sense wrong about what water is, but is not wrong to have the concept of water.
No, wrong to have the concept of water since the term 'water' is not in fact being ostensively used. Perhaps not wrong, since there may be water in his well, but I detect none in mine and he cannot show me the water in his.

You're objecting to an intensional definition of "I".
I think so.
No, that's the legal 'me' doing that. Any toaster has one of those. Any automaton can type a similar response in a thread such as this.
— noAxioms
I've no idea what you mean by legal me
Legal identity: There is a rock placed at X, and you move it to a new location Y. Is it the same rock, or merely a different arrangement of matter in the universe with only language suggesting a binding between the prior arrangement and the later one? Is that toaster under your arm the same toaster as was stolen from me a moment ago, or a different one to which I have no claim? I shake a rope sending a wave down its length. Is the wave I created the same wave that reaches the other end despite not involving motion of a single bit of the original perturbed material? That's what I call legal identity, and has nothing to do specifically with life forms. It seems mostly language based, not based on anything physical, and it doesn't always work. A cell divides by mitosis. Which is the original cell? Language has no obvious answer and physics doesn't care.

but the ostensive I to which humans refer is not something a toaster has.
This seems to be an example of the privileged language mentioned above. What I see as the 'bad theory' asserts privileged status to humans, raising them above a mere physical arrangement of matter, and assigns language reserved only for objects with this privileged status. I'm denying the status, and thus sit in the group with the toaster, forbidden to use the sacred language. My son has one of those 'hey google' devices sitting on its table, and it might reply to a query with "I cannot find that song" or some such. But such usage seems to refer to the legal identity (something I don't deny) and not to "I, the experiencer of the device", which neither the toaster nor the physical arrangement of matter referred to as 'noAxioms' has.

Quoting RogueAI
How far does your skepticism go? Do you think there's a strong possibility you're the only mind in existence?
I don't consider my position on this to be abnormally skeptical. I simply deny the non-physical experiencer, which is a fairly standard monist position. I differ from the mainstream position in that I'm willing for others to have the dual relationship (and hence all the talk about it), thus forcing me to use alternate terms to describe how I work. Most monists probably believe that every mind supervenes on the physical, not just some of them. My position explains why the zombies talk about pain when they don't actually 'feel' (privileged definition) it. They are not lying, merely drawing from the limited vocabulary available to them.

Quoting Marchesk
I any case, I'm confident you do feel pain, and trying to argue that you don't via some objective comparison or description doesn't change the fact that you do in fact feel pain.
Not if a mechanical device is forbidden from using the word. If a thermostat doesn't feel warmth, then neither do I. I admit to pain being a rare one, with few devices having sensors to provide it.
Quoting Janus
If the zombie can think it is conscious (which itself is an act of consciousness), and this thinking is the result of brain activity, then why would consciousness not also be a result of brain activity?
:up:
Or not even necessarily brain activity, but any information-processing activity.
InPitzotl October 31, 2021 at 15:07 #615076
Quoting SophistiCat
I am not sure how to take this. Is this just a generic putdown, or did you mean something more specific? What am I missing?

It was not a put-down. I'm not just generically using braggart language here; you're literally one step behind. The water example is a response to the response you just gave, and it does not negate it. We did not discard the notion of water when we discarded classical elements, and there is a good reason we did not do so. That we discarded phlogiston on replacing it with a better theory, does not negate this good reason not to discard water when dropping classical element theory.
Quoting SophistiCat
Well, referring to the phlogiston theory as a theory of heat transfer was perhaps clumsy, but you have ignored the substance of my response in favor of capitalizing on this nitpick.

That's not quite the clumsiness I was referring to. "X is a bad theory of Y" is to be understood in the sense of X being an explanans and Y an explanandum. In this sense, phlogiston theory is not a theory of phlogiston because phlogiston is an explanans. The explanandum here is combustion; so phlogiston in this sense is a theory of combustion. When we got rid of phlogiston theory, we did get rid of phlogiston (explanans), but we did not get rid of combustion (explanandum).
SophistiCat October 31, 2021 at 17:39 #615108
Quoting InPitzotl
We did not discard the notion of water when we discarded classical elements, and there is a good reason we did not do so. That we discarded phlogiston on replacing it with a better theory, does not negate this good reason not to discard water when dropping classical element theory.


There is an infidelity in my phlogiston analogy in that "phlogiston" and "self" are not on the same level in terms of their pedigree and epistemic centrality. They are, however, on the same level in that both are theoretical entities that have played a role as explanans, and it is that which eliminativists attack. They do not deny that which gives rise to our habitual concept of "self"; rather they question the validity of the conceptualization.

Here I should disclose that I have been playing something of a devil's advocate, because I am not on board with the kind of eliminativism that blithely rejects concepts like "self" as merely illusory. Personal identity may be nothing over and above a psycho-social construct, a legal fiction, as @noAxioms might say, but it does exist at least qua construct, and as such it has very real consequences. And that is existence enough, as far as I am concerned. Where I am on board with eliminativism is in not granting habitual mental categories roles in science or metaphysics without first subjecting them to critical evaluation.
TheMadFool October 31, 2021 at 18:04 #615116
We don't know if consciousness is physical or nonphysical.

A p-zombie is a hypothesized being physically identical to a human but bereft of consciousness.

So, yes, a p-zombie begs the question; after all, it's defined in such a way that assumes consciousness is nonphysical.

However, it can still be used in an argument like so:

1. If consciousness is nonphysical then p-zombies are possible

Ergo,

2. If p-zombies are impossible then consciousness is physical.
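The step from (1) to (2) is just contraposition, plus the background assumption that "physical" and "nonphysical" are exhaustive alternatives. As a sketch, the inference can be checked in Lean, with `NonPhysical` and `ZombiesPossible` as stand-in propositions (names chosen here for illustration, not from the thread):

```lean
-- Stand-in propositions:
--   NonPhysical     : "consciousness is nonphysical"
--   ZombiesPossible : "p-zombies are possible"
variable (NonPhysical ZombiesPossible : Prop)

-- Premise 1: if consciousness is nonphysical, p-zombies are possible.
-- Conclusion 2 is its contrapositive:
--   if p-zombies are impossible, consciousness is not nonphysical.
example (h : NonPhysical → ZombiesPossible) :
    ¬ZombiesPossible → ¬NonPhysical :=
  fun hz hn => hz (h hn)
```

Reading `¬NonPhysical` as "physical" is where the exhaustiveness assumption does its work; the contraposition itself is uncontroversial.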
InPitzotl October 31, 2021 at 18:15 #615121
Quoting noAxioms
Bad analogy. In the case in question, nobody is ostensively using a term.

Of course they are. This is why they tend to say we have these properties, but these things over here, they don't. They are ostensively pointing to the properties, and they are formulating an incomplete theory in an attempt to explain the properties they are pointing to. And I even agree it's a bad theory about what they're ostensively including.
Quoting noAxioms
The only way I can parse it, it is the followers of Chalmers that are making the error you point out, where a human is privileged in being allowed to call something water/cold/wet, but anything else (a sump pump moving the stuff) doing the exact same thing is not allowed to use such privileged language (the pump moves a substance which could be interpreted as water).

The notion that either we have an immaterial driver in the driver's seat experiencing things or the toaster feels warmth sounds like a false dichotomy to me.
Quoting noAxioms
Is it the same rock,

Yes, it's the same rock...
Quoting noAxioms
or merely a different arrangement of matter in the universe

...and it's probably that too. Most of the toaster's mass is in gluons. They're constantly obliterating and reforming. And the next grand TOE may even do something more weird with the ontologies.

Nevertheless, if you had your name scratched onto the toaster when I stole it, it will tend to still be scratched on there unless I scratched it off.

Identity need not be fundamental; it can be "soft"... emergent, pragmatic versus universal by necessity, constructed from invariances, and the like.
Quoting noAxioms
You can't point to your subjective feeling of warmth and assert the toaster with thermostat doesn't feel anything analogous. Sure, it's a different mechanism, but not demonstrably fundamentally different.

Actually, yes, I can. The toaster reacts to warmth. "Legal me" reacts to warmth as well. But "legal me" also reacts to an increase in blood acidity.

But there's a difference between how I react to warmth and how I react to an increase in blood acidity. I can subjectively report on my feeling of warmth; I cannot subjectively report on my feeling of high blood acidity. Ostensively speaking, warmth is an example of something I subjectively feel; acidity is an example of something I react to but do not subjectively feel. There is no good reason for me to suspect that because the toaster reacts to warmth like I react to warmth, that it is subjectively feeling warmth like I subjectively feel warmth in contrast to how I do not subjectively feel blood acidity.
Quoting noAxioms
This seems to be an example of the privileged language mentioned above. What I see as the 'bad theory' asserts privileged status to humans, raising them above a mere physical arrangement of matter, and assigns language reserved only for objects with this privileged status.

Your sales pitch here is a dud. I can play Doom on this computer. I might could even play Doom on my Keurig. But I cannot play Doom on this bottle of allergy pills.

Different physical objects have different physical arrangements, and some arrangements have properties other arrangements don't have. We might could even say certain arrangements of physical objects have privileged status, raising them above other arrangements, and that we are justified in assigning language reserved for some classes of objects.

The "I" I accused you of having is simply a unit of theory of mind as it applies to the linguistic aspect of your posts. Humans can be thought of as objects susceptible to be described in terms of units of theory of mind, at least in the typical sense.
Quoting noAxioms
My son has one of those 'hey google' devices sitting on its table, and it might reply to a query with "I cannot find that song" or some such.

There are particular arrangements of physical matter that come in individual "toaster"-like bodies, which are embedded in their environments and must navigate them, and which regularly participate in conversations of various sorts with other entities. "Hey google" is not one of these things. But noAxioms is one of these things.

TOM can probably be extended to work in some version on "hey google", but it's distinct enough to reassess how we want to discuss its identity. An automaton of the right type might work better (SDC's are a bit out... the networked trend confuses the information-complex-to-body relation so requires the reassessment). You, OTOH, meet the requirements to apply theory of mind to as humans above the age of five regularly do.

But for some reason, instead of asking me what I mean by "I", or getting this plain reference to the notion that you as a unit consistently type out the same themed argument throughout single posts and across time, you keep going to this "experiencing the device" thing. I understand you're rejecting a bad theory of "I"; I too reject it. But I cannot play Doom on my bottle of allergy pills, and I cannot play debate-the-zombies with my toaster (at least yet).
Janus October 31, 2021 at 20:35 #615182
Quoting frank
The argument is just about conceivability. Your question shows you've gone beyond conceiving of the P-zombie to asking why it's like that.

That's all that's needed to drive the wedge in.


Not really. I'm saying that the absurdity of the idea that the p(urported) zombies could say things about their experience, even though they have no experience, is the central point that establishes the inconceivability of such a being.
frank October 31, 2021 at 21:20 #615220
Quoting Janus
Not really. I'm saying that the absurdity of the idea that the p(urported) zombies could say things about their experience, even though they have no experience, is the central point that establishes the inconceivability of such a being.


A computer can't tell you it's conscious?
Janus October 31, 2021 at 21:22 #615230
Reply to frank Only if someone conscious programs it to.
frank October 31, 2021 at 21:23 #615234
Quoting Janus
Only if someone conscious programs it to.


Why would that be a problem for the argument?
Janus October 31, 2021 at 21:28 #615247
Reply to frank As I pointed out before, the idea of a programmer is no part of Chalmers' philosophy. If you wanted to posit that there could be p-zombies among us, or we could even all be p-zombies, because we are living in a consciously programmed simulation, then that would not be incoherent, but just a silly idea we could have no reason to believe.
frank October 31, 2021 at 21:30 #615248
Quoting Janus
As I pointed out before, the idea of a programmer is no part of Chalmers' philosophy. If you wanted to posit that there could be p-zombies among us, or we could even all be p-zombies, because we are living in a consciously programmed simulation, then that would not be incoherent, but just a silly idea we could have no reason to believe.


It's kind of blatantly obvious that you aren't familiar with Chalmers or philosophy of mind in general.
Janus October 31, 2021 at 21:37 #615255
Reply to frank It's kind of blatantly obvious that you're resorting to ad hominem because you can't mount a decent rebuttal. If you think Chalmers invokes conscious programmers, then cite the relevant text. On the other hand if you think there is a plausible explanation, other than being deliberately programmed, for purported zombies to be speaking about experiences that they, by definition, do not have, then present it.
frank October 31, 2021 at 21:39 #615257
Quoting Janus
It's kind of blatantly obvious that you're resorting to ad hominem because you can't mount a decent rebuttal


Not an ad hominem. It's just a fact.
Janus October 31, 2021 at 21:47 #615266
Reply to frank Just a fact! :rofl: You know nothing about me. As I thought; you can't come up with the goods. It's a little sad that you feel a need to resort to such tactics, Frank. What are you trying to defend?
frank October 31, 2021 at 21:52 #615270
Quoting Janus
Just a fact! :rofl: You know nothing about me. As I thought; you can't come up with the goods. It's a little sad that you feel a need to resort to such tactics, Frank. What are you trying to defend?


If you read a little about Chalmers' p-zombie argument it will become clear to you why your objections are irrelevant.

I have no interest in explaining it to you just to have you repeat your nonsense.
Janus October 31, 2021 at 22:11 #615288
Quoting frank
If you read a little about Chalmers' p-zombie argument it will become clear to you why your objections are irrelevant.

I have no interest in explaining it to you just to have you repeat your nonsense.


When someone has enough interest to make a comment but claims not to have enough interest to explain it, I smell something distinctly rotten.
InPitzotl November 01, 2021 at 00:07 #615360
Quoting frank
A computer can't tell you it's conscious?

I'm a bit lost. This is what a zombie is according to Chalmers:
Chalmers: A zombie is physically identical to a normal human being, but completely lacks conscious experience.

http://consc.net/zombies-on-the-web/

...that doesn't sound like a computer. So what's the objection here?
frank November 01, 2021 at 00:12 #615364
Reply to InPitzotl

That was a reply to Janus, who found it impossible to conceive of an entity that tells you it's conscious when it isn't.



Marchesk November 01, 2021 at 08:06 #615476
Quoting noAxioms
Not if a mechanical device is forbidden from using the word. If a thermostat doesn't feel warmth, then neither do I. I admit to pain being a rare one, with few devices having sensors to provide it.


Is the thermostat biologically equivalent to you? But this isn't about what words you can and can't use if you subscribe to this or that. It's about the fact that you do feel warmth. Whether you want to grant or deny that to thermostats is a different matter. I tend to think they're incapable of feeling warmth, although I wouldn't rule some form of panpsychism completely out.

I simply don't believe you when you claim skepticism about feeling warmth just because you can't say the same for thermostats in this discussion.
noAxioms November 01, 2021 at 17:46 #615577
Quoting InPitzotl
Of course they are. This is why they tend to say we have these properties, but these things over here, they don't.
Don't or can't? This seems awfully question-begging to me.
if you had your name scratched onto the toaster when I stole it, it will tend to still be scratched on there unless I scratched it off.
Would it make it not the same toaster if the name got scratched off, or was never there in the first place? Despite my calling it a 'legal identity' (an old habit), I'm not talking about being able to prove the fact to a court of law. I'm talking about it actually being the toaster in question or not.
Perhaps 'pragmatic identity' fits better.

We might even say certain arrangements of physical objects have privileged status, raising them above other arrangements.
That's not the story being pushed: Quoting InPitzotl
A zombie is physically identical to a normal human being, but completely lacks conscious experience.
— consc.net/zombies-on-the-web

Per this assertion, the lack of privilege does not come from a defect or other difference in the physical arrangement.

The "I" I accused you of having is simply a unit of theory of mind as it applies to the linguistic aspect of your posts.
Not how I'm using it when I make a distinction. The "I" refers to the non-physical experiencer, the thing that gives the privilege.
You, OTOH, meet the requirements to apply theory of mind to as humans above the age of five regularly do.
Age of five eh? Does that imply you were a zombie until some sufficient age? What do you experience before then?

Quoting frank
A computer can't tell you it's conscious?

Quoting Janus
Only if someone conscious programs it to.

While there are plenty of computers running fixed algorithms that just play pre-recorded messages (a typical phone tree, for instance), a true AI isn't programmed to say any specific words. It learns them, same as you do. Its programming may have been done by another computer, probably better than the 'conscious programmer' would have managed. Your assertions are rapidly going to be demonstrated false as capabilities improve.

InPitzotl November 01, 2021 at 18:59 #615594
Quoting noAxioms
Don't or can't?

Irrelevant. This is ostensive.
Quoting noAxioms
Would it make it not the same toaster if the name got scratched off

Yes.
Quoting noAxioms
I'm talking about it actually being the toaster in question or not.

If you say so, but all I can talk about is what I mean by "being the same".
Quoting noAxioms
That's not the story being pushed

That's David Chalmers' story. I'm not David.
Quoting noAxioms
Not how I'm using it when I make a distinction.

You said the toaster feels warm. It doesn't matter how you're using the word "I"... the toaster doesn't feel warm. It lacks the parts.
Quoting noAxioms
Age of five eh?

Yes; by that age, most humans learn theory of mind.
Quoting noAxioms
Does that imply you were a zombie until some sufficient age?

No, it implies that you can for example pass the Sally Anne test. Theory of Mind has nothing to do with p-zombies.
noAxioms November 02, 2021 at 02:07 #615795
Quoting Marchesk
Is the thermostat biologically equivalent to you?

Most thermostats are not biological, and are thus not the biological equivalent of anything.
I am, however, presenting one as a crude mechanical equivalent of "processing data which could be interpreted as 'feeling warmth'", which is what I claim noAx does.
Sorry for the 3rd person reference, but some of the prior posts have been getting a bit undefined as to what exactly 'I' refers to. noAx is the physical biological human.

But this isn't about what words you can and can't use if you subscribe to this or that.
Yes it is. Chalmers forbids the usage of 'feels warmth' for the zombie, and the thermostat is a zombie, lacking the added bit that is the difference between zombies and humans. So the word is forbidden, despite the fact that the thermostat measures (via physics!) temperature and reacts to it, exactly as the zombie does. The vocabulary is reserved (by proponents of the existence of the 'additional bit') for objects that have that additional supernatural bit, as evidenced by assertions of 'lies' when the zombie claims that he also feels warmth.
I chose the thermostat since it is the ultimate in trivial data processing. A sensor and a single mercury switch is enough, the opposite end of the complexity spectrum compared to noAx, but fundamentally doing the same thing.
A rock also reacts to warmth, possibly by breaking, but it lacks both explicit sensor and an information processor, at least of the sort with which we're typically familiar.
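To make the triviality concrete, here is a minimal sketch (in Python, purely illustrative, not anyone's actual claim about how thermostats are built) of everything a mercury-switch thermostat "does":

```python
# A mercury-switch thermostat reduced to its essentials:
# sense a temperature, compare it to a setpoint, close or open a switch.
def thermostat(temperature_c, setpoint_c=20.0):
    """Return True if the heating switch should be closed (heat on)."""
    return temperature_c < setpoint_c

# The device "reacts to warmth" with nothing but this comparison.
print(thermostat(15.0))  # cold room: heat on
print(thermostat(25.0))  # warm room: heat off
```

Whether that comparison deserves the words "feels warmth" is exactly what's in dispute.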

It's about the fact that you do feel warmth.
Chalmers claims that the feeling of warmth can only be had by some supernatural experiencer, and since noAx is not one of those, noAx can no more feel warmth than the thermostat can. I refuse to be placed in a privileged category over it, unearned.

I tend to think [thermostats] are incapable of feeling warmth, although I wouldn't rule some form of panpsychism completely out.
Oooh, magic sauce!

I simply don't believe you when you claim skepticism about feeling warmth just because you can't say the same for thermostats in this discussion.
I'm merely skeptical of this immaterial experiencer/possessor, or of the magic sauce, or however it's presented. Surely I'm not the first person on these forums skeptical of dualism.



Marchesk November 02, 2021 at 10:47 #615870
Quoting noAxioms
I'm merely skeptical of this immaterial experiencer/possessor, or of the magic sauce, or however it's presented. Surely I'm not the first person on these forums skeptical of dualism.


Use whatever terms you like, but your first-person experiences of warmth, pain, color, etc. are not part of the physical descriptions of the world. Thus the motivation for p-zombies. I think it's a problem with the description of the world, as in it's leaving something out, not that p-zombies are actually possible.

Quoting noAxioms
I chose the thermostat since it is the ultimate in trivial data processing. A sensor and a single mercury switch is enough, the opposite end of the complexity spectrum compared to noAx, but fundamentally doing the same thing.


The mercury responds to thermal energy. Cold and warmth are relative to organisms that need to survive in a certain temperature range.
noAxioms November 02, 2021 at 16:01 #615943
Quoting Marchesk
User whatever terms you like, but your first person experiences of warmth, pain, color, etc. are not part of the physical descriptions of the world.

Your opinion is noted, but it provides negligible evidence falsifying an alternate one.

I find the p-zombie description not only possible, but a more accurate description of what's going on. A biological being such as you describe (with agency) would have evolved differently than what we see. Everybody tends to selection-bias that away. I did an old thread on the subject on the defunct PF. Can't reference it anymore. :sad:

Pantagruel November 05, 2021 at 13:30 #617058
P-zombies, it could be argued, are logically possible but not metaphysically possible; it is an example that tends to ring hollow for a lot of people.

Last week @180 Proof recommended the book Descartes' Error in the thread on emotional intelligence (thanks!) I just finished it, and right at the end Damasio describes an interesting case of a pre-frontal leucotomy (where portions of the pre-frontal cortex are removed) which was performed in the attempt to alleviate debilitating neuralgia:

Two days after the operation, when Lima and I visited on rounds, he was a different person. He looked relaxed, like anyone else, and was happily absorbed in a game of cards with a companion in his hospital room. Lima asked him about the pain. The man looked up and said cheerfully: "Oh, the pains are the same, but I feel fine now, thank you." Clearly, what the operation seemed to have done, then, was abolish the emotional reaction that is part of what we call pain. It had ended the man's suffering. His facial expression, his voice, and his deportment were those one associates with pleasant states, not pain.

Does this example support the notion of a p-zombie? Or the opposite?
Reply to frank Reply to Janus Reply to noAxioms Reply to InPitzotl Reply to Marchesk
frank November 05, 2021 at 17:55 #617144
Quoting Pantagruel
P-zombies it could be argued are logically possible, but not metaphysically possible


The two are closely linked. For metaphysical possibility, you can just posit a god that makes it happen.

Physical possibility is trickier. That's whether it's possible in our world.
Pantagruel November 05, 2021 at 18:05 #617147
Quoting frank
The two are closely linked. For metaphysical possibility, you can just posit a god that makes it happen.

Physical possibility is trickier. That's whether it's possible in our world.


Yes, the example I give happened in our world. That was what I was thinking. Chalmers calls them logical, metaphysical and natural possibility.
Janus November 05, 2021 at 22:27 #617236
Reply to Pantagruel An interesting case! It doesn't seem to me to support the idea of a p-zombie. The man was still aware of the neuralgic pain, which a p-zombie, by definition, could not be.

P-zombies would seem to be logically possible in the sense that a robot, for example, could be programmed to talk about all the same things we do without any of the kinds of experiences which prompt us to talk about them. On the other hand, the idea that an entity could be just like us, and talk about all the same experiences we do, without experiencing any of them and without being programmed to do so, seems illogical. Also, it seems physically impossible that an entity could be physically and behaviorally identical to a human in every respect and yet not experience any of the things we do, in fact not experience anything at all.
Pantagruel November 05, 2021 at 23:37 #617292
Reply to Janus This case may be one of being "aware of the phenomenological experience" without having the experience, though....
Janus November 06, 2021 at 00:33 #617335
Reply to Pantagruel I thought the usual definition of the p-zombie is that it has no self-awareness, or actually any awareness at all. The man in the example who had parts of the prefrontal cortex removed said that the pains were the same (which he could not know unless he felt them). He also said "I feel fine now".

I get the distinction between being able to feel pain and being bothered by feeling the pain. Mine was a difficult birth: my mother was thirty-seven hours in labour, and I was yanked out with forceps in the end. She told me the pain was extreme and they gave her morphine. She said she still felt the pain when under the influence of the morphine, but that it didn't bother her at all.
Pantagruel November 06, 2021 at 00:42 #617345
Reply to Janus Maybe the P-zombie is just really stoical.....
Janus November 06, 2021 at 00:48 #617353
Reply to Pantagruel If so, the p-zombie is not what we think it is then... :scream:
noAxioms November 06, 2021 at 00:54 #617359
Reply to Pantagruel The guy didn't behave the same afterwards, whereas the zombie behaves indistinguishably from a 'human' in the same situation, so it doesn't really fit the definition.

Nice example though. Be interesting to have the dualists give their explanation of it. Somehow the surgery seems to have cut off the 'uplink' for 'mal-data' to wherever it gets 'processed' in a way that could be interpreted as pain. The data still gets through to say the rational area, but not to the area which causes distress. How might a dualist explain the 'mind' still getting the data for pain, but not feeling it? It's not like the mind was modified, only some connection in the brain.

Just for info, I do notice that there are multiple 'minds' in me. They hold different (contradictory, even) beliefs, and one of the two is clearly in charge, but the other isn't epiphenomenal. But both of them seem physical. No funny external magic thingy.
Pantagruel November 06, 2021 at 10:52 #617480
Quoting noAxioms
The guy didn't behave the same afterwards, whereas the zombie behaves indistinguishably from a 'human' in the same situation, so it doesn't really fit the definition.


Right, so I guess it highlights the problem with the p-zombie hypothesis: is it plausible that a p-zombie could accurately report on phenomenal experiences without actually having them? It seems like a p-zombie would actually be in this boat....