Could a philosophical zombie verify that it is a philosophical zombie?

A philosophical zombie (p-zombie) is an entity that is externally and behaviorally indistinguishable from a conscious entity, but lacks inner conscious experience, a.k.a. qualia.

See the articles “Zombies” and “Consciousness - Objection 4: Zombies”.

A common idea in thought experiments involving a p-zombie, similar to the Chinese room argument, is that it would be impossible to externally verify the difference between the two, even though they are internally different. Purportedly, if you asked a p-zombie whether it was conscious, it would say yes, because all of its cognitive processing abilities are sufficiently in place to produce, by computation alone, an answer identical to the one a conscious being would give. (Something like this can already sometimes be seen in certain conversations with an AI like ChatGPT.)

If you asked a p-zombie whether it can “see” blue, it would say “yes”, and this would be false.

Consider a different kind of p-zombie. It is capable of giving any answer a human would give, because it fully understands the logic of human cognition. It has never seen blue, but it has a flawless understanding of all the facts relating to blueness. This p-zombie can effectively lie in response to any question and be indistinguishable from someone conscious. However, this p-zombie is aware that it is lying: it knows that it cannot actually see blue.

Is this possible? I wonder whether something conscious requires some kind of baseline “qualic” experience in order even to perceive which qualia it cannot perceive.

It is said that some species have more types of cone cells in their eyes, allowing them to see colors humans cannot. It boggles the mind to try to imagine a new color; somehow, we cannot.

I wonder whether it is logically impossible for something with no access at all to the sphere of qualic being even to pose to itself the question of whether it does or does not perceive. (I think it has been argued that a p-zombie is logically impossible.)
