I'm a bit lost. This is what a zombie is according to Chalmers: http://consc.net/zombies-on-the-web/ ...that doesn't sound like a computer. So what's ...
My use of mind here is metaphorical (a reference to the idiom "of one mind"). Incidentally, I think we do indeed agree on a whole lot of stuff... our ...
Of course they are. This is why they tend to say we have these properties, but these things over here, they don't. They are ostensively pointing to th...
It was not a put-down. I'm not just generically using braggart language here; you're literally one step behind. The water example is a response to the...
I am pretty sure you're at least one step behind, not ahead of, the post you just replied to. This is clumsily phrased. Phlogiston theory is a theory ...
I cry foul here. Imagine a believer of the classical elements telling you that he just fetched a pail of water from the well. When you ask the guy wha...
No, being agentively integrated is what makes me (and you) an individual. We might say you're an individual because you are "of one mind". For b...
This phrase sounds suspicious. There's a me, but there's no I being me? Also, there's definitely an "I" there. Something typed an entire grammatically...
I'm not sure what "want" means to the precision you're asking. The implication here is that every agentive action involves an agent that wants somethi...
Ah, finally... the right question. But why not? Be precise... it's really the same question both ways. What makes the robot not have a goal, and ...
Why not? But I want you to really answer the question, so I'm going to carve out a criterion. Why am I wrong to say the robot is being agentive? And th...
In terms of explaining agentive acts, I don't think we care. I don't have to answer the question of what my cat is thinking when he's following me aro...
I've no idea why you think it muddies the water... I think it's much clearer to explain why shaking after drinking coffee isn't agentive yet shaking w...
So answer it. The question is, why is it agentive to shake when I dance, but not to shake when I drink too much coffee? And this: ...doesn't answer th...
Just to remind you what you said exactly one post prior. Of course the robot interacts with bananas. It went to the store and got bananas. What you re...
Zombies are functionally equivalent to conscious entities. Generically different entities have different evolutionary histories (because "you count to...
Generically, I reject fate outright. I'm agnostic about determinism. And I'm agnostic about free will. This definition of fate roughly fits into the c...
I'm pretty sure if you understood what I was saying, you would see there's no contradiction. So if you are under the impression there's a contradictio...
I think you're running down the garden path. I'm a human. I experience things. I also understand things. I can do things like play perfect tic tac toe...
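As an aside on the "perfect tic tac toe" claim above: playing perfectly is a purely mechanical procedure, which is exactly why it's a useful contrast case. A minimal sketch using the standard minimax algorithm (all function names here are illustrative, not from the original post) shows that perfect play from an empty board is a draw:

```python
# Minimal minimax sketch for tic tac toe. Board is a list of 9 cells,
# each None, 'X', or 'O'. Scores: +1 if X wins, -1 if O wins, 0 for a draw.

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if a line is complete, else None."""
    for i, j, k in LINES:
        if board[i] and board[i] == board[j] == board[k]:
            return board[i]
    return None

def minimax(board, player):
    """Value of the position with `player` to move, under perfect play."""
    w = winner(board)
    if w == 'X':
        return 1
    if w == 'O':
        return -1
    if all(board):          # board full, no winner: draw
        return 0
    scores = []
    for i in range(9):
        if board[i] is None:
            board[i] = player
            scores.append(minimax(board, 'O' if player == 'X' else 'X'))
            board[i] = None
    # X maximizes the score, O minimizes it.
    return max(scores) if player == 'X' else min(scores)

# Perfect play by both sides from the empty board yields a draw.
print(minimax([None] * 9, 'X'))  # → 0
```

The point is not the code itself but that this exhausts what "playing perfect tic tac toe" functionally requires; whether anything in the neighborhood of experience or understanding is needed on top of it is exactly what the thread is disputing.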
Space in and of itself is very important to agents... agents need to manipulate their environment, and space is essentially where the environment is. ...
So if a mind can give a complete description of a photon, then the photon is independent of the mind. But if the mind cannot give a complete descripti...
Let me phrase it this way. Imagine we make a robot driver that will stop at a red light; we need not add experience to the robot. By comparison, I'm a...
But it would be trivial, and tautological in a meaningless sense, to say that functional sight excludes the experience of sight. Words are boxes, and ...
This would imply that the experience of sight is a non-functioning element of sight. But surely the experience of sight is at a minimum functionally n...
Insofar as it's new knowledge, it's necessarily knowledge about particular kinds of mental states. The question is, why can't those be brain states? B...
Try this... Mary is not really learning anything about "red" (the Jane/Joe/LED thing); she is learning something about her experiencing. Now let's wea...
...as opposed to knowledge of something physical. If it's physical, it would likely be a set of states Mary has. ...or some set of physical states of ...
Non-physical means not physical; it does not mean novel. It appears you're using "novel" to establish that this is not physical. That does not seem su...
With a little more precision, let's assume indeed Mary had the ability to see red. By that I mean that if Mary sees a 750nm LED glowing, then Mary has...
Nonsense. I want to pause here and take note of something very specific. The claim under scrutiny is whether physicalism is challenged by this or not....
It's kind of presumptuous to diagnose disagreements. You should just state your business, not theorize what you think is wrong with me such that I dar...
You're confused. khaled's objection is valid because the thought experiment specifically mentions Mary knows everything physical. If I know everything...
Sure, probably. But another possibility would be that Mary doesn't so much "learn" what it's like for her to see red, as she "develops a way for her t...
Not really. Let's define 750nm monochromatic light as red (monochromatic is key in the definition; and what we really mean is that only 750nm light is...
How is that minimal? You can make white by mixing two wavelengths; you're using three, a whole extra wavelength beyond the requirement! Also, didn't y...
It's not really the same thing, in short. Language does more than what perception does, and perception does more than what language does. They deserve...
There's language translation, and there's wrong. What color are a polar bear, Santa's beard, and snow? Your thought experiment is misguided. 7 is a num...
Eyes do not perceive, so the answer to the question is no (I'm sure you didn't literally mean that eyes perceive, but you have to be specific here eno...
Just a quick reminder... we're not talking about robots in general. We're talking about a robot that can manage to go to the store and get me some ban...
Pain is a feeling. Shopping is an act. If I see a person walking through the store, looking at various items, picking up some of them and putting them...
Your example isn't even an example of what you are claiming, unless you seriously expect me to believe that you believe persons with congenital analge...
And yet, Josh (guessing) does not understand Sanskrit, and you do not understand understanding. A person who does not understand something does not un...
The concept of understanding you talked about on this thread doesn't even apply to humans. If "the reason" the robot doesn't understand is because the...