You are viewing the historical archive of The Philosophy Forum.

The mind-brain problem?

Belter June 02, 2018 at 11:04 14600 views 53 comments
Individuals, minds and brains are clearly different, and so are their mutual relations.
Sociology explains society through individual mechanisms.
Psychology explains the individual through mental mechanisms.
Biology explains the mind through brain mechanisms.
Why is the relation between these concepts and scientific domains often considered a philosophical puzzle?

Comments (53)

SteveKlinko June 02, 2018 at 11:56 #184570
I think the problem with Biology and Brain Mechanisms can be summed up by asking a simple question about one particular aspect of Consciousness ... Given:

1) Neural Activity for Red happens
2) A Conscious Red experience happens

How does 1 produce or cause 2?

Nobody knows the answer to this. It is the Hard Problem of Consciousness and it is also the Explanatory Gap of Consciousness.
Belter June 02, 2018 at 15:00 #184625
Reply to SteveKlinko

I think that 1) is not the cause of 2) but its biological mechanism.
The causes of the Red experience are related to seeing a red object, hallucinating it, etc.
Pattern-chaser June 02, 2018 at 15:52 #184643
Reply to Belter I don't want to disrupt your thread, but I observe that you are deploying exclusively scientific tools to address these issues. Sociology tells us what an intelligent alien could deduce about us by remote observation. I think this problem could use a more intimate understanding than that. We need the human version of sociology, I think. I don't know what you'd call it, but it concerns what being a human means to a human. Similarly for psychology.

The mind-brain problem is a tricky one that requires (IMO) all the tools, techniques and perspectives we have available. Science is only part of that.
Belter June 02, 2018 at 16:15 #184649
Thanks Reply to Pattern-chaser
My interest is not to elude the philosophical problems but to try to understand them. But in this case I sincerely cannot see one.
In my view, other minds that do not use brains are possible, such as an A.I. or a hypothetical alien civilization with a very different biology, but I cannot see the philosophical problem of a mind operating with something other than human neurons.
SteveKlinko June 03, 2018 at 13:07 #184985
Quoting Belter
I think that 1) is not the cause of 2) but its biological mechanism.
The causes of the Red experience are related to seeing a red object, hallucinating it, etc.

There may be multiple reasons why the Red experience happens but with a normal Human being, if Red Neurons fire there will be a Red experience. Seems to me a Biological Mechanism is a cause. I just want to know what the Biological Mechanism is that accomplishes this.
tom June 03, 2018 at 13:33 #184991
Quoting SteveKlinko
There may be multiple reasons why the Red experience happens but with a normal Human being, if Red Neurons fire there will be a Red experience. Seems to me a Biological Mechanism is a cause. I just want to know what the Biological Mechanism is that accomplishes this.


When the red sensors in a CCD fire, will there be red experience?
Harry Hindu June 03, 2018 at 14:05 #185008
Quoting Belter
In my view, other minds that do not use brains are possible, such as an A.I. or a hypothetical alien civilization with a very different biology, but I cannot see the philosophical problem of a mind operating with something other than human neurons.

What is a mind for you to start declaring what has it and what doesn't?

What does it mean to "think", or to "have experiences"? What is "consciousness"?

It seems to me that the problem lies in dualism itself - the idea that somehow the mind and body are so totally different that they cannot relate or interact. You seem to think that it is impossible for a particular set of active neurons to represent some active part of the mind. Why? What argument do you have against it?

Seeing a bent stick in water is evidence that we see a model of the world based on the information we receive in light when it enters our eyes. We don't see bent straws in water. We see bent light passing through the water and glass.

In the same manner, when we look at other people and their brains, we see a model that is constructed with the information we receive from the light entering our eyes. Their body and their brain is a model for the underlying mental processes that they experience. Brains are mental models of other people's mental activity. You might say that the brain is the macroscopic expression of mental processes, which could be extremely small, or part of another process that we have yet to discover and explain.
Belter June 03, 2018 at 17:21 #185048
Quoting SteveKlinko
Seems to me a Biological Mechanism is a cause


In my view, the brain is not the cause of mind, in the same way that molecular movement is not the cause of temperature but its physical mechanism (how heat happens at the most detailed and reliable level).
People think by using their brains, and objects become hot as the movement of their molecules increases.
Causes are the "why" something happens and mechanisms are the "how".
People think due to many different causes, but they seem to do it with only one mechanism (neural synchrony), in the same way that an object can become hot due to many causes, but it seems always to be by the same mechanism (molecular movement).
That is, the individual "X", the mind state "M" of X, and the brain state "B" of M are not identical, so the relation between X and M corresponds to psychology, and that of M and B to biology.
If you think that between brain and mind there is a causal relation, then you must also think that the cause of your cutting onions is the knife, and not the aim of making a salad.
Belter June 03, 2018 at 17:29 #185051
Quoting Harry Hindu
What does it mean to "think", or to "have experiences"? What is "consciousness"?


In my opinion, they are concepts that explain what happens with individuals. Mind is what we postulate to model how individuals do what they do. Brain activity is what we postulate to model how minds do what they do.
SteveKlinko June 03, 2018 at 23:16 #185131
Quoting tom
When the red sensors in a CCD fire, will there be red experience?

I doubt it, but who knows? If we knew how, maybe we could ask the Conscious Mind, if there is one, that is connected to the camera whether it had a Red experience. However, I do know that when Red Neurons fire there is a Red experience for Humans.
SteveKlinko June 03, 2018 at 23:25 #185135
Quoting Belter
Causes are the "why" something happens and mechanism are the "how"

I think it would be proper grammar to say that I want to know the Mechanism that Causes the Conscious Red experience to happen. I believe this means the Mechanism is the Cause. Of course this is all semantics.
Wayfarer June 04, 2018 at 00:23 #185144
Quoting Pattern-chaser
We need the human version of sociology, I think. I don't know what you'd call it, but it concerns what being a human means to a human.


That's possibly the original intent of philosophy itself.

Sociology was devised by Auguste Comte, and is an important aspect of the European Enlightenment. As such it was intended to move from an understanding based on metaphysics to one based on science. This however was long before all of the drawbacks of such positivism - and that term was also devised by Comte - became fully apparent.
Belter June 04, 2018 at 05:44 #185172
Quoting SteveKlinko
I believe this means the Mechanism is the Cause. Of course this is all semantics.


It is not just semantics, in my opinion. Smoking causes cancer, but the mechanism of the cancer is something other than smoking.
Harry Hindu June 04, 2018 at 13:59 #185302
Quoting Belter
In my opinion, they are concepts that explain what happens with individuals. Mind is what we postulate to model how individuals do what they do. Brain activity is what we postulate to model how minds do what they do.

Huh..wha?
tom June 04, 2018 at 15:51 #185343
Quoting SteveKlinko
I doubt it, but who knows? If we knew how, maybe we could ask the Conscious Mind, if there is one, that is connected to the camera whether it had a Red experience. However, I do know that when Red Neurons fire there is a Red experience for Humans.


I think it's jumping the shark to entertain the idea that a CCD possesses qualia. Asking the conscious mind attached to the CCD would be no more useful than asking the conscious mind attached to a retina.

Yes, we get it, humans have qualia, but if "red neurones" cause qualia in humans, then why don't they in animals, or robots?
Belter June 04, 2018 at 18:41 #185397
Reply to Harry Hindu

My lemma in this matter is: "Individuals think through the brain"

"Mind" is what we postulate to explain individual performance, for example:
- Why is John crying? (action)
- Because he feels pain. (mind)
A very different question is:
- How does he feel pain? (mind)
- Through the activation of C-fibers in his brain. (brain)

The Hard Problem, as I understand it, claims that there is, on the one hand, 1) a "descriptive" problem, often called the NCC (neural correlates of consciousness) problem; and 2) another "theoretic" one, which consists in obtaining a theory about how these correlates "cause" experience or consciousness.

In my opinion, this view fails directly insofar as "Y" cannot simultaneously be a "correlate" (a relation between two effects) and a "cause" of X (a relation between cause and effect). When two variables are correlated (show the same effects), it is assumed that there is a common factor (cause) acting on them. For example, hair loss and the decrease in height in humans could be correlated, due to aging in both cases.
Thus, the NCC is only a "philosophical" problem insofar as it tells us what the "matter" or "substance" of consciousness is; the physical mechanism through which we think. But it might be transformed into the NMC problem, that is: the neural mechanism of consciousness.

The brain cannot be the cause of mind, in the same way that your intestinal processes are not the cause of your digestion but (part of) its mechanism.
Harry Hindu June 04, 2018 at 21:28 #185439
Reply to Belter But what I've been saying is: what if the mind creates the brain and its neurons as a model of another's mind? When you look at another person's mind, you see a brain.
MiloL June 05, 2018 at 03:14 #185548
Quoting tom
Yes, we get it, humans have qualia, but if "red neurones" cause qualia in humans, then why don't they in animals, or robots.


I've mentioned this in part before, but I'll try this a little differently. Machine = Body, Brain = Mind, Self = Self. OK, so the Self operates the machine by way of the mind. Frequently the comparison between humans and animals questions why/how us and not them. This is easily explained if you consider that the non-human inhabitants of this planet are in so many ways anthropomorphized because of those examples of emotion that challenge our conceptions of their intellect and sentience. I mean, if they were anything like us, really, why would they still live in the condition they do? Obviously not smart enough, right? OK, I'll give you that, but macro it out for a second.

Bear with me here.

If you were designing this planet and the life on it, it seems perfectly reasonable that you would have all your creations serving a particular purpose, all in service of your greater creation. Following this logic, everyone but us has a purpose. The dung beetles, the lions, blah, blah... you get where I'm going. Humans, however, don't have anything specific that ties us to terrain or the planet (in general terms) for that matter. At least until the lack-of-gravity issue brings further word. In the meantime we remain significantly different and capable, but what makes us different?

For this we look at a simple PC. Why? The Machine operated by the self via the mind is built with a BIOS and an OS. In animals the OS is preloaded; in humans it is not. Our children are helpless at birth because they have only the BIOS installed. Sleep, eat, cry... fear, happy... all the basics preloaded, but humans are the original AI. Our desires and efforts are nothing more than an extension of our collective need for answers about our creation, less our creators. We are trying to recreate ourselves.

The human operating system is built almost entirely on experience. It is a model of the adaptive AI programming being dreamed about. As many of you know, the BIOS and OS may work in concert, but they are entirely separate, and in humans the variation is evident in the people around us. Now animals, while prone to surprise us, and while they certainly must have mechanisms to adapt their OS when it comes to interacting with and understanding their surroundings, are bound to the limitations of their preloaded OS. Kind of reminds me of a Tandy from years ago.

Which is interesting, since what I'm really talking about is design. Since you are the creator, and you've created a slew of different types of animals, insects, birds, etc., you wanted to build something without the limitations of that which you've created so far. You've started countless projects and watched as they evolved. This time you wanted to do something different, better. So you create something that can learn from everything else you created. It can survive in all the places you've made and endure all the conditions of life you've put in place.

Then it hits you. You have to create something not specifically attached, as with fins, wings and the symbiotic relationships designed between creatures. This time you'd have to ensure they didn't have any of the predefined attachments like the others. This time the focus would be on adaptive learning, but it would have to start empty and collect data as it goes. It would have that preloaded BIOS to ensure it at least has a basic idea about danger and things that threaten its survival, much in the way a child knows it's hungry and must eat but has to learn to meet the need. In this way it can learn from the area in which it is. It's why frogs aren't born in the desert looking for flies near a lake (crude example, I know, but it's late and you still get what I mean). The end result is a creature who can do well everything we've done leading us to this point. The risk is certainly huge for such a creator, because no one wants their creation to destroy itself, but that adaptive learning is what allows us to become what we have and will, while animals remain animals.
Belter June 05, 2018 at 06:15 #185599
Reply to Harry Hindu

I agree in part. But "mind" does not "create" the brain. A "model" and the phenomena modeled are two completely different things.
SteveKlinko June 05, 2018 at 10:47 #185644
Quoting tom
I doubt it, but who knows? If we knew how, maybe we could ask the Conscious Mind, if there is one, that is connected to the camera whether it had a Red experience. However, I do know that when Red Neurons fire there is a Red experience for Humans. — SteveKlinko
I think it's jumping the shark to entertain the idea that a CCD possesses qualia. Asking the conscious mind attached to the CCD would be no more useful than asking the conscious mind attached to a retina.

Yes, we get it, humans have qualia, but if "red neurones" cause qualia in humans, then why don't they in animals, or robots

Ok I was just playing along with you in answering the question the way I did. The reality is that we don't know anything about how our own Conscious experience of Red happens. We need to figure that out first before we can ask questions about CCDs.
SteveKlinko June 05, 2018 at 11:02 #185652
Quoting Belter
I believe this means the Mechanism is the Cause. Of course this is all semantics. — SteveKlinko
It is not just semantics, in my opinion. Smoking causes cancer, but the mechanism of the cancer is something other than smoking.

Your Smoking analogy has added a step in front of the problem. Smoking could be analogous to Looking at something Red. If I said that Looking at something Red causes the Red experience then that would be the same as saying Smoking causes cancer. Then you could say that Looking does not cause the Red experience but that there is some deeper Mechanism involving Neurons that is the cause. With the Consciousness problem we are already deep into the problem from the start. The analogous starting point in the Smoking analogy would be to ask the question how does Tar and Nicotine cause Cancer? It is Semantics.
tom June 05, 2018 at 12:38 #185677
Quoting SteveKlinko
The reality is that we don't know anything about how our own Conscious experience of Red happens. We need to figure that out first before we can ask questions about CCDs.


We know that neither "red neurones" nor CCDs cause qualia. That is impossible.
Belter June 05, 2018 at 16:50 #185770
Quoting SteveKlinko
The reality is that we don't know anything about how our own Conscious experience of Red happens


The hard problem introduces an additional problem that in my view does not exist. When Red-Neurons are firing in X, the conscious experience of Red happens in X (X experiences a quale). We know it from psychological experiments. But qualia, like digestion, are nontransferable: they are relative to concrete individuals. You cannot see what another person sees, in the same way that you cannot perform his digestion but only your own. Qualia are like time in relativity theory: they are relative to a reference system, which experiences them, but they are not absolute.
The question "How do C-fibers permit a subject to feel pain?" is answered "Through their firing when harm to him is perceived". And we can continue questioning: "Why does that happen?", and respond: "Because evolution selected this way of feeling pain in humans". Some people think that this is insufficient, but there are no more steps.
tom June 05, 2018 at 19:25 #185806
Quoting Belter
When Red-Neurons are firing in X, the conscious experience of Red happens in X (X experiences a quale). We know it from psychological experiments.


You know it's true from psychological experiments on fish, lizards, and robots. I doubt it.

Belter June 06, 2018 at 05:09 #185961
Quoting tom
You know it's true from psychological experiments on fish, lizards, and robots. I doubt it.


Fish, lizards and robots all use some kind of "brain", in the sense of a material system for thinking. A mind that happens without some form of brain is for me not conceivable.

tom June 06, 2018 at 07:49 #185990
Quoting Belter
Fish, lizards and robots all use some kind of "brain", in the sense of a material system for thinking. A mind that happens without some form of brain is for me not conceivable.


Fish, lizards and robots don't have minds though. None of them possess qualia.
Belter June 06, 2018 at 08:02 #185992
Reply to tom

Quoting tom
Fish, lizards and robots don't have minds though. None of them possess qualia.


They have qualia (except robots, if they do not have a "brain", even a non-"cellular" one) for the same (evolutionary) reasons as us: to perceive and predict the world and ourselves.
tom June 06, 2018 at 08:42 #185995
Quoting Belter
They have qualia (except robots, if they do not have a "brain", even a non-"cellular" one) for the same (evolutionary) reasons as us: to perceive and predict the world and ourselves.


Robots can be programmed to do all that without qualia.

You are simply making an unwarranted assumption, when the opposite is practically certain. What we know of qualia, apart from our personal experiences, is that only knowledge-creating entities can have them. Fish don't create knowledge, and they don't possess qualia. They have no use for them.
Belter June 06, 2018 at 08:53 #185997
Quoting tom
Robots can be programmed to do all that without qualia


If you program a robot to see colors, why do you think it does not have "qualia" like those of humans, whom evolution programmed for that? For me that is an unwarranted assumption.
The brain was the evolutionary solution to the need to represent the world for individuals that have to move. No reason justifies the claim that only the human brain, and no other, permits individuals to think. Why would only humans be "knowledge-creating entities"?
wellwisher June 06, 2018 at 11:14 #186008
Reply to Belter

The difference between brain and mind is analogous to the difference between hardware (biology) and software (psychology). Software, by itself, is nothing but a static DVD or thumb drive with semiconductor bits and bytes. You need hardware to run it to see what it does. Hardware, by itself, is also inert without an operating system and software. You need both to get the full effect of consciousness.

Hardware is easier to investigate, since it is outside us, and therefore a third-person object that is more in tune with the philosophy of science. Software is not as well tuned to the philosophy of science, since you can't necessarily infer what static software can do by looking at it in the third person, unless you are an expert at coding.

The variety of orientations in psychology shows that there is no coding standard. Therefore, this half of the puzzle becomes more subjective. Hardware is easier to standardize because we can see the processor versus the mother board and this can be standardized.

Learning the coding for the brain's software is an internal process of introspection. You cannot learn this from the outside, since output effects always involve hardware; cross-contamination. If someone smiles, this may start with software, but an outside observation becomes filtered through the hardware, thereby biasing any attempt to infer the software from this extra second-hand data.

For example, game software played through different-quality hardware can perform quite differently. One set of hardware can show a lag that has nothing to do with the software but is a hardware effect. Another set of hardware can show fidelity enhancements that may be due more to better hardware or to secondary sound and video software.

As such, one needs to filter out the hardware through self-observation, so you can factor out any third-person hardware effects and observe the software in the first person. However, this approach is not fully supported by the philosophy of science, since it cannot be observed by others.

For example, if I am having a dream, there is no way for others to see the dream details that I can see, to verify what I see. However, this is first-hand data for software analysis. It occurs deep in the brain, and my consciousness is not aware of my legs moving or my facial expressions changing. It is closer to software only. Enough practice and experience allows one to narrow down coding loops. Over time one can even map out the software side of the psyche.
SteveKlinko June 06, 2018 at 11:22 #186011
Quoting Belter
The hard problem introduces an additional problem that in my view does not exist. When Red-Neurons are firing in X, the conscious experience of Red happens in X (X experiences a quale).

You make this statement while saying there is no Problem. Here's the Problem ... Given:

1) Red-Neurons are firing in X
2) Conscious experience of Red happen in X

How does 2 happen when 1 happens? I think your main argument is that the Hard Problem does not exist because of an improper use of language when asking this question. You might be correct, but you have not explained what exactly the problem with the question is.

For now I see the question as a huge Problem. It is the Hard Problem of Consciousness. It is the Explanatory Gap on full display. Nobody knows how this works.
Belter June 06, 2018 at 11:34 #186013
Quoting SteveKlinko
How does 2 happen when 1 happens?


In my view, the question is badly formulated. How the knife cuts the onion is a scientific question: it simply "cuts" it, separating it into different parts. If you continue asking, once you assume that 2 happens by 1, you will only obtain biological details: "How do people think with the brain?" is answered "By circuits, cores, modules, for the different competences, faculties, etc.". But even though we do not yet have an advanced theory of mind (neuroscience is very young), that does not mean it is a different problem from a psychological one.
SteveKlinko June 06, 2018 at 11:41 #186015
Quoting Belter
In my view, the question is badly formulated. How the knife cuts the onion is a scientific question: it simply "cuts" it, separating it into different parts. If you continue asking, once you assume that 2 happens by 1, you will only obtain biological details: "How do people think with the brain?" is answered "By circuits, cores, modules, for the different competences, faculties, etc.". But even though we do not yet have an advanced theory of mind (neuroscience is very young), that does not mean it is a different problem from a psychological one.

I guess we will have to disagree on the Hard Problem. I have given it my best shot and have obviously failed to convince you.
Belter June 06, 2018 at 11:43 #186016
Reply to wellwisher

I agree in general. It is possible that the qualia of other persons, but also our own, are differentiated by actions, like when you perform some action to know whether you're dreaming. A priori, we do not know if we are dreaming or not. Even from a subjective point of view, qualia must be "inferred" from actions (including verbal actions), so they are not directly accessed. This leads us to a skeptical conclusion about the "immediateness" of our knowledge of consciousness.
tom June 06, 2018 at 12:56 #186034
Quoting Belter
If you program a robot to see colors, why do you think it does not have "qualia" like those of humans, whom evolution programmed for that? For me that is an unwarranted assumption.


So, if you attach a camera to your PC, the PC has qualia? Do you really think so?

Belter June 06, 2018 at 16:10 #186074
Quoting tom
So, if you attach a camera to your PC, the PC has qualia? Do you really think so?


You are the one who is seeing the PC, not the opposite. It seems to me very trivial. That is, the camera in the example is like human glasses: you extend, increase, etc., your vision through it, but the subject in the example is the one who is using the camera.
Belter June 06, 2018 at 16:27 #186078
Quoting SteveKlinko
I have given it my best shot and have obviously failed to convince you.


I think that a constructive analysis of the Hard Problem could be the following. It is not possible to differentiate a priori, for a given subject, whether he is experiencing a dream or hallucination, or instead reality. Then, to know whether something is a quale (a mind state known as a mind state; contrary to unconscious mind states, which are not qualia but just mind) is always a posteriori, by empirical evidence; for example, the trivial case of rubbing your eyes or pinching yourself. A conscious experience is either one stated as being about reality or about fiction. Consciousness is to differentiate mental and physical experiences. We cannot know a priori whether X is a physical red object or a mental red one. Problem 1) is about the mind, and 2) is about consciousness. But consciousness is by definition the ability to differentiate fiction and reality, so its scientific study may not be possible, because it is at the same time the instrument and the object of knowledge.
SteveKlinko June 07, 2018 at 10:41 #186248
Quoting Belter
We cannot know a priori whether X is a physical red object or a mental red one.

But you never see the actual Physical Red object even when you are looking right at it. You are always only Seeing the Mental, or Conscious, Surrogate of the Physical Red object. The Redness of the Red exists only in the Conscious World, or as I like to say it exists only in Conscious Space.
TheMadFool June 07, 2018 at 10:53 #186251
Reply to Belter In a video game, different levels, different bosses.
Anthony June 14, 2018 at 17:25 #187902
Quoting Belter
Fish, lizards and robots all use some kind of "brain", in the sense of a material system for thinking. A mind that happens without some form of brain is for me not conceivable.


Robots have a brain? I realize you're thinking in terms of functionalism. Still, robots just don't have brains. If they have circuits, wires and sensors, then honesty would denote them circuits, wires and sensors. A rose is a rose is a rose.

Wouldn't there be some sort of mind that forms between two different species when they communicate?
MetaphysicsNow June 14, 2018 at 22:07 #187963
Reply to Belter
"How do people think with the brain?"

This is a strange question - does it really make sense to suppose that we think with our brains?
I wash with soap.
I wave with my hand.
I laugh with my friend.
I accept a gift with gratitude.
I finish with a flourish.

In all these different cases of doing one thing with another, it makes sense to think of the doing in the absence of the thing that it is being done with. So, I guess I can think without my brain? Or perhaps I could think with someone else's brain, and not my own? If you do not think either scenario is possible, then the preposition "with" is not capturing what you take to be the connection between thinking and the brain.
Belter June 15, 2018 at 05:21 #188049
Quoting MetaphysicsNow
Or perhaps I could think with someone else's brain, and not my own?


I am an instrumentalist, functionalist, pragmatist, etc.: the brain is the evolutionary solution to the advantage that thinking has. The individual is the one who thinks, and the brain is the instrument, like the knife in cutting onions.
MetaphysicsNow June 15, 2018 at 06:36 #188058
Reply to Belter The terms "instrumentalism", "functionalism", "pragmatism" mean different things in different contexts, and in some contexts are not even compatible with each other. I presume you mean you subscribe to something like the following argument:
1) Mental states and occurrences are defined by their functional roles.
2) The functional roles so defined are filled by states of and occurrences in the brain (well, let's be honest, you'll need more than just a brain to fill some of these functional roles; the rest of the body will probably have to get a look-in).
3) Therefore, mental states and occurrences are brain (bodily) states and occurrences.

This is pretty much classic functional state identity theory - is that what you believe? If so, the knife/cutting-onions analogy does not work in this context, since the activity of cutting onions - even if it can be defined in terms of some functional role - requires more than a knife to fill it.


Belter June 15, 2018 at 06:40 #188059
Quoting MetaphysicsNow
The terms "instrumentalism", "functionalism", "pragmatism" mean different things in different contexts and in some contexts are not even compatible with each other.


And they can also mean the same thing, as in the context of my previous answer.
I explained my view in a previous post, so I will not repeat it. Basically, I can be considered some kind of zombie-argument hunter.
MetaphysicsNow June 15, 2018 at 06:48 #188060
Reply to Belter You can hunt the philosophical-zombie argument without being a functionalist, pragmatist or instrumentalist. The philosophical-zombie argument is just one of a range of different problems thrown at functional state identity theory. Some of what you say in this thread indicates to me that you do subscribe to that theory; some of what you say seems unclear. Hence I asked you the direct question, to get clear on exactly what it is you believe. So, I'll try again: do you believe that the argument I just gave is sound (i.e. is logically valid and has true premises)?
Belter June 15, 2018 at 06:54 #188061
Quoting MetaphysicsNow
1) Mental states and occurrences are defined by their functional roles.
2) The functional roles so defined are filled by states of and occurrences in the brain (well, let's be honest, you'll need more than just a brain to fill some of these functional roles; the rest of the body will probably have to get a look-in).
3) Therefore, mental states and occurrences are brain (bodily) states and occurrences.


I think that mental states are not defined by their functional roles, but brain states are. That is, in 1) you must change "mental" to "brain". Then 1) and 2) state the same thing, so 3) is actually "mental states are brain functions".
Belter June 15, 2018 at 07:04 #188062
Reply to MetaphysicsNow

I think, as I posted at the beginning, that there are 1) individuals (sociological level), which 2) use their brains (biological level) 3) to think (psychological level).
Basically, we explain 1) by 3), and 3) by 2).
MetaphysicsNow June 15, 2018 at 07:04 #188063
I think that mental states are not defined by their functional roles, but brain states are.


If I change "Mental" to "Brain" in premise 1, then for the argument to remain valid it has to have as its conclusion:
3) Therefore, brain states and occurrences are brain states and occurrences.

That would certainly be a logically valid argument, but then, with (3) as a conclusion, I could substitute anything for (1) and (2) and still have a valid argument, since (3) is a vacuous tautology.
Belter June 15, 2018 at 07:11 #188064
Reply to MetaphysicsNow
I tried to show that your premise 1) is not true, so your argument is lacking (or it is a vacuous tautology).
Belter June 15, 2018 at 07:33 #188066
Reply to MetaphysicsNow

Since I do not endorse that mental states have not a function, but that they are a function, of brain states, this argument about identity does not work for me. In my view, the relation between mind and individual is ontological: that is, subject X performs action Y, and then we say, for example, that X does Y due to thought T. Then, T is something that X also does. However, between brain and individual the same does not happen. The brain is the instrument by which X thinks T, so between T and brain state B there is an action/instrument relation, like cutting and knife (but not actor/action, as with X and T).
MetaphysicsNow June 15, 2018 at 07:48 #188068
Reply to Belter
Since I do not endorse that mental states have not a function

I'm sorry, but I'm having difficulties with the way you are expressing yourself.
Do you mean that, although you believe that there are mental states, you do not believe that they have a function?
Belter June 15, 2018 at 08:04 #188069
Reply to MetaphysicsNow

I want to say that mental states fit brain functions, and mental states also have functions (to predict movement, differentiate objects, etc.), but these are at the individual level. The individual level is often neglected in accounts of the mind-brain problem. We can account for this problem in a basic way by stating three levels and pragmatism: there are actors, actions, instruments, mechanisms, etc. The Hard Problem is lacking if you consider that it is not possible for an instrument working within a mechanism (a brain producing consciousness) not to realize the corresponding function or action (it is not possible to conceive of Chalmers's zombies, or C-fibers without feeling pain, just as it is not possible to conceive of a knife cutting without something being cut).
MetaphysicsNow June 15, 2018 at 10:24 #188086
Reply to Belter So you do think that mental states have functions. So what exactly is your problem with the first premise of the argument for functional state identity theory?
I want to say that mental states fit brain functions

I'm not sure what you mean by "fit" here - do you mean "are", do you mean "are caused by" or something else entirely?