If Brain States are Mental States...
1. Brain states are mental states.
2. Brain state vocabulary is scientific.
3. If brain states are mental states, then meaningful communication about mental states is meaningful communication about brain states.
4. Meaningful communication about brain states is impossible if two speakers do not have brain state vocabulary.
5. Bob and Sheila do not have brain state vocabulary.
6. Bob and Sheila can meaningfully communicate about mental states.
7. From (3), Bob and Sheila can meaningfully communicate about brain states.
8. (7) is false (because Bob and Sheila do not have brain state vocabulary).
9. Therefore, meaningful communication about mental states is not meaningful communication about brain states.
10. Therefore, (1) is false.
All right, have at me.
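One way to lay out the core of the reductio formally (a sketch: let $m$ stand for mental states, $b$ for brain states, and $C(x)$ abbreviate "Bob and Sheila can meaningfully communicate about $x$"):

```latex
\begin{align*}
&\text{(i)} && m = b && \text{(identity theory, assumed for reductio)}\\
&\text{(ii)} && m = b \rightarrow (C(m) \leftrightarrow C(b)) && \text{(substitution of identicals; premise 3)}\\
&\text{(iii)} && C(m) && \text{(premise 6)}\\
&\text{(iv)} && \lnot C(b) && \text{(premises 4 and 5)}\\
&\text{(v)} && \therefore\ m \neq b && \text{(from (i)--(iv), by reductio)}
\end{align*}
```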
Like how a description of subatomic particles can tell you what's going on in chemistry, but you don't have to know anything about that deep physics to talk about chemistry.
I'm arguing against the position that equates the two (identity theory). My argument doesn't work against dualists.
Except no one thinks chemistry is identical to physics or subatomic particles. I'm arguing strictly against people who claim mental states are identical to brain states. If that's the case, then two people who know nothing about brains shouldn't be able to meaningfully communicate about their own mental states. But of course we know they can.
Cool. If mental states merely represent brain states, it isn’t contradictory for mental states to have a logical vocabulary while still allowing brain states their scientific vocabulary. Besides, nobody, not even scientists, thinks in brain state vocabulary terms, so either what we consider thinking isn’t real, or another vocabulary is justified because it is.
Again, my argument is strictly against brain state=mental state.
If mental states are representations of brain states, A) what does that mean, exactly, and B) don't you run into the same problem? If mental states represent brain states, then two people who know nothing about brains shouldn't be able to meaningfully talk about representations of brain states. But they can, because even if they don't know anything about brains, they certainly know about their own mental states. So, if they can talk about their mental states without knowing anything about brains, and mental states are representations of brain states, they're talking about representations of brain states without knowing anything about brain states. How is that possible? Can you give me an example of two people who know nothing about X being able to meaningfully talk about representations of X?
There is a definite scientific vocabulary when it comes to brains: neurons, synapses, receptors, action potentials, etc.
Ok, how about we say mental states are conditioned by brain states. That way, we can talk all day about the one, without having to know anything at all about the other.
Quoting RogueAI
Of course. But I said we don’t think in brain state terms. When I tell you all about what I had for supper, not once do I need to mention how many neurotransmitters I used. Test equipment may tell you, but if I come to your house with one attached to my head, I doubt you’ll care much about what I wanted to tell you anyway.
"i think most people would deny 3. you can communicate about clark kent without communicating about superman"
And I replied that Clark Kent is not identical to Superman. They're the same person, but there are obvious differences between the two.
Yes. I like to say that "Mind is what the Brain does" --- its function. Just as the function of your computer is to process input information, so you can talk about that meaningful information in plain English, without using the technical computer code that does the actual processing.
A similar analogy is used by cognitive psychologist Don Hoffman in his book, The Case Against Reality. He doesn't deny the underlying coded reality, but says that we evolved to think in terms of metaphorical symbols (concepts that he calls "Icons"), rather than in terms of Neurology. That helps me to clarify the old Brain/Mind conundrum. :smile:
Underlying Reality : http://bothandblog6.enformationism.info/page21.html
PS__In response to the OP, you could say that Mind states are analogous to Brain states.
Nobody can communicate about anything without shared vocabulary; this is a red herring, and your whole argument depends upon it. Further, it is false to claim that brain state vocabulary must be scientific: we are talking about it, and neither of us is using strictly scientific vocabulary. Lastly, even if scientific vocabulary were the only vocabulary for brain states, it doesn't prevent communication; one would simply have to relay the meaning of the vocabulary being used.
I'm afraid your argument is only clever semantics and structure and falls short of its goal.
No one except basically everyone. If you model the physics of a system of particles that bind together into atoms and molecules that then interact with each other, you end up modelling chemical reactions for free. But, you could also just talk about the chemical reactions, without having to talk about that physics stuff at all. One reduces to the other, but not vice versa.
You think the best move is to deny (2)? Brain state vocabulary isn't scientific? What is it then?
Chemistry is not identical to physics, that's absurd.
[i]i·den·ti·cal
/īˈden(t)ək(ə)l/
adjective
1. similar in every detail; exactly alike. "four girls in identical green outfits"
2. LOGIC•MATHEMATICS: expressing an identity. "an identical proposition"[/i]
It's just normal vocabulary; nothing about the vocabulary used for brain states is special. It's just words, with meanings, that some people know and some people do not, and you communicate by using the shared vocabulary in order to clarify the meaning of the vocabulary that is not shared.
...I've officially used the word “vocabulary” more times in a single day than I've ever used it....
Not really:
[i]Global CNS correction in a large brain model of human alpha-mannosidosis by intravascular gene therapy
Intravascular injection of certain adeno-associated virus vector serotypes can cross the blood–brain barrier to deliver a gene into the CNS. However, gene distribution has been much more limited within the brains of large animals compared to rodents, rendering this approach suboptimal for treatment of the global brain lesions present in most human neurogenetic diseases. The most commonly used serotype in animal and human studies is, which also has the property of being transported via axonal pathways to distal neurons.[/i]
I know it's not exactly neural correlate stuff, but that stuff is just as dense. That is obviously very scientific language by any standard use of the word "scientific".
No, but every chemical state is identical to some physical state. But not the other way around: not every physical state is identical to some chemical state.
If a=b, then b=a
The flaw in the argument is a suppressed premise: what kind of communication is the second kind supposed to be?
If brain state vocabulary is "scientific", it needs to be said what class of vocabulary is instead employed to talk about mental states. Is it merely "unscientific" (a vague contrary claim)? The argument needs to clarify in what way such communication could be meaningful.
Scientific vocabulary is meaningful in its pragmatic application. If we talk about the world generally as a machine, and thus the brain as a specific kind of mechanism, then the pragmatic effect of this form of language is that - implicitly - we should be able to build this damn thing.
We are viewing the conscious brain as an example of technology - natural technology - that we can thus hope to replicate once we put what it is and what it does into the appropriate engineering language.
So "scientific" vocabulary isn't neutral. It has meaning in terms of what it allows us to build. It is all about learning to see reality as a machine (a closed system of material and efficient causes).
Of course, science is a broad enough church that it doesn't have to reduce absolutely everything to mechanism. And the aim can be also to regulate flows in the world as a substitute to making a machine. Engineering covers that gamut.
But you see the issue. Brain states language is itself a reflection of a particular reason for describing nature. It aims to extract a blueprint of a machine.
Then where does mental state vocabulary fit in to the picture? In what sense is it meaningful to someone or some community of thinkers? What is the larger goal in play?
To be commensurate, the two linguistic communities would have to share the same goal. And they are going to be talking at cross-purposes to the degree that they don't. And in both cases, they may be talking meaningfully (ie: pragmatically), but also, they are both just "talking". They are both modelling the noumenal from within their own systems of phenomenology.
Quoting RogueAI
The conclusion can't be so definite as "mental state vocabulary" is too ill-defined here. What makes it meaningful?
[Note that a social constructionist - as a scientist - would have plenty to say about how humans do use "mental state" language as a pragmatic means of regulating their (social) environment. We talk about our emotions all the time - love, jealousy, boredom, happiness. But are these "feelings" or "culturally meaningful rationalisations"? Even a phenomenologist would examine "feelings of love" and find a whole lot of unreferenced physiological responses that seem fairly aligned with a counter view of the brain and body as "a machine".]
It's still just words. It's not restrictive (which is what you need it to be for your argument to work), because anyone can use shared vocabulary to add those “scientific” words to the shared vocabulary.
There is nothing about scientific vocabulary that isn't also true of baseball vocabulary, or music vocabulary, or texting vocabulary (“lol”, “lmao”, etc.) as far as communication goes.
Does not sharing baseball vocabulary with someone prevent anyone from communicating about it? No, the person just goes “what's a home run?”, gets a description, and moves on with the discussion/communication.
That's just restating supervenience as a claim. The claim only holds if "states" actually exist in the world rather than in the scientific imagination.
The language of states - as part of the language of machines - is certainly a pragmatically useful way of looking at reality. If we frame the facts that way, we have an engineering blueprint we can deal with.
But "states" is a pragmatic construct. And the reality we encounter often doesn't fit that construct so well. The map ain't the territory. And so claims of supervenience must be regarded as having a logical force only within a particular reality-modelling paradigm.
Imagine there are two people from 20,000 years ago who know nothing about brains. One of them stubs his toe and complains about the pain. The other expresses sympathy. Meaningful communication about mental states was exchanged. Information about those mental states was exchanged. Now, if mental states are identical to brain states (alike in every way), doesn't that entail that those two ancient people were talking about their brains? And isn't that an absurdity?
That 1 is false does not follow. E.g., 5 could be false, or 6 could be false.
Bob and Sheila are two cavepeople from 20,000 years ago. I have no problem claiming that Bob and Sheila from 20,000 years ago can talk about their mental states. But does a physicalist want to claim that anyone 20,000 years ago used "brain state vocabulary"? Isn't that prima facie absurd? In fact, I can just make Bob and Sheila two people from the Blue Lagoon who don't even know they have brains.
I don't think (6) can be false. What would prevent Bob and Sheila from talking about their mental states in a meaningful way? Humans have been doing that since long before anything about brains was known.
Ok, ignoring the fact you haven't refuted my counterpoints: why would that be absurd? When they talk about what they see, they are talking about the colour spectrum, retinae, light particles... any number of things they have no knowledge about yet are still talking about. They just don't know that they are talking about those things because they lack the words/concepts. Same with mental and brain states. They don't even need to know they have brains to talk about mental or brain states.
???
I don't know, let's find out how absurd this is. Can Bob and Sheila communicate their mental states? Donning my physicalist hat: if you say yes, then it's not absurd to say 5 is false. If you say no, you're ipso facto saying 6 is false.
Don't don any hat then. Pretend you're agnostic. Doesn't it sound absurd to claim that two people who don't even know what a brain is or that they even have one are capable of talking about brain states? Hasn't the term "brain state" at that point lost all meaning?
Nope. People may have no idea that sound is vibration of a medium such as air (i.e., that sound is an "air state"), but still be able to talk about sounds. People may have no idea that mental states are brain states but still be able to talk about brain states in the same fashion.
ETA:
To meaningfully talk about states, all that's required is that you be able to know there are states, and be able to distinguish states somehow. You don't need knowledge of what the substrate or manifestation of the state is; just some sort of identity of the states would do.
Quoting RogueAI
Their communication was not about mental states; it was not about sympathy or pain. Their communication reflects their mental states; the words used in their communication express the mental states that cause them, but are not about the mental states that cause them. I can use words to express a mental state without talking about the mental state.
Examples of scientific language would be talk of neurotransmitters, synaptic gaps, certain chemicals, etc. Clearly it is vocabulary that is in the domain of science. Two neuroscientists talking about a paper in a journal are almost going to be speaking another language, as far as non-scientists are concerned.
Mental communication is meaningful if accurate information about mental states has been exchanged.
That's all fine, but I don't see the connection to the argument. All you have to grant me is that brain state language is scientific. It clearly is. Papers about brain states in neuroscience journals are almost impenetrable to me, the scientific jargon is so dense.
I don't see the issue. You haven't refuted that brain state language is scientific. You seem to be saying why it's scientific. I'm just claiming it IS scientific.
If someone tells you they're in pain (a mental state word, obviously), they've communicated information to you. You know more now than you did before they talked to you. That's meaningful communication.
I don't think goals have anything to do with this. Maybe I'm missing your point.
— RogueAI
The fact that information was exchanged and knowledge was acquired. "I am in pain" gives you information about me. You know more about me than you did before. That makes it meaningful. If it was just gibberish, you wouldn't have any new knowledge.
That's not what sound is.
1. vibrations that travel through the air or another medium and can be heard when they reach a person's or animal's ear.
You left off a crucial part. Sound is heard. If we're being accurate, two people from long ago aren't talking about sounds (vibrations in the air); they're talking about what they heard. The word for "sound" they're using isn't like ours, because ours refers to "vibrations in the air" while their word would strictly refer to the mental state of hearing a noise (or maybe Zeus throwing lightning bolts or something).
It's hardly crucial, as this is a red herring. The sounds we're talking about are heard; Bob hears a sound and describes it to Sheila. Neither Bob nor Sheila are required to know that these things they hear are vibrations carried to their ears over a medium to talk about the sounds; all they need is to be able to sense and distinguish states of this sort. They can both do this because they can hear sounds.
But since you started a game of argument-by-dictionary:
Quoting sound (wikipedia)
Quoting RogueAI
If we're being accurate, they experience "objective" states; they talk at all because they are socially minded, and being social, they notice that these things they hear are things other people around them can hear but only if the situation is just right (given theory of mind; also they don't need a word for objective here, which is why I quote it). The nature of this thing they both hear is what they are talking about. The nature of that thing is physical sound, but they don't have to know that this is the nature to meaningfully talk about it... they just have to know there are states that they can sense and distinguish.
Thanks.
Let me start simple. If X=Y then talk of X is talk of Y, even if the words are in different languages. If a word refers to X, and X=Y, that word also refers to Y. How can it not if X is identical to Y? Agreed? For example, if I'm talking about bachelors, I'm necessarily also talking about unmarried men, right?
So the claim the identity theory of mind adherent makes is mental states are identical to brain states. They're one and the same. Equal to each other in every way. If that's true, then talk that refers to mental states is necessarily talk that refers to brain states.
Tell me where you disagree so far.
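The principle being invoked here is the indiscernibility of identicals (Leibniz's law). As a minimal sketch, it can be stated in Lean, with a hypothetical predicate `TalkAbout` standing in for "is meaningfully talked about":

```lean
-- Leibniz's law: if x = y, then whatever holds of x holds of y.
-- `TalkAbout` is a placeholder predicate, not a real library definition.
example {α : Type} (TalkAbout : α → Prop) (x y : α)
    (h : x = y) (hx : TalkAbout x) : TalkAbout y :=
  h ▸ hx
```

The argument in the thread is just this schema with x as mental states, y as brain states, and TalkAbout as the communication predicate.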
You're making a conflation. You have a word in mind (call it "sound1") which refers simply to vibrations in the air. The people from long ago don't have that word because they don't even know air can vibrate. When they talk about sound, they're using their word (call it "sound2") which refers simply to what they're hearing, which is why hearing is so important to the definition.
The problem with your objection is that sound1 is not identical to sound2. In other words, the vibrations in the air are not identical to the mental state of hearing. You're claiming they are, through the conflation I talked about.
Yes, that's what sound is specifically in the domain of physics. Sound as it's defined in the broad sense of the word includes "hearing". It's an incomplete definition if it doesn't talk about hearing. That's the essence of sound: you can hear it.
mechanical radiant energy that is transmitted by longitudinal pressure waves in a material medium (such as air) and is the objective cause of hearing
https://www.merriam-webster.com/dictionary/sound#:~:text=Log%20In-,Definition%20of%20sound,the%20objective%20cause%20of%20hearing
It's a big deal to me as it was the central issue I was dealing with when I first ventured into mind science as a youth. :grin:
Artificial intelligence was the first great disappointment. The guys were only talking about machines it turned out. And then brain imaging promised to be the new revolution. Consciousness would be put on the neuroscientific agenda at last as a concord had been agreed with philosophy of mind.
We would all be starting off in humble fashion by merely identifying the neural correlates of consciousness (NCC) - a dualistic approach where the material explanation in terms of a physical state would be married to the reportable phenomenology produced by a mental state.
But that great and expensive exercise produced remarkably little directly. It just brought home how muddled people were in their conventional Cartesian division of reality into physical and mental states. All that could result was a doubling down on the underlying dualistic incompatibility of descriptive languages.
So science did have to question what was "scientific". And for a start, that meant no longer speaking of the brain in terms of machinery with physical states. It had to be some kind of embodied information process - but information processing is another domain of jargon founded on a mechanical "state-based" ontology. Neuroscience couldn't make progress by swapping out a biological mechanism and wheeling in a computational one. That still just left it chasing the phantom of a purely mechanical explanation.
Long story short, you now have generic models of the "mind~brain" in terms of Bayesian Brain theory and the enactive turn within cognitive psychology, not to mention social constructionism being brought into play to account for the extra features of the human mind~brain system in particular.
So the science here is a shifting beast.
Neuroscience was doing a god-awful job in the 1970s as it was basically a branch of medical science and so absolutely wedded to a mechanist ontology. To fix your schizophrenia, the best theory might be to kick your head hard enough that maybe you might repair it, like thumping a TV (back when TVs had vacuum tubes and loose connections, so it could work).
But does modern neuroscience still try to explain the mind - as we humans like to say we experience it - as "talk of neurotransmitters, synaptic gaps, certain chemicals, etc"?
I would certainly question that. A big picture scientific account would use the jargon appropriate to the whole new level of theorising that has emerged over the past 20 years or so.
Quoting RogueAI
Exactly. This would be defining "mental vocabulary" in terms of what works in ordinary social and culturally appropriate settings. It is a way of co-ordinating and regulating "other minds" within a shared "mental space" of pragmatically social meaning.
The problem lies with the extent to which this folk psychology - very useful in the business of existing as a social creature - gets reified as some kind of deep philosophical wisdom. I have "a mind". I can see you have "a mind". Maybe a cockroach has "a mind". Maybe the Cosmos too? Maybe "mind" is another substantial property of reality - a soul stuff - like Descartes suggested.
So what contrasts with the scientific vocabulary? Is it a folk psychology vocabulary? A religious vocabulary? A mystical vocabulary? Where does all this mind talk come from?
Good anthropological studies show just how culturally specific our own philosophically-embedded mind talk actually is. The Ancient Greeks played a large hand in inventing it as it had a pragmatic use - it gave birth to the cultural notion of a person as a rational individual who could thus play a full part in a rationally-organised society. It was a way of thinking with powerful results in terms of evolving human social structure. A seed was planted that really took off with the Enlightenment and Scientific Revolution (and which, with its flowering, engendered its own counter-revolution of Romanticism and Idealism).
So mind talk also has its instructive history. It has its pragmatic uses and has continued to evolve to suit the expression of them.
A greater compatibility between the two sources of language might be a good thing.
But for me, the irony there is that mind science has to move away from the machine image and become more used to discussing the brain in properly organic terms, while folk psychology also needs to make its shift away from the "dualistic substance" shtick that mostly just ends up aping the errors of an overly-mechanical model of reality.
Even to oppose the subjective to the objective means you have to buy into the existence of the objective (and vice versa).
The distinction may be pragmatically useful. People seem to like that sharp separation between the world of machines and the realm of the mind. But the mind~brain question is about whether this distinction is real or merely just our pragmatic social model of the reality.
Neuroscience has pressed on to deliver answers I am much more comfortable with these days. Dropping talk of "states" is part of that change. Or rather, always framing the word states in quotes to acknowledge the presumptions we have put into play just there.
So let's take two people (Jack and Jill) who were marooned on a desert island when they were kids. They know essentially nothing about the world. They don't even know they have a brain in their skull. So Jack stubs his toe and tells Jill he's in pain. Jack's talk of pain is a reference to a mental state and therefore is also a reference to a brain state. So there are three problems with this:
1. Epistemological: the identity theory of mind adherent has to claim that Jack, in the scenario, is referring to a brain state. But Jack doesn't know anything about brains, let alone brain states. Does it make sense to say that someone who doesn't know they're referring to brain states is really referring to brain states when they talk about being in pain? If that's true, shouldn't the person be aware they're communicating all this brain-state information to another person when they talk of pain?
2. Semantic: Is it possible for a person who doesn't even have the concept of a brain and what it does, let alone a word for it, to refer to brain states? That seems like an absurdity.
3. Informational: When Jack tells Jill he's in pain, he's giving her information. Her store of knowledge increases. Jill knows a new fact about Jack: he's in pain. The identity theory of mind entails that information about brain states was exchanged when Jack told Jill he's in pain.
And that's it.
No my friend, you are.
Sure. But the fact that InPitzotl is using the word sound1 to describe what the invisible stuff around Bob and Sheila does has no bearing on Bob or Sheila.
Sure. But that's the point. Bob and Sheila not only don't know sound1; they don't have to to talk about sound1 states. All they have to do to meaningfully talk about sound1 states is know that there are states, and be able to distinguish them. Bob and Sheila have a different, more vague theory:
...of sound2. They just hear stuff. So, sure.
But Bob and Sheila recognize that there's a "thing" they are hearing... a state in the world such that if it's of moderate volume and a person isn't deaf and is in the same area, the other person would also hear it. But that state of the world they are hearing is still sound1's; it's vibrations in air. They just don't know it. But they don't have to to talk about sound2's.
There's no problem on my end. You're trying to allegedly "correct" me that Bob doesn't know what a sound1 is. But that's the same thing I'm saying... Bob doesn't know what a sound1 is. You're then saying that Bob only meaningfully talks about sound2's. But that's the same thing I'm saying... Bob meaningfully talks about sound2's. So the confusion I'm afraid is all on your end.
Bob and Sheila cannot have a conversation (with spoken words, which it's reasonable to presume is the substrate of choice 20,000 years ago) unless they presume that they can make sounds that the other one hears. This presumption is equivalent to assuming that there's some sort of world state associated with the sounds they hear.
I'm claiming exactly the opposite of this. I'm claiming that Bob and Sheila do not have to know about sound1 states to talk about them; because sound2 states are sound1 states. You're overly subjectifying sound2; you might be forgetting that Bob and Sheila communicate using sounds. By simply assuming they can communicate this way, they are assuming sounds are shared world states. Which, they are; they are, in fact, sound1's... vibrations in media.
I don’t see why it doesn’t make sense. If we assume mind-brain state identity, then using either mind state vocabulary or brain state vocabulary is making reference to the same thing, even if unintentionally. Same goes for your points 2 and 3. And I still don’t see how different vocabularies entails different states.
I can talk about a painting without knowing anything about the chemistry of paint or canvasses, but paintings are just paint on canvasses.
Understood. But really...I’m not sure there should even be one.
Do you think “the mind is what the brain does” to be just a somewhat lame effort to eliminate epiphenomenalism? Or does the proposition, in effect, justify epiphenomenalism? What about the possibility that the immeasurable complexity of the human brain is sufficient in itself, to permit the reality of some lawfully transcendent functionality? If that is the case, then reason and feelings are reducible to mind mechanisms, whereas, if this is not the case, reason and feelings, along with everything else, are reducible only to brain mechanisms.
Never was a fan of chicken/egg dichotomies, myself. But it seems we get ourselves into ‘em almost as a matter of course regardless. Probably because reasonable answers are needed even when facts are missing. Follow the bouncing ball........
No. I think that defining the Mind as the Function of the Brain is a pragmatic explanation. Epiphenomenalism is a kind of Property Dualism, while the functional definition can be interpreted as a Substance Monism, as proposed in my Enformationism thesis. Mind is an emergent holistic property of Brain, not a sub-system of the neural net. In my view, both the material Brain and the immaterial Mind are forms of Generic Information. But that's an emergent concept in Science, not yet orthodox doctrine. :cool:
Function : an activity or purpose natural to or intended for a person or thing.
___Wiki
NOTE : a flatworm can perform its basic functions without a brain. But humans are not automatons, precisely because their over-sized brains can choose between options, based on rational projection of future consequences. Choice is a purposeful function.
Emergent Functionalism : In philosophy, systems theory, science, and art, emergence occurs when an entity is observed to have properties its parts do not have on their own.
https://en.wikipedia.org/wiki/Emergence
Substance Monism : https://www.google.com/search?client=firefox-b-1-d&q=substance+monism+spinoza
Epiphenomenalism : the view that mental events are caused by physical events in the brain, but have no effects upon any physical events.
https://plato.stanford.edu/entries/epiphenomenalism/
NOTE : If the Mind has no causal effect on the physical world, how can your ideas and intentions have any effect in the outside world? Are you (your mental Self) a robot driven by automatic mechanical processes? Or do you have some freedom to choose your actions? I know, the question is moot, but I choose the freewill option. https://www.sciencemag.org/news/2019/03/philosophers-and-neuroscientists-join-forces-see-whether-science-can-solve-mystery-free
Brain/Mind Paradox :
Empirical Science treats the human mind as an integral function of the physical brain. But we intuitively put the mind in a different category. That's why it has traditionally been associated with a non-physical Soul, which requires a dualistic notion of humanity. The Enformationism paradigm though, is ultimately monistic, viewing Information as the single "substance" of reality. But that primordial stuff has two aspects : an active verb form, EnFormAction (energy), and a passive noun form, Information (embodied potential in material forms). The brain is enformed stuff, which converts stored Information (memory) into ideas, images, and feelings.
http://blog-glossary.enformationism.info/page10.html
Quoting Mww
Rather than defining Mind as a "transcendent" function of the Brain, I'd say it's an Emergent Function. Emergence is a natural process of Phase Change. :nerd:
PS__I agree that the Mind/Body conundrum ("the hard problem") is only a problem for the scientific method of Reductionism, while the philosophical method of Holism can easily explain emergent properties as potentially inherent in the parts, due to their latent Information.
Physicalism (or maybe speculative realism or whatever) and qualia do not contradict, rather neither entails the other, hence the explanatory gap.
More like a sort of partition than a paradox as such, and sometimes a source of substance dualism. The gap is also related to solipsism.
Simply situating qualia (or whatever aspects of mind) as basic/fundamental/irreducible does not explain mind, but rather avoids explanation, thereby disregarding some things we already do know about mind.
Not sure how holistic and emergent can coexist simultaneously, and not sure how either is possible for humans if not for neural physics.
I think I’ll remain satisfied with “mind” being a concept of explanatory convenience, the purpose being to serve as the unconditioned from which logically consistent metaphysical cognitive theories arise. Hardly sufficient for the hard scientist, I know, but I’m not one, so I can get away with it. Something as ubiquitous and seemingly authoritative as human thought doesn’t profit enough by knowing its source over merely granting its efficacious reality.
Holism is the concept that a single functional system (a whole thing) is "more than the sum of its parts". The "more" is an emergent property of the whole system that is not characteristic of any of its parts. I'm not sure what you meant by "coexist", but a Whole System would not exist if not for the phenomenon of Emergence. That's why I say "the Mind is an emergent property of the whole Brain".
What we call "Mind" is essentially the Consciousness function of the Neural Net. But no single node in that net is conscious. The nodes process bits & pieces of Information, but it takes a whole integrated system to become aware of the output from those processes. So yes, the neurons are necessary, but not sufficient, to produce the emergent function of Consciousness. :nerd:
Emergent Holism : [i]Holism in science, or holistic science, is an approach to research that emphasizes the study of complex systems. Systems are approached as coherent wholes whose component parts are best understood in context and in relation to one another and to the whole. . . .
David Deutsch calls holism anti-reductionist, referring to the idea that the only legitimate way to think about science is as a series of emergent, or higher-level, phenomena.[/i]
https://en.wikipedia.org/wiki/Holism_in_science
Emergent Properties : Examples of emergent properties include cities, the brain, ant colonies and complex chemical systems.
https://sciencing.com/emergent-properties-8232868.html
NOTE : An emergent property is a property which a complex, yet integrated, system has, but which the individual members do not have.
Integrated Information Theory : http://www.scholarpedia.org/article/Integrated_information_theory
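The "necessary but not sufficient" point about nodes can be illustrated with a classic toy result from neural-network theory (a sketch of the logical point, not a model of the brain): no single linear-threshold unit can compute XOR, yet a small network of such units can, so the capacity belongs to the whole network and not to any one part. The weights below are hand-picked for illustration.

```python
# Illustration of an "emergent" capacity: no single linear-threshold
# unit can compute XOR (it is not linearly separable), but a two-layer
# network of identical units can. The ability lives in the wiring of
# the whole, not in any individual part.

def unit(inputs, weights, bias):
    """A single linear-threshold 'neuron': fires (1) iff weighted sum exceeds bias."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) > bias else 0

def xor_net(a, b):
    """Two hidden units feeding one output unit: (a OR b) AND NOT (a AND b) = XOR."""
    h_or  = unit((a, b), (1, 1), 0.5)         # fires if a OR b
    h_and = unit((a, b), (1, 1), 1.5)         # fires if a AND b
    return unit((h_or, h_and), (1, -1), 0.5)  # fires if OR fired but AND did not

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))  # prints the XOR truth table
```

Each `unit` taken alone is a trivial threshold device; only the three wired together exhibit the XOR behavior, which is the structural analogue of the holist's "more than the sum of its parts."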
Quoting Mww
Yes. "Mind" is a term of convenience to label the non-physical Function of the Brain. Likewise, "Word Processor" is a convenient concept to stand as a symbol for all the complicated electronics going on inside your computer. In both cases, the Process is not a single physical object, but a metaphysical sequence of physical events. But some of us like to know what's going on inside that black box. :joke:
Thinking back to elementary school, this does not even hold for mathematics: 3+4 is the question and 7 is the answer.
To label a non-physical function of the brain presupposes there is one. Might be more accurate to say mind is a term of convenience to label a yet unknown physical function of the brain. Or, better yet, in order not to be trapped by objectivist physical determinism on the one hand, and subjectivist point-of-use abstractions on the other, just leave it at convenience. If nothing else, it certainly is exactly that, even if convenient easily translates to uninformative.
I think of function as an empirical condition....it is the function of this to do that, a necessary product of cause and effect. I’m not sure I’d think of consciousness as a function. What does consciousness actually effect, and how can anything be said about its effects, when its cause is itself unknown? And labeling neural nets as source doesn’t necessarily implicate neural nets as either cause or effect. Then of course, we’ve got mind as consciousness as emergent function of brain, which seems altogether overly-complicated, for it appears to make mind practically synonymous with consciousness.
Anyway, interesting topic, so......thanks for that.
The primary function (purpose) of your automobile is "transportation", which is a process, not an object. Can you put "transportation" or "consciousness" under a microscope to see its component parts? As Hume pointed out, the connection between Cause and Effect is an attribution, an association, and a label of convenience for an invisible link or relationship between things or events that have a history of occurring together. His analysis raised the Problem of Induction, which is the method of empirical Science. In practice though, most pragmatic scientists ignore Hume's quibbles, and take the causal connection for granted. Yet technically, the "invisible link" is an imaginary concept in the mind (consciousness) of the observer --- it's a belief, not an empirical fact.
Problem of Induction : https://plato.stanford.edu/entries/induction-problem/
Quoting Mww
Good question. Except for spoon-bending psychics, Consciousness doesn't seem to have any empirical effects on the material world. But you know from personal experience that it does affect your immaterial behavior. Your intentional activities are a function of your awareness of the environment, and of mental projections into the future, to predict possible consequences of specific behaviors. Can you say "conscious behavior" and "subconscious reflexes"?
Again, we moderns attribute conscious behavior to the brain. But ancient Egyptians imagined that the heart was the seat (cause) of the conscious Soul. Some fringe thinkers argue that the source (cause) of consciousness is out-there in the universe (Panpsychism). So, it's a matter of opinion as to the Cause of Consciousness. FWIW, my opinion is that Awareness of the external & internal milieu is a holistic function of the whole brain and body.
Function :
[i]1. an activity or purpose natural to or intended for a person or thing.
2. Mathematics --- a relationship or expression involving one or more variables.[/i]
Behavior : the way in which a natural phenomenon or a machine works or functions.
Quoting Mww
Do you think it's less complicated to identify "world-changing" Consciousness with a lump of Brain Matter? Isn't that a bit simplistic? :joke:
Mind :
1. the element of a person that enables them to be aware of the world and their experiences, to think, and to feel; the faculty of consciousness and thought.
Metaphysical :
1. In modern philosophical terminology, metaphysics refers to the studies of what cannot be reached through objective studies of material reality. ... Metaphysics might include the study of the nature of the human mind, the definition and meaning of existence, or the nature of space, time, and/or causality.
PS__By "world-changing" I refer to the sudden acceleration of Evolution after the emergence of Conscious Organisms from non-conscious Matter.
Yes. Similarly, music is but acoustic waves... and supposedly it takes some specialized knowledge to speak competently about acoustics and waves. And yet two average folks can still talk about the music they like or dislike without knowing much about acoustic waves...