You are viewing the historical archive of The Philosophy Forum.

Do Neural Codes Signify Conscious Content?

Dfpolis March 03, 2020 at 22:40 9675 views 48 comments
For those of us who are not physicalists, it is fairly uncontroversial that conscious perception involves awareness of neurally encoded contents. That is certainly how I think of it. Yet, this simple schema turns out to be highly problematic.

In Descartes' Error neurophysiologist Antonio Damasio argues that our knowledge of the external world started as neural representations of body state and evolved into representations of the external world as the source of changes in our body state:
Antonio Damasio: ...to ensure body survival as effectively as possible, nature, I suggest, stumbled on a highly effective solution: representing the outside world in terms of the modifications it causes in the body proper, that is, representing the environment by modifying the primordial representations of the body proper whenever an interaction between organism and environment takes place. (p. 230)

I see three problems with this otherwise plausible hypothesis: (1) It requires one neural state to encode multiple concepts. (2) There seems to be no mechanism by which this "solution" could have evolved. (3) Neural states do not represent as other signs do.

First, on this view, one physical state (the object's modification of our body/neural state) represents two intelligible states (the object sensed and its effect on us). For example, in seeing an apple our retinal state is modified, so the same neural signal represents both the apple and the fact that our retina has been modified. Consequently, there is no neural basis for distinguishing data on the object from data on the subject.
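The underdetermination being claimed here can be made concrete with a toy sketch (the firing rates and thresholds below are invented for illustration, not drawn from any real model): both the "object" reading and the "body-state" reading are functions of the same encoded value, so nothing in the signal itself selects between them.

```python
# Toy sketch (not the author's model): one neural state, two readings.
# A single firing-rate value is all a downstream decoder receives; both
# interpretations below depend only on that value, so the signal alone
# cannot distinguish data on the object from data on the subject.

def encode(apple_present: bool) -> float:
    """Retinal response: firing rate (Hz) caused by the stimulus."""
    return 40.0 if apple_present else 5.0  # hypothetical rates

def read_as_object(rate: float) -> str:
    """Interpretation 1: data about the object."""
    return "red apple seen" if rate > 20.0 else "no apple"

def read_as_body_state(rate: float) -> str:
    """Interpretation 2: data about the subject's own state."""
    return "retina modified" if rate > 20.0 else "retina at baseline"

signal = encode(apple_present=True)
# The two readings are locked together: whenever one fires, so does the
# other, because both are functions of the same `signal`.
assert (read_as_object(signal) == "red apple seen") == \
       (read_as_body_state(signal) == "retina modified")
```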

The reason neural signals can provide no data by which we can distinguish subject and object is the identity of action and passion first noted by Aristotle: the subject sensing the object is identically the object being sensed by the subject. Given this ontological identity, there is no way to pry apart data on the subject as sensing from data on the object as sensed.

Clearly, sensory signals can encode different notes of intelligibility. For example, our complex perception of a ball encodes its matter and form differently. We can see and feel its sphericity, while squeezing it supports the notion of rubbery material. That is not the case with conceptualizing the difference between sensed objects and the concomitant changes in body state. There is no difference, even in principle, between a neural message saying we are seeing a red apple and one saying we are seeing (having our bodily state modified by) a red apple.

I am not arguing for solipsism. I take as a given that we are conscious of objects other than ourselves. Rather than questioning this datum, I am trying to understand the dynamics making it possible. It seems to me that grasping this difference requires a direct intuition of the object as an object, as other -- and this, or something functionally equivalent, is missing from our model.

Second, while this is not a problem for behavioral evolution, it is a problem for understanding intellection. Effective behavior can evolve independently of whether an organism responds to its internal state or to externalities. What a neural signal encodes is immaterial as long as the response to it is biologically effective (evolutionarily fit). However, if consciousness of objects is solely due to awareness of neurally encoded content, we can have no basis for thinking objects are distinct from ourselves. To do so we must grasp an intelligible difference between our self and the object, and there is none in the neural signal.

This problem is similar to that noted by Alvin Plantinga in his evolutionary argument against naturalism. Both note that what gives evolution its traction, the selection of successful structure and behavior, provides no traction in selecting certain features of mental experience. In Plantinga’s case, it provides no traction in selecting true over false beliefs. Here, it provides no traction in distinguishing self-information from information on the other.

Third, the idea that neural impulses act as signs glosses over and obscures the dynamics of sensory awareness. Signs are means of knowing. Signifiers are only potential signs unless they actually evoke a thought. Smoke, though a potential sign of fire, is operative only when used to indicate fire. Since knowing is relational, so are signs. In his Ars Logica the Portuguese Dominican John of St. Thomas (John Poinsot, 1589-1644) distinguishes formal and instrumental signs. An instrumental sign requires that we understand the sign’s own nature before it can signify. A formal sign does not. Instrumental signs are things like smoke, writing, road signs, binary codes, rancid smells, etc. In order to grasp what an instrumental sign signifies we must first grasp what the sign is in itself. For smoke to signify fire, we must grasp that it is smoke, and not dust or a cloud. For writing to signify, we must first discern the shape of letters or pictographs. If we cannot do so, such signs fail to convey meaning, to act as signs.

It is common (e.g. in representational theories of mind) to confuse mental signs such as ideas and judgements with instrumental signs such as binary codes, but they are different. Ideas are formal signs. We need not grasp the nature of a formal sign for it to signify. We do not need to grasp that the concept apple is an idea before it can signify apples. Rather, understanding that we know apples, we see that we must be employing an instrument in knowing them. Thus, we come, retrospectively, to the notion of an apple idea. The whole reality of a formal sign, all that it does and can do, is signifying.

Neurally encoded data works in neither of these ways. Neurons encode data in their firing rates, yet we grasp their encoded contents without the faintest idea of their firing rates, or indeed of anything about our brain state. In consciousness, neither our brain state in general nor our neural firing rates in particular function as ordinary (instrumental) signs. Since conscious awareness of contents does not involve the proprioception and interpretation of brain states, they do not function as instrumental signs in providing content to consciousness.
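The rate-coding point can be illustrated with a minimal sketch (the tuning values are invented for the example): a winner-take-all readout recovers a content label from firing rates, and the recovered label carries no trace of the rates that encoded it.

```python
# Toy winner-take-all readout over hypothetical firing rates (Hz).
# The decoded content is a label like "red"; nothing in that content
# refers to the 38 Hz that carried it -- grasping the content does not
# require grasping the code.
rates = {"red": 38.0, "green": 4.0, "blue": 6.0}  # invented tuning values

percept = max(rates, key=rates.get)  # channel with the highest rate wins

print(percept)  # -> red
```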

Neither are neural firing rates formal signs in consciousness. Remember that the entire reality of a formal sign, all that it does, is to signify. Firing neurons do more than signify, and when their firing rate is used to determine their meaning in third-person observation, they are not operating as formal signs. (If a neuroscientist were to observe the firing of a neuron and discern what it signified, its firing rate would be an instrumental sign.)

Despite not fitting into prior semiotic categories, neural firing rates do encode data. A great deal of neurophysiological work supports the conclusion that the data they encode provides us with, or at least supports, the contents of consciousness. So, they function as a kind of hybrid sign: one whose intrinsic nature need not be discerned for it to signify, but which, unlike a formal sign, does more than signify.

Like the problem of distinguishing self-data from object-data, this seems to intimate that we have a capacity to grasp intelligibility that is not fully modeled in our present understanding.

Comments (48)

christian2017 March 03, 2020 at 23:00 #388093
Reply to Dfpolis

I would argue that the ability to feel pain or pleasure is proof of either something divine (or a soul), or proof that the entire universe is one living organism and each of us, including bacteria, is just a subset, or should I say a cell, within that giant organism. Are you familiar with the notion of a collective conscience or collective soul? It's like the entire universe has phantom limb syndrome.
Galuchat March 04, 2020 at 13:44 #388241
Quoting Dfpolis
First...There is no difference, even in principle, between a neural message saying we are seeing a red apple and one saying we are seeing (having our bodily state modified by) a red apple.
It seems to me that grasping this difference requires a direct intuition of the object as an object, as other -- and this, or something functionally equivalent, is missing from our model.


Object and subject are an ontological unity, having epistemological distinctions.

From a Cognitive viewpoint:
A neural message is a function of sensation (stimulation-response).
1) Stimulation is exogenous and/or endogenous stimulus (sensory signal) transduction by receptors, causing a response.
2) Response is the propagation of action potentials in excitable cells.

Seeing a red apple is a function of perception (sensory interpretation), specifically: vision.

The brain processes sensation data, and the mind processes perception data (these are incommensurable levels of abstraction).

From an Ecological viewpoint:
Gibson, James Jerome. 1977. The Theory of Affordances. In R. Shaw & J. Bransford (eds.). Perceiving, Acting, and Knowing: Toward an Ecological Psychology. Hillsdale, NJ: Lawrence Erlbaum.

Quoting Dfpolis
Second...What a neural signal encodes is immaterial as long as the response to it is biologically effective (evolutionarily fit). However, if consciousness of objects is solely due to awareness of neurally encoded content, we can have no basis for thinking objects are distinct from ourselves. To do so we must grasp an intelligible difference between our self and the object, and there is none in the neural signal.


This is a function of self-awareness development, levels 2 & 3.
Rochat, Philippe. 2003. "Five Levels of Self-Awareness as They Unfold in Early Life". Consciousness and Cognition 12 (2003): 717–731.

Quoting Dfpolis
Third, the idea that neural impulses act as a signs glosses over and obscures the dynamics of sensory awareness.


Biosemioticians would classify a "neural impulse" as a signal type of sign.
Dfpolis March 04, 2020 at 15:15 #388261
Reply to christian2017 Thank you for your comment,

While I agree that we have an immaterial aspect that makes us subjects in the subject-object relation of knowing (a soul), the fact that each of us is a different subject, with unique experiences, makes it difficult for me to lend credence to the notion of a collective soul. I do think that there is an immaterial God, and that we can be aware of God via rational proof and direct, mystical experience.
Dfpolis March 04, 2020 at 15:20 #388263
Reply to Galuchat Thank you also for responding,

Still, I do not see that anything you said resolves the three issues I raised. Did I miss something?
Relativist March 04, 2020 at 17:43 #388300
Quoting Dfpolis
Like the problem of distinguishing self-data from object-data, this seems to intimate that we have a capacity to grasp intelligibility that is not fully modeled in our present understanding.

I agree with this, and suggest this may just mean we have a problematic paradigm. E.g. reference to "information" seems problematic, because information connotes meaning, and meaning entails (conscious) understanding - which seems circular, and it doesn't seem possible to ground these concepts in something physical. That doesn't prove mind is grounded in the nonphysical; it may just be an inapplicable paradigm.

Consciousness is that which mediates between stimulus and response. As such, we should consider the evolution of consciousness from the simplest (direct stimulus-response), to increasing complexity, and develop a paradigm that can be applied to the development of mediation processes. As far as I know, this has not been done.
Dfpolis March 04, 2020 at 19:30 #388337
Thanks for commenting.

Reply to Relativist Quoting Relativist
it doesn't seem possible to ground these concepts in something physical.


I agree. As I argued last year, I do not think that intentional (mental) realities can be reduced to physical realities.

Quoting Relativist
That doesn't prove mind is grounded in the nonphysical, it may just be an inapplicable paradigm.


"Physical" means the reality it calls to mind now. Its meaning may change over time (and has), but the present paradigms are based on our conceptual space as it now exists. Changing paradigms involves redefining our conceptual space, and a consequent redefinition of terms such as "physical" and "natural."

Quoting Relativist
Consciousness is that which mediates between stimulus and response.


This seems very behaviorist in conception and inadequate to the data of human mental experience.
Relativist March 04, 2020 at 20:49 #388389
Quoting Dfpolis
"Physical" means the reality it calls to mind now. Its meaning may change over time (and has), but the present paradigms are based on our conceptual space as it now exists. Changing paradigms involves redefining our conceptual space, and a consequent redefinition of terms such as "physical" and "natural."

I don't think it requires redefining "physical" and "natural"; it means reconsidering the nature of our thoughts. A visual image is something distinct from the object seen; it's a functionally accurate representation of the object. In general, our conceptual basis for a thought is based on the way things seem to be, but the seemings may be illusory. It seems as if a concept is a mental object, but when employed in a thought, it may be more accurate to describe it as a particular reaction, or memory of a reaction: process and feeling, rather than object.




christian2017 March 04, 2020 at 22:38 #388443
Reply to Dfpolis

I'm actually in the middle of something right now. I'll get back to you later. I've been drinking a little bit, so I can't reply with a quick answer. Thanks for the reply.
christian2017 March 05, 2020 at 08:16 #388597
Galuchat March 05, 2020 at 12:56 #388625
Quoting Dfpolis
...if consciousness of objects is solely due to awareness of neurally encoded content, we can have no basis for thinking objects are distinct from ourselves. To do so we must grasp an intelligible difference between our self and the object, and there is none in the neural signal.

Signals are not only transmitted from environment to body to mind, but also from mind to body, causing change in the environment. The capacity for motor coordination differentiates object (other) and self in the mind of a sentient being.

Quoting Relativist

E.g. reference to "information" seems problematic, because information connotes meaning, and meaning entails (conscious) understanding - which seems circular, and it doesn't seem possible to ground these concepts in something physical.

Communication (including: data, encoding, code, message, transmission, conveyance, reception, decoding, information) is a good analogy for the sensation process if a physical (as opposed to only semantic) type is acknowledged.
Relativist March 05, 2020 at 20:06 #388746
Quoting Galuchat
Communication (including: data, encoding, code, message, transmission, conveyance, reception, decoding, information) is a good analogy for the sensation process if a physical (as opposed to only semantic) type is acknowledged.

It's a useful analogy in some contexts, but it may not be the best analogy for analyzing the ontology of mind. For example, we aren't going to find a physical structure that corresponds to a packet of data (from perception) or of decomposable information (like the logical constructs that define a concept). That is not sufficient grounds to dismiss physicalism; it may just mean we need a different paradigm.
Galuchat March 05, 2020 at 21:53 #388790
Quoting Relativist
It's a useful analogy in some contexts, but it may not be the best analogy for analyzing the ontology of mind.


I think that mind is an integrated set of organism events which produce an individual's automatic and controlled acts: an open sub-system of (at least certain) organisms (e.g., those having a central nervous system). But the ontology of mind is off-topic.
Dfpolis March 05, 2020 at 22:12 #388799
Quoting Relativist
A visual image is something distinct from the object seen, it's a functionally accurate representation of the object.


While I tend to agree with this, it does not explain how we distinguish the object from the subject -- which is the problem I have.

Quoting Relativist
It seems as if a concept is a mental object, but when employed in a thought, it may more accurate to describe it as a particular reaction, or memory of a reaction: process and feeling, rather than object.


I think I agree. I would say that the concept apple, while often conceived of as a "thing" is simply the act of thinking of apples.
Dfpolis March 05, 2020 at 22:17 #388801
Quoting Galuchat
Signals are not only transmitted from environment to body to mind, but also from mind to body to environment. The capacity for motor coordination differentiates object (other) and self in the mind of a sentient being.


I agree, but how does this allow us to distinguish body states from external states?
Galuchat March 05, 2020 at 22:25 #388809
Reply to Dfpolis
Do you mean rather, how does this allow us to distinguish body states from the states of other objects?
Relativist March 05, 2020 at 22:59 #388827
Quoting Dfpolis
I am not arguing for solipsism. I take as a given that we are conscious of objects other than ourselves. Rather than questioning this datum, I am trying to understand the dynamics making it possible.

Quoting Dfpolis
I agree, but how does this allow us to distinguish body states from external states?

I suggest that it's a consequence of the neural connections being different. Consider how we distinguish the location of a pain in the left knee - it's a consequence of the specific connections from peripheral nerves to specific areas of the central nervous system, wherein we become consciously aware of the pain's location. Even after the pain is gone, the memory of the pain is unique from other conscious experiences. Visual and auditory information are also unique, and processed through unique neural paths, and this maps to conscious experiences that are also unique.

You referenced Plantinga, so perhaps you're familiar with "properly basic beliefs". Our "beliefs" about the external world are basic, baked into the mechanism (or support structure) that produces (or supports) consciousness. (I'll add that they are properly basic, because they are a consequence of evolutionary development: a functionally accurate grasp of the external world is advantageous. This is the core of my refutation of his EAAN).




Dfpolis March 06, 2020 at 14:56 #389039
Quoting Galuchat
I think that mind is an integrated set of organism events which produce an individual's automatic and controlled acts: an open sub-system of (at least certain) organisms (e.g., those having a central nervous system). But the ontology of mind is off-topic.


It seems to me that subjectivity (being a knowing and willing subject) is essential to the experience of mind. Functionalism does not cut it.
Dfpolis March 06, 2020 at 14:57 #389041
Quoting Galuchat
Do you mean rather, how does this allow us to distinguish body states from the states of other objects?


Yes, that is what I said.
Dfpolis March 06, 2020 at 15:03 #389043
Quoting Relativist
I suggest that it's a consequence of the neural connections being different.


Different how? To take your example, how do I distinguish a signal indicating the existence of a condition causing pain from a signal that says only that a pain receptor is firing? Since they are one and the same signal, I do not see how I can.
Zelebg March 06, 2020 at 15:39 #389052
Reply to Dfpolis
For those of us who are not physicalists


Just to make one thing clear. There is no such thing as “immaterial” or “non-physical”, it’s a self-contradiction. Instead, it can be undetectable, either due to our limits or in principle, but it must be made of something; otherwise it is made of nothing, and that means it does not exist.

So, if there is such a thing as a soul, it must be physical; it’s the only logically valid semantics. It’s just that the substance it is made of is for some reason invisible to us. Any other claim about the ontology of the “immaterial” is a paradox, simply gibberish. Can we all agree?
Relativist March 06, 2020 at 16:02 #389061
Quoting Dfpolis
how do I distinguish a signal indicating the existence of a condition causing pain from a signal that says only that a pain receptor is firing? Since they are one and the same signal, I do not see how I can.

When a pain receptor is fired, the mind experiences it as the quale "pain". That is the nature of the mental experience. In effect, the signal passes through a transducer that converts the physical signal into a mental experience.


Zelebg March 06, 2020 at 16:39 #389068
Reply to Dfpolis
In Descartes' Error neurophysiologist Antonio Damasio argues that our knowledge of the external world started as neural representations of body state and evolved into representations of the external world as the source of changes in our body state:


It’s most accurate and pragmatic to call it “virtual reality”, a sort of simulation, but to keep in mind that this does not necessarily imply digital computation and computer algorithms as we know them today.


I see three problems with this otherwise plausible hypothesis: (1) It requires one neural state to encode multiple concepts. (2) There seems to be no mechanism by which this "solution" could have evolved. (3) Neural states do not represent as other signs do.


From a 3rd person perspective, neural states represent mental content in the form of electromagnetic and chemical signals, just like virtual reality of a simulated content is represented inside the computer in the form of signals between the logic gates and other circuits.

The mechanics of even the simplest form of chemical reactions we call “life” are largely still a mystery. We really have no idea how the machine assembled itself, so it’s too optimistic to expect we could yet explain the ghost in the machine. But if you read between the lines of what everyone is talking about and where all the evidence points, this ghost is really just another machine in the machine, but a virtual kind of machine, and that explains everything.
Zelebg March 06, 2020 at 18:27 #389102
Reply to Dfpolis
To take your example, how do I distinguish a signal indicating the existence of a condition causing pain from a signal that says only that a pain receptor is firing? Since they are one and the same signal, I do not see how I can.


Signals, just like words, pictures, and other kinds of representations, are meaningless information by themselves. Meaning comes from the grounding inherent in a decoder / interpreter system, also called personality, identity, ego, self...

Sentience is a form of understanding, a way of coupling the signal with its meaning, so meanings / feelings are virtual properties, qualia are virtual qualities. Their ontology is virtual like that of Pacman, and in that sense virtual existence offers almost unlimited and arbitrary kinds of different properties or qualities, only visible from the “inside”, or via VR goggles if we ever figure out a way how to connect.
Dfpolis March 07, 2020 at 22:51 #389449
Thanks for your interest.

Quoting Zelebg
Just to make one thing clear. There is no such thing as “immaterial” or “non-physical”, it’s a self-contradiction.


Would you care to show the contradiction? Please define "material" and "existence" and then show that existence entails material. I ask this because on the usual understandings these terms do not mean the same thing.

Obviously, being immaterial does mean not being made of any kind of matter, but there is no logical reason why something not made of matter can't act, and so exist. For example, my intention to go to the store acts to motivate my motion toward the store. Your argument simply begs the question by assuming, a priori, that everything must be "made of something."

You might find my discussion "Intentional vs. Material Reality and the Hard Problem" of interest. In it, I show why intentional existence cannot be reduced to physical existence.

Quoting Zelebg
From a 3rd person perspective, neural states represent mental content in the form of electromagnetic and chemical signals, just like virtual reality of a simulated content is represented inside the computer in the form of signals between the logic gates and other circuits.


Of course, but what I am discussing is the first person perspective -- how it is that we know the difference between body states and object states.

Quoting Zelebg
so it’s too optimistic to expect we could yet explain the ghost in the machine.


I am not suggesting a ghost in a machine. Rather, unified humans have both physical and intentional operations, and neither is reducible to the other -- just as we cannot reduce the sphericity of a ball to the rubber it is made of.

Quoting Zelebg
Meaning comes from the grounding inherent in a decoder / interpreter system, also called personality, identity, ego, self...


While I agree, this does not solve the problem I am raising.
Dfpolis March 07, 2020 at 22:53 #389451
Quoting Relativist
When a pain receptor is fired, the mind experiences it as the quale "pain". That is the nature of the mental experience.


Yes, it does. How does this allow us to distinguish data on the sensor state from data on the sensed?
Zelebg March 07, 2020 at 23:37 #389474
Reply to Dfpolis
For example, my intention to go to the store acts to motivate my motion toward the store.


Intentions, and other mental states, feelings and qualities, are not immaterial, they are virtual.


Would you care to show the contradiction? Please define "material" and "existence" and then show that existence entails material. I ask this because on the usual understandings these terms do not mean the same thing.

Your argument simply begs the question by assuming, a priori, that everything must be "made of something."


To exist is to be (made of) something rather than nothing. No assumptions, only logic. “Material / physical” is everything that is not nothing, but material existence can also be virtual, not just actual.


Of course, but what I am discussing is the first person perspective -- how it is that we know the difference between body states and object states.

While I agree, this does not solve the problem I am raising.


Can you give examples of what you are talking about?
Relativist March 07, 2020 at 23:45 #389480
Quoting Dfpolis
Yes, it does. How does this allow us to distinguish data on the sensor state from data on the sensed?

As I said, the pain signal (in effect) reaches a transducer which produces the mental state of localized pain. Does this much sound plausible? If so, what is your specific issue?

If the mind is immaterial, as you assume, the issue seems to be: how do physical, electro-chemical signals produce the related mental states - right? It's not clear what specific issue you're focusing on. I'm just saying there has to be some sort of physical-mental transducer - that's where the magic is (the physical-mental causation).
Dfpolis March 08, 2020 at 00:56 #389506
Quoting Zelebg
Intentions, and other mental states, feelings and qualities, are not immaterial, they are virtual.


I have no idea what this means. "Virtual" usually means "potential." Clearly, my actual intentions are no longer potential.

Quoting Zelebg
To exist is to be (made of) something rather than nothing.


This is begging the question. Clearly, anything that can act in any way exists, and, as I have pointed out, many intentions act to effect motions. Others act to motivate truth claims.

Quoting Zelebg
Can you give examples of what you are talking about?


See the OP. The same signals indicating I am seeing an apple also indicate that my retinal state has changed.
Zelebg March 08, 2020 at 00:59 #389508
Reply to Dfpolis
See the OP. The same signals indicating I am seeing an apple also indicate that my retinal state has changed.


What's the problem?
Zelebg March 08, 2020 at 01:03 #389510
Reply to Dfpolis
I have no idea what this means. "Virtual" usually means "potential." Clearly, my actual intentions are no longer potential.


I said "it’s most accurate and pragmatic to call it “virtual reality”, a sort of simulation".

https://en.m.wikipedia.org/wiki/Virtual_reality
Dfpolis March 08, 2020 at 01:04 #389511
Quoting Relativist
As I said, the pain signal (in effect) reaches a transducer which produces the mental state of localized pain. Does this much sound plausible? If so, what is your specific issue?


My issue is that the same signals that indicate I am seeing an apple also indicate I am seeing (my retinal state is being modified by) an apple. So how do we use the signal to know that there is an apple, as opposed to knowing only that the state of my retina has changed?

Quoting Relativist
If the mind is immaterial, as you assume


I do not assume the mind is immaterial. I deduce from the data of experience that it has both physical and intentional operations.

Quoting Relativist
the issue seems to be: how do physical, electro-chemical signals produce the related mental states


I do not assume that "electro-chemical signals produce the related mental states." Following Aristotle, I see this as the work of the agent intellect, which acts in the intentional, not the physical, theater of operations.
Zelebg March 08, 2020 at 01:06 #389512
Reply to Dfpolis
Clearly, anything that can act in any way exists


Can "something that acts" be made of nothing, or must it be made of something?
Dfpolis March 08, 2020 at 01:08 #389513
Quoting Zelebg
I said "it’s most accurate and pragmatic to call it “virtual reality”, a sort of simulation".


Aside from the fact that this claim is wholly unsupported by data, there is no reason to suppose that simulating physical operations can generate intentional operations.
Zelebg March 08, 2020 at 02:14 #389529
Reply to Dfpolis
Aside from the fact that this claim is wholly unsupported by data, there is no reason to suppose that simulating physical operations can generate intentional operations.


It is supported by all the data, some of which I have already explained. Programs are intentions.
Relativist March 08, 2020 at 02:26 #389533
Quoting Dfpolis
I do not assume that "electro-chemical signals produce the related mental states."

I suggest that we can deduce this is the case.

Quoting Dfpolis
I do not assume that "electro-chemical signals produce the related mental states." Following Aristotle, I see this as the work of the agent intellect, which acts in the intentional, not the physical, theater of operations.

But surely you must agree that sensory perception originates in physical processes, and ultimately mental states arise. This implies there is a causal chain from the physical to the mental. This suggests that somewhere in the chain there is a final physical event followed by an initial (non-physical) mental event. There can be parallelism, but at the fundamental level, physical-mental causation has to be taking place. Mental causation entails the converse. I referred to this interface as a "transducer". It seems unavoidable if the mind is non-physical.

Quoting Dfpolis
I do not assume the mind is immaterial. I deduce
I have not deduced it, so I'm considering it a premise, for the sake of discussion. Challenging it would entail a different discussion.
Dfpolis March 08, 2020 at 03:14 #389540
Quoting Zelebg
Programs are intentions.


No, programs implement the intentions of their programmers. They themselves are signs requiring human interpreters to actually signify.
Dfpolis March 08, 2020 at 03:44 #389545
Quoting Relativist
I do not assume that "electro-chemical signals produce the related mental states." — Dfpolis

I suggest that we can deduce this is the case.


Please do so.

Quoting Relativist
surely you must agree that sensory perception originates in physical processes, and ultimately mental states arise.


I agree that neural processes are physical. Whether or not mental states arise from them depends on whether or not we attend to them. The act of attending to them is an act of awareness (aka the agent intellect).

Quoting Relativist
This implies there is a causal chain from the physical to the mental.


No, it shows that the agent intellect can transform physically encoded data to concepts (mental intentions).

Quoting Relativist
at the fundamental level, physical-mental causation has to be taking place.


Why?

Quoting Relativist
It seems unavoidable if the mind is non-physical.


Immaterial does not mean physically impotent. The laws of nature are not made of matter; nonetheless, they effect physical transformations.

Quoting Relativist
I do not assume the mind is immaterial. I deduce — Dfpolis

... Challenging it would entail a different discussion.


We had that discussion earlier, when I showed that, and why, intentional realities are not reducible to material realities.

Zelebg March 08, 2020 at 03:56 #389547
Reply to Dfpolis
No, programs implement the intentions of their programmers. They themselves are signs requiring human interpreters to actually signify.


No? Who is interpreting the signs in your DNA? And what do you call a process constrained by a set of instructions, such as processes in your body, your cells and organs, if not a program? Who wrote the function for your heartbeats, your blood flow, and your digestion? It’s all programs, and programs implement intentions too.
Dfpolis March 08, 2020 at 04:30 #389554
Quoting Zelebg
Who is interpreting the signs in your DNA?


Normally, no one. DNA does not work by being a sign, but mechanically. Hence, it needs no interpretation or interpreter. It is a "program" only by analogy with a computer program, which is to say equivocally so. (Or do you think of God as a programmer?) When it is a "sign" of a structure, the interpreter is a molecular geneticist.

Did you see "Hugo"? The automaton in it had a program that caused it to draw. No interpretation was required. So the program did not act as a sign, except in relation to a human programmer or interpreter.

Quoting Zelebg
And what do you call a process constrained by a set of instructions, such as processes in your body, your cells and organs, if not a program?


We use words analogously to cover new needs. As a result, various uses need not mean the same thing, and what they name need not work in the same way. Normal instructions and rules are signs which must be interpreted by a mind before they can be implemented. There is no such set of instructions in human physiology. Rather, there are laws of nature that act on initial physical states to produce later physical states without need of interpretation. So we must be careful not to be fooled when the same words are used with different meanings in different cases.

I am, however, glad that you see that the laws of nature are works of Mind.
Relativist March 08, 2020 at 04:44 #389559
Quoting Dfpolis
I agree that neural processes are physical. Whether or not mental states arise from them depends on whether or not we attend to them. The act of attending to them is an act of awareness (aka the agent intellect).

OK, this suggests mental states contingently arise. Nevertheless, the relevant mental states do not arise without the physical input.
Quoting Dfpolis
at the fundamental level, physical-mental causation has to be taking place.
— Relativist

Why?

Sensory perception ceases when there's a physical defect. This is strong evidence that the physical processes are in the causal chain even if there are immaterial dependencies as well (like attentiveness).

Quoting Dfpolis
Immaterial does not mean physically impotent. The laws of nature are not made of matter; nonetheless, they effect physical transformations.

Laws of nature describe physical-physical causation. Mental-physical and physical-mental causation are unique.
Quoting Dfpolis
This implies there is a causal chain from the physical to the mental.
— Relativist

No, it shows that the agent intellect can transform physically encoded data to concepts (mental intentions).

How does the physically encoded data get into an immaterial mind? How do you explain the dependency on physical processes? If you deny the dependency, why does input cease when the equipment is defective? It seems to me the only plausible explanation is that the physical processes cause immaterial mental states. The attentiveness issue doesn't refute this, it just adds a switch.

I hope you can see that I'm treating mind as an immaterial object, and merely trying to infer how the mind interfaces with the world.
Galuchat March 08, 2020 at 11:16 #389639
Quoting Dfpolis
The same signals indicating I am seeing an apple also indicate that my retinal state has changed.


Two different signals are involved in the process of sensation.
Light (one type of signal) changes retinal states. Photoreceptors (rods and cones) in the retina transform light into neural signals.

Neural signals and visual perception are related by correlation, not causation.
So, do neural codes signify conscious content?
They may signify conscious content, or semi-conscious content (e.g., sleep, automaticity, or the reception of subliminal and/or supraliminal stimuli), or both (in the case of dual processing).
Zelebg March 08, 2020 at 11:43 #389644
Reply to Dfpolis
DNA does not work by being a sign, but mechanically. Hence, it needs no interpretation or interpreter.


DNA is a set of instructions, and it is perfectly valid to call them symbols, as we do in genetics. Besides, there are no signs in the computer either; once the program starts execution, it’s all hardware. Everything is mechanical / electrical, and that has absolutely no bearing on whether it needs “interpretation” or it doesn’t. DNA replication programs and every other biological process are obvious examples.


We use words analogously to cover new needs. As a result, various uses need not mean the same thing, and what they name need not work in the same way. Normal instructions and rules are signs which must be interpreted by a mind before they can be implemented. There is no such set of instructions in human physiology. Rather, there are laws of nature that act on initial physical states to produce later physical states without need of interpretation. So we must be careful not to be fooled when the same words are used with different meanings in different cases.


Narrow thinking will not help your understanding. Those were rhetorical questions, but instead of realizing your errors, you are trying to explain what is obviously false. Are you a zombie?


I am, however, glad that you see that the laws of nature are works of Mind.


The point was, you were wrong:

1. even computer programs are not symbolic at the time of execution
2. programs do not need programmers or interpreters to run
3. programs do not need programmers to exist, they can spontaneously evolve
Dfpolis March 08, 2020 at 21:55 #389823
Quoting Relativist
OK, this suggests mental states contingently arise. Nevertheless, the relevant mental states do not arise without the physical input.


That is the usual case. However, you may wish to read W. T. Stace, Mysticism and Philosophy (1960) to have more data to reflect upon.

Quoting Relativist
Sensory perception ceases when there's a physical defect. This is strong evidence that the physical processes are in the causal chain even if there are immaterial dependencies as well (like attentiveness).


I agree that consciousness is usually awareness of neurally encoded data. I think I said that in my OP. That does not mean that neurally encoded data is sufficient, only that it is normally necessary.

Quoting Relativist
Laws of nature describe physical-physical causation. Mental-physical and physical-mental causation are unique.


Are you sure? I have argued (elsewhere) that the laws of nature are essentially intentional. Like human committed intentions, they effect ends. My arriving at the store is immanent in my initial state and decision to go to the store. The final state of a physical system is immanent in its initial state and the laws of nature. They meet Brentano's criterion for intentionality by pointing beyond themselves to later states.

Quoting Relativist
How does the physically encoded data get into an immaterial mind?


The immaterial aspect of the mind (the power to choose and attend, aka awareness) has no specific "place;" however, experience tells us it generally attends to data processed by and encoded in the brain -- and we have a reasonable idea of how data gets there.

Quoting Relativist
It seems to me the only plausible explanation is that the physical processes cause immaterial mental states.


They inform the mental states, but to inform is not to be an efficient cause. Plans may inform a process, but they do not cause the process.
Dfpolis March 08, 2020 at 22:07 #389828
Quoting Galuchat
Two different signals are involved in the process of sensation.
Light (one type of signal) changes retinal states. Photoreceptors (rods and cones) in the retina transform light signals into neural signals.


The signals in the brain indicate that the state of our rods and cones has changed.

Quoting Galuchat
Neural signals and visual perception are related by correlation, not causation.


So, in your view, no dynamics links neural signals and visual perception??

Quoting Galuchat
So, do neural codes signify conscious content?


No, and that is my point. We do not first become aware (or ever become aware) of our neural state and then interpret what that state means. For an (instrumental) sign to work we need to be aware of what the sign is, and then decide what it means. There is no such process here. So calling neural pulses "signs" only increases confusion.
Galuchat March 08, 2020 at 22:33 #389842
Quoting Dfpolis
So, in your view, no dynamics links neural signals and visual perception??


I don't know what the link is between neural signals and various types of perception. And I don't think current neuroscience has explained it.

Currently, I am inclined to think that it may be emergent (i.e., a property and/or effect not attributable to organism components in isolation or in sum).

What do you think it is?
Zelebg March 08, 2020 at 22:46 #389848
Reply to Dfpolis
There is no such process here.


What is it you expected to find, what exactly do you claim is missing?
Relativist March 08, 2020 at 23:15 #389852
Quoting Dfpolis
The immaterial aspect of the mind (the power to choose and attend, aka awareness) has no specific "place;" however, experience tells us it generally attends to data processed by and encoded in the brain -- and we have a reasonable idea of how data gets there.

Quoting Dfpolis
It seems to me the only plausible explanation is that the physical processes cause immaterial mental states. — Relativist

They inform the mental states, but to inform is not to be an efficient cause. Plans may inform a process, but they do not cause the process.

Even if your mind is not spatially located, your brain is - and there's clearly a strong connection between your mind and your brain. Your mind doesn't obtain sensory input from your next door neighbor's brain. This suggests some sort of ontic connection between something located in space and something that is not. (There is an ontic connection between positively charged and negatively charged particles).

Besides sensory input, the mind utilizes memories, and it seems the memories must be physically located in the brain, or at least some necessary neural correlates are in the brain. (By that, I mean that in the absence of those physical neural correlates, the mind cannot attend to a memory). This is the implication of memory loss due to trauma and disease. Do you agree?

You rejected my suggestion that the brain causes mental states, so I assumed you must think the brain reads and interprets neural states. But you also denied that the mind is interpreting neural states (you said to Galuchat, " We do not first become aware (or ever become aware) of our neural state and then interpret what that state means."). What's left?
Galuchat March 09, 2020 at 09:15 #390000
Quoting Dfpolis
No, and that is my point. We do not first become aware (or ever become aware) of our neural state and then interpret what that state means. For an (instrumental) sign to work we need to be aware of what the sign is, and then decide what it means. There is no such process here. So calling neural pulses "signs" only increases confusion.


Nevertheless, functional neuroimaging links the spatiotemporal properties of neural responses to the cognitive state of a subject. So, neural signals signify conscious and/or semi-conscious content.

"In each of these techniques, images generated while the subject is in one cognitive state may be directly compared, on a location-by-location basis, with images generated while the subject is in a second, different state."
Talavage, T. M., Gonzalez-Castillo, J., & Scott, S. K. (2014). Auditory neuroimaging with fMRI and PET. Hearing research, 307, 4–15. https://doi.org/10.1016/j.heares.2013.09.009.

This establishes correlation, but not causation.
Does neural response cause perception, or does perception cause neural response?