Do Neural Codes Signify Conscious Content?
For those of us who are not physicalists, it is fairly uncontroversial that conscious perception involves awareness of neurally encoded contents. That is certainly how I think of it. Yet, this simple schema turns out to be highly problematic.
In Descartes' Error, neurophysiologist Antonio Damasio argues that our knowledge of the external world started as neural representations of body state and evolved into representations of the external world as the source of changes in our body state:

Antonio Damasio: ... to ensure body survival as effectively as possible, nature, I suggest, stumbled on a highly effective solution: representing the outside world in terms of the modifications it causes in the body proper, that is representing the environment by modifying the primordial representations of the body proper whenever an interaction between organism and environment takes place. (p. 230)
I see three problems with this otherwise plausible hypothesis: (1) It requires one neural state to encode multiple concepts. (2) There seems to be no mechanism by which this "solution" could have evolved. (3) Neural states do not represent as other signs do.
First, on this view, one physical state (the object's modification of our body/neural state) represents two intelligible states (the object sensed and its effect on us). For example, in seeing an apple our retinal state is modified, so the same neural signal represents both the apple and the fact that our retina has been modified. Consequently, there is no neural basis for distinguishing data on the object from data on the subject.
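A toy sketch may make the worry concrete (the function name and the numbers are hypothetical, invented purely for illustration): the firing rate is the only datum available, so any "object" reading and any "subject" reading are decodings of one and the same value.

```python
# Hypothetical sketch: one physical state, two intended readings.
def retinal_response(red_apple_present: bool) -> float:
    """Firing rate (Hz) of a toy retinal channel."""
    return 40.0 if red_apple_present else 2.0

signal = retinal_response(True)

# Two "decodings" of the very same number:
object_reading = signal > 10.0   # "a red apple is out there"
subject_reading = signal > 10.0  # "my retina has been modified"

# Nothing in the signal itself separates the two readings.
assert object_reading == subject_reading
```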
The reason neural signals can provide no data by which we can distinguish subject and object is the identity of action and passion first noted by Aristotle: the subject sensing the object is identically the object being sensed by the subject. Given this ontological identity, there is no way to pry apart data on the subject as sensing and the object as sensed.
Clearly, sensory signals can encode different notes of intelligibility. For example, our complex perception of a ball encodes its matter and form differently. We can see and feel its sphericity, while squeezing it supports the notion of rubbery material. That is not the case with conceptualizing the difference between sensed objects and the concomitant changes in body state. There is no difference, even in principle, between a neural message saying we are seeing a red apple and one saying we are seeing (having our bodily state modified by) a red apple.
I am not arguing for solipsism. I take as a given that we are conscious of objects other than ourselves. Rather than questioning this datum, I am trying to understand the dynamics making it possible. It seems to me that grasping this difference requires a direct intuition of the object as an object, as other -- and this, or something functionally equivalent, is missing from our model.
Second, while this is not a problem for behavioral evolution, it is a problem for understanding intellection. Effective behavior can evolve independently of whether an organism responds to its internal state or to externalities. What a neural signal encodes is immaterial as long as the response to it is biologically effective (evolutionarily fit). However, if consciousness of objects is solely due to awareness of neurally encoded content, we can have no basis for thinking objects are distinct from ourselves. To do so we must grasp an intelligible difference between our self and the object, and there is none in the neural signal.
This problem is similar to that noted by Alvin Plantinga in his evolutionary argument against naturalism. Both note that what gives evolution its traction, the selection of successful structure and behavior, provides no traction in selecting certain features of mental experience. In Plantinga’s case, it provides no traction in selecting true over false beliefs. Here, it provides no traction in distinguishing self-information from information on the other.
Third, the idea that neural impulses act as signs glosses over and obscures the dynamics of sensory awareness. Signs are means of knowing. Signifiers are only potential signs unless they actually evoke a thought. Smoke, though a potential sign of fire, is operative only when used to indicate fire. Since knowing is relational, so are signs. In his Ars Logica the Portuguese Dominican John of St. Thomas (John Poinsot, 1589-1644) distinguishes formal and instrumental signs. An instrumental sign requires that we understand the sign's own nature before it can signify. A formal sign does not. Instrumental signs are things like smoke, writing, road signs, binary codes, rancid smells, etc. In order to grasp what an instrumental sign signifies, we must first grasp what the sign is in itself. For smoke to signify fire, we must grasp that it is smoke, and not dust or a cloud. For writing to signify, we must first discern the shape of letters or pictographs. If we cannot do so, such signs fail to convey meaning, to act as signs.
It is common (e.g. in representational theories of mind) to confuse mental signs such as ideas and judgements with instrumental signs such as binary codes, but they are different. Ideas are formal signs. We need not grasp the nature of a formal sign for it to signify. We do not need to grasp that the concept apple is an idea before it can signify apples. Rather, understanding that we know apples, we see that we must be employing an instrument in knowing them. Thus, we come, retrospectively, to the notion of an apple idea. The whole reality of a formal sign, all that it does and can do, is signifying.
Neurally encoded data works in neither of these ways. Neurons encode data in their firing rates, yet we grasp their encoded contents without the faintest idea of their firing rates, or indeed of anything about our brain state. In consciousness, neither our brain state in general, nor our neural firing rates in particular, function as ordinary (instrumental) signs. Since conscious awareness of contents does not involve the proprioception and interpretation of brain states, they do not function as instrumental signs in providing content to consciousness.
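The two-step structure that instrumental signification would require can be sketched as follows (the spike data, window, and threshold are all hypothetical, a neuroscientist's invented code book rather than actual physiology): the firing rate must first be grasped in itself (step 1) before it can signify anything (step 2), and it is precisely this two-step process that consciousness does not perform.

```python
# Hypothetical third-person decode of a spike train.
spike_times = [0.01, 0.03, 0.05, 0.08, 0.12]  # seconds (made-up data)

def firing_rate(spikes, window=0.5):
    """Step 1: grasp the sign in itself -- measure the rate."""
    return len(spikes) / window  # spikes per second

def interpret(rate_hz):
    """Step 2: apply a learned code book, external to the neuron."""
    return "stimulus present" if rate_hz > 5 else "baseline"

rate = firing_rate(spike_times)  # 10.0 Hz
meaning = interpret(rate)        # "stimulus present"
```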
Neither are neural firing rates formal signs in consciousness. Remember that the entire reality of a formal sign, all that it does, is to signify. Firing neurons do more than signify, and when their firing rate is used to determine their meaning in third-person observation, they are not operating as formal signs. (If a neuroscientist were to observe the firing of a neuron and discern what it signified, its firing rate would be an instrumental sign.)
Despite not fitting into prior semiotic categories, neural firing rates do encode data. A great deal of neurophysiological work supports the conclusion that the data they encode provides us with, or at least supports, the contents of consciousness. So, they function as a kind of hybrid sign: one whose intrinsic nature need not be discerned for them to signify, but which, unlike formal signs, do more than signify.
Like the problem of distinguishing self-data from object-data, this seems to intimate that we have a capacity to grasp intelligibility that is not fully modeled in our present understanding.
Comments (48)
I would argue that the ability to feel pain or pleasure is proof of either something divine (or a soul), or proof that the entire universe is one living organism and each of us, including bacteria, is just a subset, or say a cell, within that giant organism. Are you familiar with the notion of a collective conscience or collective soul? It's like the entire universe has phantom limb syndrome.
Object and subject are an ontological unity, having epistemological distinctions.
From a Cognitive viewpoint:
A neural message is a function of sensation (stimulation-response).
1) Stimulation is exogenous and/or endogenous stimulus (sensory signal) transduction by receptors, causing a response.
2) Response is the propagation of action potentials in excitable cells.
Seeing a red apple is a function of perception (sensory interpretation), specifically: vision.
The brain processes sensation data, and the mind processes perception data (these are incommensurable levels of abstraction).
From an Ecological viewpoint:
Gibson, James Jerome. 1977. The Theory of Affordances. In R. Shaw & J. Bransford (eds.). Perceiving, Acting, and Knowing: Toward an Ecological Psychology. Hillsdale, NJ: Lawrence Erlbaum.
Quoting Dfpolis
This is a function of self-awareness development, levels 2 & 3.
Rochat, Philippe. 2003. "Five Levels of Self-Awareness as They Unfold in Early Life". Consciousness and Cognition 12 (2003): 717–731.
Quoting Dfpolis
Biosemioticians would classify a "neural impulse" as a signal type of sign.
While I agree that we have an immaterial aspect that makes us subjects in the subject-object relation of knowing (a soul), the fact that each of us is a different subject, with unique experiences, makes it difficult for me to lend credence to the notion of a collective soul. I do think that there is an immaterial God, and that we can be aware of God via rational proof and direct, mystical experience.
Still, I do not see that anything you said resolves the three issues I raised. Did I miss something?
I agree with this, and suggest this may just mean we have a problematic paradigm. E.g. reference to "information" seems problematic, because information connotes meaning, and meaning entails (conscious) understanding - which seems circular, and it doesn't seem possible to ground these concepts in something physical. That doesn't prove mind is grounded in the nonphysical, it may just be an inapplicable paradigm.
Consciousness is that which mediates between stimulus and response. As such, we should consider the evolution of consciousness from the simplest (direct stimulus-response), to increasing complexity, and develop a paradigm that can be applied to the development of mediation processes. As far as I know, this has not been done.
Quoting Relativist
I agree. As I argued last year, I do not think that intentional (mental) realities can be reduced to physical realities.
Quoting Relativist
"Physical" means now the reality it calls to mind now. Its meaning may change over time (and has), but the present paradigms are based on our conceptual space as it now exists. Changing paradigms involves redefining our conceptual space, and a consequent redefinition of terms such as "physical" and "natural."
Quoting Relativist
This seems very behaviorist in conception and inadequate to the data of human mental experience.
I don't think it requires redefining "physical" and "natural", it means reconsidering the nature of our thoughts. A visual image is something distinct from the object seen, it's a functionally accurate representation of the object. In general, our conceptual basis for a thought is based on the way things seem to be, but the seemings may be illusory. It seems as if a concept is a mental object, but when employed in a thought, it may be more accurate to describe it as a particular reaction, or memory of a reaction: process and feeling, rather than object.
I'm actually in the middle of something right now. I'll get back to you later. I've been drinking a little bit so i can't quickly reply with a quick answer. Thanks for the reply.
oh ok.
Signals are not only transmitted from environment to body to mind, but also from mind to body, causing change in the environment. The capacity for motor coordination differentiates object (other) and self in the mind of a sentient being.
Quoting Relativist
Communication (including: data, encoding, code, message, transmission, conveyance, reception, decoding, information) is a good analogy for the sensation process if a physical (as opposed to only semantic) type is acknowledged.
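The analogy can be put as a minimal encode/transmit/decode pipeline. (The code book, percept names, and function names below are invented for illustration, not drawn from any theory of neural coding.)

```python
# Hypothetical communication pipeline: data -> code -> signal -> information.
CODEBOOK = {"red_apple": 0b01, "green_leaf": 0b10}
REVERSE = {code: name for name, code in CODEBOOK.items()}

def encode(percept: str) -> int:
    return CODEBOOK[percept]

def transmit(code: int) -> int:
    return code  # an ideal, noiseless channel

def decode(code: int) -> str:
    return REVERSE[code]

received = decode(transmit(encode("red_apple")))  # "red_apple"
```

Note that the decoding step presupposes a receiver that already possesses the code book, which is exactly where the semantic (as opposed to merely physical) question arises.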
It's a useful analogy in some contexts, but it may not be the best analogy for analyzing the ontology of mind. For example, we aren't going to find a physical structure that corresponds to a packet of data (from perception) or of decomposable information (like the logical constructs that define a concept). That is not sufficient grounds to dismiss physicalism; it may just mean we need a different paradigm.
I think that mind is an integrated set of organism events which produce an individual's automatic and controlled acts; so, an open sub-system of (at least certain) organisms (e.g., those having a central nervous system). But the ontology of mind is off-topic.
While I tend to agree with this, it does not explain how we distinguish the object from the subject -- which is the problem I have.
Quoting Relativist
I think I agree. I would say that the concept apple, while often conceived of as a "thing" is simply the act of thinking of apples.
I agree, but how does this allow us to distinguish body states from external states?
Do you mean rather, how does this allow us to distinguish body states from the states of other objects?
Quoting Dfpolis
I suggest that it's a consequence of the neural connections being different. Consider how we distinguish the location of a pain in the left knee - it's a consequence of the specific connections from peripheral nerves to specific areas of the central nervous system, wherein we become consciously aware of the pain's location. Even after the pain is gone, the memory of the pain is unique from other conscious experiences. Visual and auditory information are also unique, and processed through unique neural paths, and this maps to conscious experiences that are also unique.
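The "labeled line" idea gestured at here can be sketched in miniature (the labels and spike pattern are hypothetical): two indistinguishable spike trains are told apart only by which anatomical pathway they arrive on, so location comes from connectivity, not from signal content.

```python
# Hypothetical labeled-line sketch: meaning from connectivity, not content.
spike_train = [1, 0, 1, 1, 0]  # the identical pattern on every line

pathways = {
    "left_knee_nociceptor": spike_train,
    "right_hand_thermoreceptor": spike_train,
}

def localize(source_label: str, train: list) -> str:
    # The label (the anatomical connection) supplies the location;
    # the trains themselves are indistinguishable.
    return f"event at {source_label}" if any(train) else "quiet"

report = localize("left_knee_nociceptor", pathways["left_knee_nociceptor"])
```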
You referenced Plantinga, so perhaps you're familiar with "properly basic beliefs". Our "beliefs" about the external world are basic, baked into the mechanism (or support structure) that produces (or supports) consciousness. (I'll add that they are properly basic, because they are a consequence of evolutionary development: a functionally accurate grasp of the external world is advantageous. This is the core of my refutation of his EAAN).
It seems to me that subjectivity (being a knowing and willing subject) is essential to the experience of mind. Functionalism does not cut it.
Yes, that is what I said.
Different how? To take your example, how do I distinguish a signal indicating the existence of a condition causing pain from a signal that says only that a pain receptor is firing? Since they are one and the same signal, I do not see how I can.
Just to make one thing clear. There is no such thing as “immaterial” or “non-physical”, it’s a self-contradiction. Instead, it can be undetectable, either due to our limits or in principle, but it must be made of something or otherwise is made of nothing and that means it does not exist.
So, if there is such a thing as a soul, it must be physical; it's the only logically valid semantics. It's just that the substance it is made of is for some reason invisible to us. Any other claim about the ontology of the "immaterial" is a paradox, simply gibberish. Can we all agree?
When a pain receptor is fired, the mind experiences it as the quale "pain". That is the nature of the mental experience. In effect, the signal passes through a transducer that converts the physical signal into a mental experience.
It’s most accurate and pragmatic to call it “virtual reality”, a sort of simulation, but to keep in mind that does not necessarily imply digital computation and computer algorithms as we know them today.
From a 3rd person perspective, neural states represent mental content in the form of electromagnetic and chemical signals, just like virtual reality of a simulated content is represented inside the computer in the form of signals between the logic gates and other circuits.
Mechanics of even the simplest form of chemical reactions we call “life” is largely still a mystery. We really have no idea how the machine assembled itself, so it’s too optimistic to expect we could yet explain the ghost in the machine. But if you read between the lines of what everyone is talking about and where all the evidence points, this ghost is really just another machine in the machine, but a virtual kind of machine, and that explains everything.
Signals, just like words, pictures, and other kinds of representations, are meaningless information by themselves. Meaning comes from the grounding inherent in a decoder / interpreter system, also called personality, identity, ego, self...
Sentience is a form of understanding, a way of coupling the signal with its meaning, so meanings / feelings are virtual properties, qualia are virtual qualities. Their ontology is virtual like that of Pacman, and in that sense virtual existence offers almost unlimited and arbitrary kinds of different properties or qualities, only visible from the “inside”, or via VR goggles if we ever figure out a way how to connect.
Quoting Zelebg
Would you care to show the contradiction? Please define "material" and "existence" and then show that existence entails material. I ask this because on the usual understandings these terms do not mean the same thing.
Obviously, being immaterial does mean not being made of any kind of matter, but there is no logical reason why something not made of matter cannot act, and so exist. For example, my intention to go to the store acts to motivate my motion toward the store. Your argument simply begs the question by assuming, a priori, that everything must be "made of something."
You might find my discussion "Intentional vs. Material Reality and the Hard Problem" of interest. In it, I show why intentional existence cannot be reduced to physical existence.
Quoting Zelebg
Of course, but what I am discussing is the first person perspective -- how it is that we know the difference between body states and object states.
Quoting Zelebg
I am not suggesting a ghost in a machine. Rather, unified humans have both physical and intentional operations, and neither is reducible to the other -- just as we cannot reduce the sphericity of a ball to the rubber it is made of.
Quoting Zelebg
While I agree, this does not solve the problem I am raising.
Yes, it does. How does this allow us to distinguish data on the sensor state from data on the sensed?
Intentions, and other mental states, feelings and qualities, are not immaterial, they are virtual.
To exist is to be (made of) something rather than nothing. No assumptions, only logic. “Material / physical” is everything that is not nothing, but material existence can also be virtual, not just actual.
Can you give examples of what you are talking about?
As I said, the pain signal (in effect) reaches a transducer which produces the mental state of localized pain. Does this much sound plausible? If so, what is your specific issue?
If the mind is immaterial, as you assume, the issue seems to be: how do physical, electro-chemical signals produce the related mental states - right? It's not clear what specific issue you're focusing on. I'm just saying there has to be some sort of physical-mental transducer - that's where the magic is (the physical-mental causation).
I have no idea what this means. "Virtual" usually means "potential." Clearly, my actual intentions are no longer potential.
Quoting Zelebg
This is begging the question. Clearly, anything that can act in any way exists, and, as I have pointed out, many intentions act to effect motions. Others act to motivate truth claims.
Quoting Zelebg
See the OP. The same signals indicating I am seeing an apple also indicate that my retinal state has changed.
What's the problem?
I said "it’s most accurate and pragmatic to call it “virtual reality”, a sort of simulation".
https://en.m.wikipedia.org/wiki/Virtual_reality
My issue is that the same signals indicate I am seeing an apple as indicate I am seeing (my retinal state is being modified by) an apple. So how do we use the signal to know that there is an apple, as opposed to knowing only that the state of my retina has changed?
Quoting Relativist
I do not assume the mind is immaterial. I deduce from the data of experience that it has both physical and intentional operations.
Quoting Relativist
I do not assume that "electro-chemical signals produce the related mental states." Following Aristotle, I see this as the work of the agent intellect, which acts in the intentional, not the physical, theater of operations.
Can "something that acts" be made of nothing, or it must be made of something?
Aside from the fact that this claim is wholly unsupported by data, there is no reason to suppose that simulated physical operations can generate intentional operations.
It is supported by all the data, some of which I already explained. Programs are intentions.
I suggest that we can deduce this is the case.
Quoting Dfpolis
But surely you must agree that sensory perception originates in physical processes, and ultimately mental states arise. This implies there is a causal chain from the physical to the mental. This suggests that somewhere in the chain, there is a final physical event followed by an initial (non-physical) mental event. There can be parallelism, but at the fundamental level, physical-mental causation has to be taking place. Mental causation entails the converse. I referred to this interface as a "transducer". It seems unavoidable if the mind is non-physical.
Quoting Dfpolis
I have not deduced it, so I'm considering it a premise, for the sake of discussion. Challenging it would entail a different discussion.
No, programs implement the intentions of their programmers. They themselves are signs requiring human interpreters to actually signify.
Please do so.
Quoting Relativist
I agree that neural processes are physical. Whether or not mental states arise from them depends on whether or not we attend to them. The act of attending to them is an act of awareness (aka the agent intellect).
Quoting Relativist
No, it shows that the agent intellect can transform physically encoded data to concepts (mental intentions).
Quoting Relativist
Why?
Quoting Relativist
Immaterial does not mean physically impotent. The laws of nature are not made of matter; nonetheless, they effect physical transformations.
Quoting Relativist
We had that discussion earlier, when I showed that and why intentional realities are not reducible to material realities.
No? Who is interpreting the signs in your DNA? And what do you call a process constrained by a set of instructions, such as processes in your body, your cells and organs, if not a program? Who wrote the function for your heartbeats, your blood flow, and your digestion? It’s all programs, and programs implement intentions too.
Normally, no one. DNA does not work by being a sign, but mechanically. Hence, it needs no interpretation or interpreter. It is a "program" only in a sense analogous to a computer program, which is to say equivocally so. (Or do you think of God as a programmer?) When it is a "sign" of a structure, the interpreter is a molecular geneticist.
Did you see "Hugo"? The automaton in it had a program that caused it to draw. No interpretation was required. So the program did not act as a sign, except in relation to a human programmer or interpreter.
Quoting Zelebg
We use words analogously to cover new needs. As a result, various uses need not mean the same thing, and what they name need not work in the same way. Normal instructions and rules are signs which must be interpreted by a mind before they can be implemented. There is no such set of instructions in human physiology. Rather, there are laws of nature that act on initial physical states to produce later physical states without need of interpretation. So we must be careful not to be fooled when the same words are used with different meanings in different cases.
I am, however, glad that you see that the laws of nature are works of Mind.
OK, this suggests mental states contingently arise. Nevertheless, the relevant mental states do not arise without the physical input.
Quoting Dfpolis
Sensory perception ceases when there's a physical defect. This is strong evidence that the physical processes are in the causal chain even if there are immaterial dependencies as well (like attentiveness).
Quoting Dfpolis
Laws of nature describe physical-physical causation. Mental-physical and physical-mental causation are unique.
Quoting Dfpolis
How does the physically encoded data get into an immaterial mind? How do you explain the dependency on physical processes? If you deny the dependency, why does input cease when the equipment is defective? It seems to me the only plausible explanation is that the physical processes cause immaterial mental states. The attentiveness issue doesn't refute this, it just adds a switch.
I hope you can see that I'm treating mind as an immaterial object, and merely trying to infer how the mind interfaces with the world.
Two different signals are involved in the process of sensation.
Light (one type of signal) changes retinal states. Photoreceptors (rods and cones) in the retina transform light into neural signals.
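That two-stage handoff can be sketched as a toy pipeline (the function names and numbers below are illustrative, not physiological):

```python
# Hypothetical phototransduction sketch: light -> graded response -> firing rate.
def photoreceptor(light_intensity: float) -> float:
    """Rods/cones transduce light into a graded response in [0, 1]."""
    return min(light_intensity / 100.0, 1.0)

def ganglion_rate(graded: float) -> float:
    """Downstream cells convert the graded response into a rate (Hz)."""
    return 2.0 + 48.0 * graded  # 2 Hz baseline, 50 Hz ceiling

rate = ganglion_rate(photoreceptor(60.0))  # roughly 30.8 Hz in this toy model
```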
Neural signals and visual perception are related by correlation, not causation.
So, do neural codes signify conscious content?
They may signify conscious content, or semi-conscious content (e.g., sleep, automaticity, or the reception of subliminal and/or supraliminal stimuli), or both (in the case of dual processing).
DNA is a set of instructions and it is perfectly valid to call them symbols, as we do in genetics. Besides, there are no signs in the computer either, once the program starts execution it’s all hardware. Everything is mechanical / electrical, and that has absolutely no bearing on whether it needs “interpretation” or it doesn’t. DNA replication programs and every other biological process are obvious examples.
Narrow thinking will not help your understanding. Those were rhetorical questions, but instead of realizing your errors, you are trying to explain what is obviously false. Are you a zombie?
The point was, you were wrong:
1. even computer programs are not symbolic at the time of execution
2. programs do not need programmers or interpreters to do their program
3. programs do not need programmers to exist, they can spontaneously evolve
That is the usual case. However, you may wish to read W. T. Stace, Mysticism and Philosophy (1960) to have more data to reflect upon.
Quoting Relativist
I agree that consciousness is usually awareness of neurally encoded data. I think I said that in my OP. That does not mean that neurally encoded data is sufficient, only that it is normally necessary.
Quoting Relativist
Are you sure? I have argued (elsewhere) that the laws of nature are essentially intentional. Like humanly committed intentions, they effect ends. My arriving at the store is immanent in my initial state and decision to go to the store. The final state of a physical system is immanent in its initial state and the laws of nature. They meet Brentano's criterion for intentionality by pointing beyond themselves to later states.
Quoting Relativist
The immaterial aspect of the mind (the power to choose and attend, i.e., awareness) has no specific "place;" however, experience tells us it generally attends to data processed by and encoded in the brain -- and we have a reasonable idea of how data gets there.
Quoting Relativist
They inform the mental states, but to inform is not to be an efficient cause. Plans may inform a process, but they do not cause the process.
The signals in the brain indicate that the state of our rods and cones has changed.
Quoting Galuchat
So, in your view, no dynamics links neural signals and visual perception??
Quoting Galuchat
No, and that is my point. We do not first become aware (or ever become aware) of our neural state and then interpret what that state means. For an (instrumental) sign to work we need to be aware of what the sign is, and then decide what it means. There is no such process here. So calling neural pulses "signs" only increases confusion.
I don't know what the link is between neural signals and various types of perception. And I don't think current neuroscience has explained it.
Currently, I am inclined to think that it may be emergent (i.e., a property and/or effect not attributable to organism components in isolation or in sum).
What do you think it is?
What is it you expected to find, what exactly do you claim is missing?
Quoting Dfpolis
Even if your mind is not spatially located, your brain is - and there's clearly a strong connection between your mind and your brain. Your mind doesn't obtain sensory input from your next door neighbor's brain. This suggests some sort of ontic connection between something located in space and something that is not. (There is an ontic connection between positively charged and negatively charged particles).
Besides sensory input, the mind utilizes memories, and it seems the memories must be physically located in the brain, or at least some necessary neural correlates are in the brain. (By that, I mean that in the absence of those physical neural correlates, the mind cannot attend to a memory). This is the implication of memory loss due to trauma and disease. Do you agree?
You rejected my suggestion that the brain causes mental states, so I assumed you must think the mind reads and interprets neural states. But you also denied that the mind is interpreting neural states (you said to Galuchat, "We do not first become aware (or ever become aware) of our neural state and then interpret what that state means."). What's left?
Nevertheless, functional neuroimaging links the spatiotemporal properties of neural response and the cognitive state of a subject. So, neural signals signify conscious and/or semi-conscious content.
"In each of these techniques, images generated while the subject is in one cognitive state may be directly compared, on a location-by-location basis, with images generated while the subject is in a second, different state."
Talavage, T. M., Gonzalez-Castillo, J., & Scott, S. K. (2014). Auditory neuroimaging with fMRI and PET. Hearing research, 307, 4–15. https://doi.org/10.1016/j.heares.2013.09.009.
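The location-by-location comparison Talavage et al. describe can be illustrated with a toy sketch. This is not their analysis pipeline, just a minimal, assumed setup: two stacks of simulated scans (one per cognitive state) compared voxel by voxel with a simple two-sample t-statistic; the array shapes, voxel location, and function name are all hypothetical.

```python
import numpy as np

def voxelwise_contrast(state_a, state_b):
    """Compare two stacks of images voxel by voxel.

    state_a, state_b: arrays of shape (n_scans, x, y), repeated scans
    acquired while the subject is in cognitive state A or B.
    Returns the per-voxel t-statistic of the mean difference (A - B),
    assuming equal scan counts and roughly equal variances.
    """
    diff = state_a.mean(axis=0) - state_b.mean(axis=0)
    n = state_a.shape[0]
    # standard error of the difference of the two per-voxel means
    se = np.sqrt((state_a.var(axis=0, ddof=1) + state_b.var(axis=0, ddof=1)) / n)
    return diff / se

# toy example: 8 scans of a 4x4 "image"; one voxel responds in state A
rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, (8, 4, 4))
b = rng.normal(0.0, 1.0, (8, 4, 4))
a[:, 1, 2] += 5.0  # simulated activation at voxel (1, 2)

t = voxelwise_contrast(a, b)
# voxel with the strongest between-state difference
print(np.unravel_index(np.abs(t).argmax(), t.shape))
```

Note that such a contrast map, however refined, only localizes where activity differs between cognitive states; it establishes correlation between neural response and cognitive state, not the direction of causation at issue in this thread.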
This establishes correlation, but not causation.
Does neural response cause perception, or does perception cause neural response?