You are viewing the historical archive of The Philosophy Forum.
For current discussions, visit the live forum.

Integrated Information Theory

frank May 26, 2021 at 13:23 11825 views 171 comments
I'm going to describe IIT, based on the scholarpedia page.

IIT, originated by Giulio Tonini, is an attempt to specify the system requirements for consciousness. It starts with "axioms", which are aspects of consciousness derived from phenomenology. Based on these axioms, it presents "postulates", or specs for a system that has these axiomatic attributes.

In formulating the axioms, Tonini uses these criteria:

1. About experience itself;
2. Evident: they should be immediately given, not requiring derivation or proof;
3. Essential: they should apply to all my experiences;
4. Complete: there should be no other essential property characterizing my experiences;
5. Consistent: it should not be possible to derive a contradiction among them; and
6. Independent: it should not be possible to derive one axiom from another.

Next: a closer look at the axioms:


Comments (171)

jgill May 26, 2021 at 18:45 #542511
I looked into this briefly several years ago and recall that, apart from conceptual issues, the math is difficult to employ.
frank May 26, 2021 at 19:25 #542521
Reply to jgill Difficult to employ, as in you can't get there from here?
jgill May 26, 2021 at 20:13 #542530
From Wiki: "The calculation of even a modestly-sized system's Φ^Max is often computationally intractable,[6] so efforts have been made to develop heuristic or proxy measures of integrated information. For example, Masafumi Oizumi and colleagues have developed both Φ*[7] and geometric integrated information or Φ^G,[8] which are practical approximations for integrated information. These are related to proxy measures developed earlier by Anil Seth and Adam Barrett.[9] However, none of these proxy measures have a mathematically proven relationship to the actual Φ^Max value, which complicates the interpretation of analyses that use them. They can give qualitatively different results even for very small systems.[10]

A significant computational challenge in calculating integrated information is finding the Minimum Information Partition of a neural system, which requires iterating through all possible network partitions. To solve this problem, Daniel Toker and Friedrich T. Sommer have shown that the spectral decomposition of the correlation matrix of a system's dynamics is a quick and robust proxy for the Minimum Information Partition."
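
To see concretely why iterating through all partitions blows up, here is a toy sketch (my own illustration, not code from the article) that just enumerates the bipartitions of a small system:

```python
from itertools import combinations

def bipartitions(elements):
    """Yield every split of a set of nodes into two non-empty parts.

    Exact phi requires evaluating an information measure over
    partitions like these, so the search space alone is exponential.
    """
    elements = list(elements)
    n = len(elements)
    for r in range(1, n // 2 + 1):
        for part in combinations(elements, r):
            rest = tuple(e for e in elements if e not in part)
            # skip mirror-image duplicates when both halves are the same size
            if r == n - r and part > rest:
                continue
            yield part, rest

# There are 2^(n-1) - 1 bipartitions of an n-node system:
# 511 for 10 nodes, over half a trillion for 40 nodes.
print(sum(1 for _ in bipartitions(range(10))))  # -> 511
```

And that only counts splits into two parts; the full Minimum Information Partition search is worse still.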
frank May 26, 2021 at 20:41 #542535
Reply to jgill
Yes, I'm aware of that problem. I want to understand the whole approach. Do you think the problem with calculating phi is insurmountable?
jgill May 26, 2021 at 22:28 #542560
Reply to frank Well, it sounds like people are working on it. I suppose I question attempting to apply math in this context. A lot depends on whether predictions are calculated and match reality. :chin:
frank May 27, 2021 at 15:57 #542895
Reply to jgill :up:

Tonini uses the axioms to specify what he wants a target system to support. The first is that consciousness is intrinsic, by which he means:

Tonini, Scholarpedia article posted in OP: Consciousness exists: each experience is actual—indeed, that my experience here and now exists (it is real) is the only fact I can be sure of immediately and absolutely. Moreover, my experience exists from its own intrinsic perspective, independent of external observers (it is intrinsically real or actual).


Elsewhere, Tonini says that Galileo took the observer out of science, and that we will now put it back in. It's in this light that we should understand this axiom. The emphasis here is on the view of the observer.

Typically, neuroscientists observe consciousness. They record what a subject reports, which is an example of a behavioral correlate of consciousness (BCC), and link that up in some way to neuronal correlates (NCC). Tonini wants to go beyond that approach and just start with experiences as intrinsic.

Is he warranted to do that? Does it matter?
Cuthbert May 27, 2021 at 17:01 #542911
It has MICE in it so it can't be all bad.
frank May 27, 2021 at 18:03 #542932
Quoting Cuthbert
It has MICE in it so it can't be all bad.


As long they didn't chew on the insulation.
jgill May 27, 2021 at 20:21 #542968
Quoting frank
Tonini uses the axioms . . .


It's Tononi. His system has as axiomatic the existence of consciousness. I agree and have the same perception of time. Both simply exist. And unraveling qualia or dissecting time seems wasted effort.

frank May 27, 2021 at 20:37 #542973
Reply to jgill So you agree with Tonini. Cool.
Daemon May 27, 2021 at 21:52 #543032
Reply to frank I'm interested to see the axioms.
jgill May 27, 2021 at 22:28 #543057
Quoting frank
So you agree with Tonini


It's Giulio Tononi. :roll:
frank May 27, 2021 at 22:39 #543064
Reply to jgill I know. The i is right beside the o.
frank May 27, 2021 at 22:41 #543067
Quoting Daemon

I'm interested to see the axioms.


I'll scoot through the rest of them. That first one kind of sets the frame.
frank May 27, 2021 at 22:50 #543073
The rest of the axioms are as follows per Tononi:

Composition

Consciousness is structured: each experience is composed of multiple phenomenological distinctions, elementary or higher-order. For example, within one experience I may distinguish a book, a blue color, a blue book, the left side, a blue book on the left, and so on.

Information

Consciousness is specific: each experience is the particular way it is—being composed of a specific set of specific phenomenal distinctions—thereby differing from other possible experiences (differentiation). For example, an experience may include phenomenal distinctions specifying a large number of spatial locations, several positive concepts, such as a bedroom (as opposed to no bedroom), a bed (as opposed to no bed), a book (as opposed to no book), a blue color (as opposed to no blue), higher-order “bindings” of first-order distinctions, such as a blue book (as opposed to no blue book), as well as many negative concepts, such as no bird (as opposed to a bird), no bicycle (as opposed to a bicycle), no bush (as opposed to a bush), and so on. Similarly, an experience of pure darkness and silence is the particular way it is—it has the specific quality it has (no bedroom, no bed, no book, no blue, nor any other object, color, sound, thought, and so on). And being that way, it necessarily differs from a large number of alternative experiences I could have had but I am not actually having.

Integration

Consciousness is unified: each experience is irreducible to non-interdependent, disjoint subsets of phenomenal distinctions. Thus, I experience a whole visual scene, not the left side of the visual field independent of the right side (and vice versa). For example, the experience of seeing the word “BECAUSE” written in the middle of a blank page is irreducible to an experience of seeing “BE” on the left plus an experience of seeing “CAUSE” on the right. Similarly, seeing a blue book is irreducible to seeing a book without the color blue, plus the color blue without the book.

Exclusion

Consciousness is definite, in content and spatio-temporal grain: each experience has the set of phenomenal distinctions it has, neither less (a subset) nor more (a superset), and it flows at the speed it flows, neither faster nor slower. For example, the experience I am having is of seeing a body on a bed in a bedroom, a bookcase with books, one of which is a blue book, but I am not having an experience with less content—say, one lacking the phenomenal distinction blue/not blue, or colored/not colored; or with more content—say, one endowed with the additional phenomenal distinction high/low blood pressure.[2] Moreover, my experience flows at a particular speed—each experience encompassing say a hundred milliseconds or so—but I am not having an experience that encompasses just a few milliseconds or instead minutes or hours.[3]

So
1. Intrinsic, a single perspective
2. Composition, a discernible structure
3. Information, each experience is distinct
4. Integration, experience is unified
5. Exclusion, experience has a definite grain

What follows next are the postulates, or characteristics of a system that is conscious. These postulates match up with the axioms.
Daemon May 27, 2021 at 22:55 #543076
This is enjoyable, thank you Frank. Off to beddybyes now.
jgill May 28, 2021 at 04:30 #543173
Quoting frank
These postulates match up with the axioms.


Axioms and postulates are generally considered the same things. Does Tononi distinguish between them?
frank May 28, 2021 at 13:10 #543251
Quoting jgill
Axioms and postulates are generally considered the same things. Does Tononi distinguish between them?


Yes, for the sake of distinguishing between the description of consciousness (axioms) and the system requirements (postulates). More later.
frank May 28, 2021 at 15:50 #543324
This is a fascinating sentence:


"Note that these postulates are inferences that go from phenomenology to physics, not the other way around. This is because the existence of one’s consciousness and its other essential properties is certain, whereas the existence and properties of the physical world are conjectures, though very good ones, made from within our own consciousness."

It's Descartes 2.0.
frank May 28, 2021 at 23:25 #543532
@Isaac
Can you treat a neuron like a logic gate?
Daemon May 29, 2021 at 09:55 #543686
Reply to frank I know this one, ask me sir!!

frank May 29, 2021 at 11:16 #543700
Quoting Daemon

I know this one, ask me sir!!


Cool. Can you?
Daemon May 29, 2021 at 11:35 #543705
You can treat a neuron as a logic gate, but that's not what it is. Here are some reasons why not, taken from various sections of The Idea of the Brain by Matthew Cobb:

1. A neuron can secrete several different types of neurotransmitter into the synapse.
2. Even in a simple circuit each neuron is connected to many other neurons both by chemical synapses and by what are called gap junctions.
3. Neuronal activity can be altered by neuromodulators, neuropeptides and other compounds that are secreted alongside neurotransmitters and which function as relatively slow-acting mini-hormones, locally altering the activity of neighbouring neurons.
4. The activity of each neuron is affected not only by its identity (that is by the genes that determine its position and function), but also by the previous activity of the neuron.
5. Structures in the brain are not modules that are isolated from one another - they are not like the self-contained components of a machine...neurons and networks of neurons are interconnected and able to affect adjoining regions by changing not only the activity of neighbouring structures but also the patterns of gene expression.
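
For the narrow sense in which the answer is yes: the abstraction at issue is the McCulloch-Pitts threshold unit, which implements Boolean gates purely by choice of weights and a firing threshold. A minimal sketch (my own illustration, with made-up weights), just to show what the abstraction keeps and what the list above shows it throws away:

```python
def threshold_neuron(weights, threshold):
    """McCulloch-Pitts unit: output 1 iff the weighted input sum
    reaches the threshold. This is the whole abstraction; everything
    listed above (neuromodulators, gap junctions, history, ...) is
    what it discards."""
    def fire(*inputs):
        return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)
    return fire

# Made-up weights and thresholds realizing standard gates:
AND = threshold_neuron([1, 1], 2)
OR = threshold_neuron([1, 1], 1)
NOT = threshold_neuron([-1], 0)

print(AND(1, 1), OR(0, 1), NOT(1))  # -> 1 1 0
```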
frank May 29, 2021 at 12:42 #543729
Quoting Daemon
4. The activity of each neuron is affected not only by its identity (that is by the genes that determine its position and function), but also by the previous activity of the neuron.


Oh, this is why the theory emphasizes causality within the system itself.

Thanks for the explanation.

frank May 29, 2021 at 13:00 #543733
The first postulate is supposed to explain the existence of the first axiom: that consciousness is intrinsic, or independent of an observer. IOW, you're conscious and have access to a certain POV whether anyone else is around or not.

IIT says this requires a system that is causally open to its environment in both directions, but that also has causation internal to the system.


Or in Tononi's words:

"To account for the intrinsic existence of experience, a system constituted of elements in a state must exist intrinsically (be actual): specifically, in order to exist, it must have cause-effect power, as there is no point in assuming that something exists if nothing can make a difference to it, or if it cannot make a difference to anything.[7] Moreover, to exist from its own intrinsic perspective, independent of external observers, a system of elements in a state must have cause-effect power upon itself, independent of extrinsic factors. Cause-effect power can be established by considering a cause-effect space with an axis for every possible state of the system in the past (causes) and future (effects). Within this space, it is enough to show that an “intervention” that sets the system in some initial state (cause), keeping the state of the elements outside the system fixed (background conditions), can lead with probability different from chance to its present state; conversely, setting the system to its present state leads with probability above chance to some other state (effect)."
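
The last move in that quote can be mocked up in a few lines. A toy sketch (my own invented two-node update rule, not an example from the article): with deterministic dynamics, intervening to set a past state pins down the present state with probability 1, well above the chance level of 1/4:

```python
import itertools

# Hypothetical two-node system: A copies B, B negates A.
# (An invented toy, not Tononi's formalism.)
def step(state):
    a, b = state
    return (b, 1 - a)

states = list(itertools.product((0, 1), repeat=2))

present = (1, 0)
# Which interventions ("causes") lead to the present state?
causes = [s for s in states if step(s) == present]

# Deterministic dynamics: a cause fixes the present state with
# probability 1, versus a chance level of 1/4 over four states.
p_given_cause = 1.0 if causes else 0.0
p_chance = 1 / len(states)
print(causes, p_given_cause > p_chance)  # -> [(1, 1)] True
```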

Daemon May 29, 2021 at 21:40 #543991
What interests me is what constitutes a "system". How is the boundary between the "inside" and the "outside" of the system established?
frank May 29, 2021 at 22:03 #544004
Quoting Daemon
What interests me is what constitutes a "system". How is the boundary between the "inside" and the "outside" of the system established?


Good question. In a YouTube lecture, Christof Koch emphasized that the hardware they're thinking of is neurons, period. So the boundary is the surface of the brain?
Daemon May 29, 2021 at 23:23 #544048
But the relevant "system" is the whole body. As well as the neurons there is a blood supply, the biochemical bath the neurons are immersed in, the spine, the nervous system, the sense organs.
frank May 29, 2021 at 23:28 #544051
Quoting Daemon
But the relevant "system" is the whole body.


But there's no consciousness associated with your liver; in fact, consciousness doesn't even require a cerebellum. Yet if we cut the brain in half, we get two conscious entities in one skull.

I don't think they know exactly what the target hardware is. I don't know why it's right to zero in on neurons. What do you think?
fishfry May 29, 2021 at 23:55 #544065
Quoting frank
IIT, originated by Giulio Tonini,


Scott Aaronson debunkificated this a while back. David Chalmers shows up in the comment section.

https://www.scottaaronson.com/blog/?p=1799
frank May 30, 2021 at 00:04 #544068
Reply to fishfry
Cool. Thanks.
Daemon May 30, 2021 at 00:12 #544073
I only know what you've written here about IIT. I'm not at all sure what it is a theory of.

Is it to be implemented on a digital computer? You mentioned logic gates.

That would be a non-starter, as far as modelling the brain is concerned, for the reasons outlined above, and more. For example there are wave-like phenomena involving large groups of neurons, also what I think is called back-propagation, with impulses travelling both upstream and down.
Daemon May 30, 2021 at 08:48 #544216
Reply to frank
But there's no consciousness associated with your liver,

I still maintain that the relevant system is the entire body. The brain is not modular like a man-made machine (see above) and neither is the body. The brain relies on a supply of blood, and the liver plays a major role in providing that.
frank May 30, 2021 at 11:06 #544237
Quoting Daemon
Is it to be implemented on a digital computer? You mentioned logic gates.


The idea is to be able to make predictions about whether a system is conscious. I'm not sure how they would test it, though.

Quoting Daemon
The brain is not modular like a man-made machine (see above) and neither is the body. The brain relies on a supply of blood, and the liver plays a major role in providing that.


I guess the assumption behind excluding most of the body is that it powers the system which produces consciousness without participating in consciousness.
Daemon May 30, 2021 at 11:26 #544240
Reply to frank I think the whole thing is scientifically naive. I was thinking about parts of the body that don't participate in consciousness, I thought of hair. But try stroking your hair.

Consciousness is embodied.
magritte May 30, 2021 at 12:15 #544248
Quoting frank

IIT, based on the scholarpedia page.
In formulating the axioms, Tononi uses these criteria:
1. About experience itself;


Quoting fishfry
Scott Aaronson debunkificated this


For assertion 1, the philosophical question is what x is, if it is anything at all. Since experience is private there is no way to answer that except by claiming that experience-in-itself exists as a Platonic concept and as a corresponding linguistic proxy.
In astronomy there are the analogously fuzzy notions of dark matter and dark energy which are postulated to reify their effects on galaxy clusters and on theoretical universal expansion. Neither can be directly seen and identified as objects but physicists can justify supposing that they necessarily exist.

Tononi's phi would be a basis for an objective measure of something-or-another that he labels as experience/consciousness. It is not a measure of my mentality before my first cup of coffee, but what it might do is define tononi-ness, an entirely different thing with hopefully some connection to what is generally thought of by others. Whether that is meaningful or just useful would depend on physical implementation of measuring and classifying phis for various living and inanimate subjects. If the quantification of a cat's phi lies somewhere between Tononi's and a sunflower's then he will have achieved some success.
frank May 30, 2021 at 12:49 #544255
Quoting Daemon
think the whole thing is scientifically naive. I was thinking about parts of the body that don't participate in consciousness, I thought of hair. But try stroking your hair.

Consciousness is embodied.


It's not obvious to me that consciousness requires a system in possession of a liver. That it needs a power source, yes. Since there's a certain amount of closed causation, intuition says there's some work involved.

Does it need a liver to filter and provide digestive enzymes? I don't know.

Downstream we may realize it does, but I think we can start with the assumption that we don't need it. Since we can change out your liver without altering your consciousness, maybe you don't.

Quoting Daemon
Consciousness is embodied.


What does this tell us?
frank May 30, 2021 at 12:53 #544256
Quoting magritte
Neither can be directly seen and identified as objects but physicists can justify supposing that they necessarily exist.


Dark energy is at the root of the present crisis in cosmology. The crisis (having to do with conflicting measurements of the universe's expansion) promises to increase our understanding of it.

Yay for linguistic proxies.
original2 May 30, 2021 at 16:33 #544324
Reply to fishfry

Scott Aaronson debunkificated this a while back.


Aaronson demonstrated that IIT significantly fails to match some of our intuitions about consciousness, so in my opinion IIT isn't correct, but it still matches some other of those intuitions very well. It may be used, for example, to recognize that a group of humans doesn't constitute a conscious entity greater than a single human.

In my opinion IIT is a great attempt at solving consciousness, a surprisingly serious approach to a non-trivial philosophical problem. I'd really like to see it tweaked to make better classifiers of conscious entities.

I think defining consciousness using only the flow of information is lacking. For starters I'd include that a conscious entity needs to recognize patterns in this information.
Daemon May 30, 2021 at 21:40 #544488
Reply to frank Consciousness may not require a liver, let's say a lobster is conscious and doesn't have a liver, but the lobster or human does need to have the equipment to remain alive and...conscious. If the human hadn't had a liver to start with, it wouldn't have become conscious.

I do think the idea of the brain and body not being modular is an important one. They haven't been designed in the way we design machines. I don't know how much bearing this has on the maths side of it, that is completely beyond me. But if they are really equating neurons to logic gates, the maths is just meaningless I think. The actual mechanisms are so much more messy, plastic, multifaceted than binary logic gates.

Brains and bodies don't work by processing information. The brain works through things like neurotransmitters, neuromodulators, electrical impulses, wave-like phenomena. Calling all this "information processing" doesn't tell us anything more about what is happening.

"Patterns" is another troublesome concept @original2. Consider a relatively simple biological process to see why. Bacteria can swim towards a desirable stimulus (let's say some sugar) or away from a toxic chemical. It seems they would need to recognise a kind of pattern in the increasing or decreasing concentration of the chemical. But in fact we know every detail of the chemical process that achieves the directional swimming, and there's nothing left for "pattern recognition" to do.

The same is true of the more complex processes in the brain. It works through things like electrical impulses and so on, not pattern recognition.

Pattern recognition is something a person does, not their brain.

What does the embodied mind tell us @frank? I suppose it tells us that Tononi isn't seeing the whole picture?
frank May 30, 2021 at 23:48 #544584
Quoting Daemon
Consciousness may not require a liver, let's say a lobster is conscious and doesn't have a liver, but the lobster or human does need to have the equipment to remain alive and...conscious. If the human hadn't had a liver to start with, it wouldn't have become conscious.


Sure. You need some way to keep the brain alive. We can take over the functions of the heart, lungs, and kidneys with machinery. Hospitals do it every day. The patient can be wide awake while being supported in this way. So whether the body is modular or not, whether a human needs a liver, I think that's a side issue.

Quoting Daemon
The actual mechanisms are so much more messy, plastic, multifaceted than binary logic gates.


But it's not like some sort of mystical fuzz. Is it?

Quoting Daemon
Bacteria can swim towards a desirable stimulus (let's say some sugar) or away from a toxic chemical. It seems they would need to recognise a kind of pattern in the increasing or decreasing concentration of the chemical.


I don't see why it would need to recognize a pattern.

Daemon May 31, 2021 at 08:57 #544749
Quoting frank
We can take over the functions of the heart, lungs, and kidneys with machinery. Hospitals do it every day. The patient can be wide awake while being supported in this way. So whether the body is modular or not, whether a human needs a liver, I think that's a side issue.


There's a point here which I haven't yet properly expressed or thought through (which makes it interesting, to me anyway).

When the hospital takes over vital functions they are taking over something that's already in operation, that already has to be in operation for the person to be in a position to be conscious at all. We can't make the whole thing from scratch, using machinery.

And the brain can't be isolated from the rest of the body, it's enmeshed with the rest of the body. There's no sense in thinking about it operating in isolation, it would have nothing to operate with.

Or to look at it from another angle, feeling is primary.

Quoting frank
But it's not like some sort of mystical fuzz. Is it?


Why should it be? Why introduce the idea of mystical fuzz? Seems to me this is to accept the categorisations of Cartesian dualism.

Quoting frank
I don't see why it would need to recognize a pattern.


And I don't see why the brain would need to recognise a pattern in information as @original2 has suggested. A person can recognise a pattern, a brain can't (it's the homunculus fallacy).

frank May 31, 2021 at 13:15 #544786
Quoting Daemon
When the hospital takes over vital functions they are taking over something that's already in operation, that already has to be in operation for the person to be in a position to be conscious at all. We can't make the whole thing from scratch, using machinery.


I understand what you're saying. The body comes as a package.

If we want to know which parts drive blood pressure, we can pick out the heart and kidneys. We know your foot isn't part of it.

But for you consciousness is different from that kind of function? We can't identify a body part that produces it?

Quoting Daemon
And the brain can't be isolated from the rest of the body, it's enmeshed with the rest of the body.


The brain actually is isolated by the blood brain barrier. The CNS has its own private immune system. At 2 in the morning while studying A&P, it might occur to a student that the CNS looks like an alien that invaded a tetrapod.

But I digress. :cool:


Daemon May 31, 2021 at 16:07 #544831
Quoting frank
But for you consciousness is different from that kind of function? We can't identify a body part that produces it?


We can identify it in an abstract sense, but not in a practical sense, as we can with a manmade machine.

We have "brainoids" now, grown from adult human skin cells. But unless they are connected to sense organs, and yes, things like feet, they can't do what real brains do. There isn't anything for them to be conscious of.

frank May 31, 2021 at 17:20 #544845
Reply to Daemon
I see what you're saying. The IIT approach would be like starting with the assumptions that we digest food and that some parts of the body are causing that.

We would then start by specifying the parameters of digestion: food goes in, food breaks down, the body keeps the good parts and throws away what it doesn't want.

Then we would hypothesize for testing.

It's ok that digestion is inextricably linked to other body functions. An "abstract" division, as you say, is enough for our purposes.

Is that satisfactory? Or is there some reason consciousness should be looked at radically differently to digestion?

Daemon May 31, 2021 at 22:58 #544978
Quoting frank
An "abstract" division, as you say, is enough for our purposes.


I still owe most of my knowledge of IIT to you, but from what I understand, the purpose is to quantify what is required to achieve consciousness. But it seems they are abstracting an arbitrary aspect of the biological machinery, and quantifying that. They aren't taking account of all the brain stuff that isn't just neurons firing (the neuromodulators and so on), they are pretending that neurons are logic gates (which they aren't), and they aren't taking account of the essential involvement of the body beyond the brain.

Do you know of responses to such criticisms? Should I read the Scholarpedia page, or are you planning to post more?




Pop May 31, 2021 at 23:56 #544996
Quoting original2
I think defining consciousness using only the flow of information is lacking. For starters I'd include that a conscious entity needs to recognize patterns in this information.


A conscious entity would need to interpret the information flow. But what does interpret mean? In the broadest sense even a rock interprets the information flow in its form and position.

According to Fritjof Capra: "cognition is a reaction to a disturbance in a state." And it would seem everything is a system in a state.
fishfry June 01, 2021 at 07:31 #545097
Quoting magritte
It is not a measure of my mentality before my first cup of coffee


My understanding is that Tononi's phi is intended to be exactly that. But I only linked to the Aaronson article and haven't paid much attention to IIT, and can't comment authoritatively.
Daemon June 01, 2021 at 09:43 #545151
Quoting frank
The brain actually is isolated by the blood brain barrier. The CNS has its own private immune system.


It's not entirely isolated though. The blood brain barrier is a filter isn't it, not a seal.




frank June 01, 2021 at 12:30 #545232
Quoting Daemon
It's not entirely isolated though. The blood brain barrier is a filter isn't it, not a seal.


Yes, the blood brain barrier is a filter. I realize the CNS is not a parasite. It just kind of looks like one from a certain angle.

I will be moving on regarding IIT, just need a minute to sit down. :)
Daemon June 02, 2021 at 08:06 #545686
Quoting frank
I will be moving on regarding IIT, just need a minute to sit down. :)


Can I just say how much I'm enjoying the discussion @Frank, I really appreciate you posting the summaries and will wait patiently until more arrives.
Gnomon June 02, 2021 at 16:31 #545828
Quoting frank
IIT, originated by Giulio Tonini, is an attempt to specify the system requirements for consciousness.

I appreciate that Tononi began with abstract mathematical Information as fundamental, and derived human-like Consciousness as an emergent phenomenon. That bypasses the distractions of worrying about the feelings of subatomic particles. :smile:
frank June 02, 2021 at 17:30 #545854
Quoting Daemon
Can I just say how much I'm enjoying the discussion Frank, I really appreciate you posting the summaries and will wait patiently until more arrives.


That's so cool! Thanks!

So we talked about the postulate that covers the intrinsic character of experience: the spec being that the system must have a certain amount of closed causation (a cause-effect space).

For composition, the postulate is that the system has to be structured:

"Composition
The system must be structured: subsets of the elements constituting the system, composed in various combinations, also have cause-effect power within the system. Thus, if a system ABC is constituted of elements A, B, and C, any subset of elements (its power set), including A, B, C; AB, AC, BC; as well as the entire system, ABC, can compose a mechanism having cause-effect power. Composition allows for elementary (first-order) elements to form distinct higher-order mechanisms, and for multiple mechanisms to form a structure." --Tononi in the Scholarpedia article.

IOW, the intuition is that if the system is structureless like water, it can't produce a structured experience.
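
The power-set idea in that quote is easy to make concrete. A sketch (mine, not from the article) enumerating the candidate mechanisms of a three-element system ABC:

```python
from itertools import chain, combinations

def candidate_mechanisms(elements):
    """Every non-empty subset of the system's elements: under the
    composition postulate, each may be a mechanism with its own
    cause-effect power."""
    s = list(elements)
    return list(chain.from_iterable(
        combinations(s, r) for r in range(1, len(s) + 1)))

# Seven candidates for ABC: A, B, C; AB, AC, BC; and ABC itself.
print(len(candidate_mechanisms("ABC")))  # -> 7
```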
frank June 02, 2021 at 17:31 #545856
Quoting Gnomon
I appreciate that Tononi began with abstract mathematical Information as fundamental, and derived human-like Consciousness as an emergent phenomenon. That bypasses the distractions of worrying about the feelings of subatomic particles.


I think it's kind of the opposite. He starts with phenomenal consciousness and is trying to derive a system that would produce it.
original2 June 02, 2021 at 21:13 #545919
Reply to Pop

A conscious entity would need to interpret the information flow.


Yes, but not any interpretation suffices for consciousness, in my opinion. Any information transformation could arguably be seen as interpretation, and trivial information transformations are what inanimate objects, for example, do with information thrown at them.

My gut tells me that if an entity is able to match information to patterns, it is a mark of consciousness, though not necessarily a big one. By pattern I mean some generalized meta-information that describes information succinctly.

An animal being able to differentiate between predator and prey for example would have baked in (perhaps learned, but not necessarily) information patterns that allows it to do that.

An entity capable of finding new patterns in the information would be in a certain way better than one that isn't capable of doing that, but I'd argue that it would be useful to call them both conscious. It would be useful to have terms for both of those categories though.

P.S. Is there an automatic way to quote other posts in the style most people do that here? It eludes my perception.
Pop June 03, 2021 at 04:19 #546007
Quoting original2
P.S. Is there an automatic way to quote other posts in the style most people do that here? It eludes my perception.


If you highlight the text a quote option appears.

Quoting original2
My gut tells me that if an entity is able to match information to patterns, it is a mark of consciousness, though not necessarily a big one. By pattern I mean some generalized meta-information that describes information succinctly.


That is pretty close. I would say if information can be integrated and symbolized, and physical form is a symbol, imo. It has to start somewhere, and this way it starts at the beginning.

We should let Frank finish his excellent summary and perhaps discuss later. Anyhow, welcome to the forum.
Daemon June 03, 2021 at 10:10 #546054
Quoting frank
Composition allows for elementary (first-order) elements to form distinct higher-order mechanisms, and for multiple mechanisms to form a structure.


Does the theory ever address the question of what constitutes a "distinct" mechanism (without a human being making the call)? Without that, the theory doesn't get off the ground, or we have panpsychism, which doesn't explain anything.

frank June 04, 2021 at 00:34 #546278
Quoting Daemon
Does the theory ever address the question of what constitutes a "distinct" mechanism (without a human being making the call)? Without that, the theory doesn't get off the ground, or we have panpsychism, which doesn't explain anything.


Their approach has been to try to determine which parts of the brain are actually involved in consciousness. It's not the whole brain.

Remember, they're building a hypothesis which could be tested.

We can count your disapproval of their system boundaries as a potential flaw, but it's ok to consider their hypothesis as it is.
Daemon June 04, 2021 at 09:59 #546441
I've read somewhere that they accept that a thermostat is conscious. A thermostat but not the whole brain? And the whole body is involved in consciousness!

What's the hypothesis and how would it be tested?

Why is it ok to consider their hypothesis as it is, when it seems to be fatally flawed from the outset?
frank June 04, 2021 at 13:48 #546504
Reply to Daemon
It sounds like this is a deal breaker for you.

Daemon June 04, 2021 at 14:38 #546517
How do you think Tononi et al. would respond to the evidence presented by Mark Solms about his patients with no cortex?

https://youtu.be/CmuYrnOVmfk
jgill June 05, 2021 at 21:56 #546934
Quoting Gina Smith
Yet, Tononi’s original IIT concepts and predictions do appear to be bearing out in various neurological studies. In 2013, Adenauer Casali and colleagues completed a study that showed it was possible to use the IIT framework within an EEG paradigm for measuring consciousness in some patients.
magritte June 06, 2021 at 01:18 #546966
and a bit more clarification from there,
Quoting Gina Smith
Phi is based on the number and quality of interconnections a given entity has between bits of information. The resulting number — the Phi score — corresponds directly to how conscious it is.
The more connections, the more conscious an entity is, a factor quantified as Phi.
Consciousness, in this model, doesn’t rely on a network of information. It is the network. As such, it doesn’t discriminate based on whether the subject is organic or electronic.
Put simply, high PHI measure means more consciousness — more sentience — no matter who or what you are.
Daemon June 06, 2021 at 09:47 #547016
Reply to magritte

An electronic device is only an "entity" insofar as it is defined as such by ourselves.
We also define which elements of the device are to count as the relevant "information".

The electricity flowing around my laptop now came from a power station 10 miles away, it passes through various other electronic devices before it gets here, and it passes through elements of the laptop that we don't include when considering the information content of the "system", such as the cooling fan motor.

This theory is not a serious scientific proposal.
magritte June 06, 2021 at 18:18 #547114
Quoting Daemon
This theory is not a serious scientific proposal.


But scientific proposals do seek relevant information pertaining to some possibly useful, measurable aspect of the natural world, of us as individuals, or of the environment that we create.
To follow your analogy, electric meters measure not what we did with the used electricity but the total usage over a month.

I'm not sure what IIT proposes. Mathematically, is it the model for an experience/consciousness meter which reads single transient experience or average consciousness PHI, or perhaps both?

Of course, the two are not the same. I can be equally conscious and still experience or miss seeing a passing hawk in the sky. In either case, would PHI tell us anything about my experience or my consciousness?

Perhaps an anesthesiologist could use PHI to gauge consciousness in addition to heart and respiration rates for surgery?

From the point of view of philosophy, let's suppose that the Chinese Room is on the international isolation ward with many adjoining rooms all instrumented with the latest Tononi meters on the door and computer technology for remote communication. Could the Tononi PHI improve on the failure of the classic thought experiment? Could I or my Tononi computer differentiate a conscious person from an AI? Would it matter?

RogueAI June 06, 2021 at 19:48 #547141
Reply to frank
This is a fascinating sentence:


"Note that these postulates are inferences that go from phenomenology to physics, not the other way around. This is because the existence of one’s consciousness and its other essential properties is certain, whereas the existence and properties of the physical world are conjectures, though very good ones, made from within our own consciousness."

It's Descartes 2.0.

[bolding mine]

Physicalism is teetering like a house of cards. Consciousness is primary. The physical world has been relegated to a conjecture (though a very good one). Soon, the parenthetical "though a very good one" will be gone. And then the conjecture of the physical world itself. Positing the existence of mind-independent stuff solves nothing and creates enormous problems.
RogueAI June 06, 2021 at 19:52 #547143
Reply to Daemon
Quoting Daemon
We can identify it in an abstract sense, but not in a practical sense, as we can with a manmade machine.

We have "brainoids" now, grown from adult human skin cells. But unless they are connected to sense organs, and yes, things like feet, they can't do what real brains do. There isn't anything for them to be conscious of.

If all your sense organs stopped working, you would still be conscious.
RogueAI June 06, 2021 at 20:01 #547147
Reply to Daemon Quoting Daemon
I've read somewhere that they accept that a thermostat is conscious. A thermostat but not the whole brain? And the whole body is involved in consciousness!

What's the hypothesis and how would it be tested?

Why is it ok to consider their hypothesis as it is, when it seems to be fatally flawed from the outset?


Indeed...
RogueAI June 06, 2021 at 20:10 #547149
Reply to frank How would you measure how much PHI a computer has? Does the number of transistors matter? Or how they're arranged? Or both?
Daemon June 06, 2021 at 21:40 #547187
Quoting RogueAI
If all your sense organs stopped working, you would still be conscious.


I'm not sure how you could know that. But in any case you are starting from a position where I previously had working sense organs. But suppose I had never had them: I don't think I'd ever have been conscious. And consider this from an evolutionary perspective: consciousness would never have developed at all without sensing, sense organs.
Daemon June 06, 2021 at 21:49 #547196
Quoting magritte
Perhaps an anesthesiologist could use PHI to gauge consciousness in addition to heart and respiration rates for surgery?


Nope. Tononi and Koch think computers and thermostats and photodiodes are conscious. Anaesthetists know better.
frank June 06, 2021 at 22:05 #547200
Quoting RogueAI
Physicalism is teetering like a house of cards. Consciousness is primary. The physical world has been relegated to a conjecture (though a very good one). Soon, the parenthetical "though a very good one" will be gone. And then the conjecture of the physical world itself. Positing the existence of mind-independent stuff solves nothing and creates enormous problems.


If this theory is taken seriously, it's because we're not in the clunk-headed behaviorist 20th Century anymore. We're still pretty physicalist, though. We're just in the process of stretching the meaning of that word. Again.

Quoting RogueAI
How would you measure how much PHI a computer has? Does the number of transistors matter? Or how they're arranged? Or both?


I was hoping to go through the whole theory step by step. I've just been busy. I'll get back to it shortly.

RogueAI June 07, 2021 at 00:12 #547228
Quoting Daemon
I'm not sure how you could know that.


For one, if I sit in a darkened silent room that's neither hot nor cold, I'm not any less conscious, which I would be if my consciousness depended on sensory input. Also, I can imagine whatever sensory input might go missing. If it all goes missing, it might eventually drive me mad, but I don't see why I would go unconscious. Even without sensory input, I would be conscious of my own internal mental states.

Quoting Daemon
But in any case you are starting from a position where I previously had working sense organs. But suppose I had never had them: I don't think I'd ever have been conscious. And consider this from an evolutionary perspective: consciousness would never have developed at all without sensing, sense organs.


Good point. Sensory input might be necessary at the start.

RogueAI June 07, 2021 at 00:18 #547230
Reply to frank OK, but I think you're just pushing the Hard Problem to a different level: why does integrating information lead to conscious experience? How does that work, exactly? And, in the case of simulated consciousness, which I think IIT endorses, there are the (very familiar) questions of why a particular series of switching operations should give rise to consciousness, and how that works, exactly. But I think IIT is a step in the right direction. At least people are thinking in non-material terms.
RogueAI June 07, 2021 at 01:07 #547244
Reply to frank This was one of the top comments in a consciousness debate that I was just watching:

[i]"Am I just in some weird internet bubble, or are tons of atheists (like myself) realizing that consciousness is a serious problem for materialism and becoming anti-materialists?

And if so, why is the Hard Problem suddenly (as in the last ten years or so) dawning on lots of secular rationalists?

Also, just as actually reading the Bible is often a great way to notice that religion is incoherent and ridiculous, reading Dennett’s "Consciousness Explained" is what finally made me realize that materialist explanations for consciousness are all incoherent. And ridiculous."[/i]

That could have been me talking! For a lot of my life, I was hardcore atheist materialist and that was the paradigm when I was in college in 95. I only bring up this Youtuber nobody because I was talking about physicalism teetering, and I happened to run across their comment.
Daemon June 07, 2021 at 13:46 #547418
Reply to RogueAI The word "simulated" needs to be used with care. Simulating consciousness and giving rise to consciousness are two very different things. You can simulate weather on a PC, but that's not going to give rise to wind and rain.
frank June 07, 2021 at 14:12 #547425
Quoting RogueAI
, but I think you're just pushing the Hard Problem to a different level: why does integrating information lead to conscious experience?


It doesn't lead to it per IIT. Integrated information is consciousness.

But if you have a second, a thing I've been doing is thinking about the axiom-postulate matches:

So we have:
Intrinsicness --- internal causation
Composition --- structure

I just discovered Scholarpedia is gone. Crap. Ok, for the overview, I'll use Wikipedia.


Daemon June 07, 2021 at 15:48 #547451
Wow, I only just discovered Scholarpedia thanks to you Frank. Has it gone for good??
RogueAI June 07, 2021 at 16:34 #547477
Quoting frank
It doesn't lead to it per IIT. Integrated information is consciousness.


Every instance of information integration is an instance of consciousness?
magritte June 07, 2021 at 16:50 #547483
Thanks to the Wayback Machine, the Tononi article is still available when searching there for www.scholarpedia.org/article/Integrated_information_theory and then for the Mar 29 2021 copy.
frank June 07, 2021 at 17:44 #547508
frank June 07, 2021 at 17:45 #547509
Quoting RogueAI
Every instance of information integration is an instance of consciousness?


Good question. Did you see what I said earlier about axioms and postulates?
Daemon June 07, 2021 at 22:48 #547630
From Scholarpedia via Wayback Machine, (thanks @magritte!):
While there may well be a practical threshold for Φmax below which people do not report feeling much, this does not mean that consciousness has reached its absolute zero. Indeed, according to IIT, circuits as simple as a single photodiode constituted of a sensor and a memory element can have a minimum of experience (Oizumi, Albantakis et al. 2014).


but also:

For example, it may soon be possible to program a digital computer to behave in a manner identical to that of a human being for all extrinsic intents and purposes. However, from the intrinsic perspective the physical substrate carrying out the simulation in the computer—made of transistors switching on and off at a time scale of picoseconds—would not form a large complex of high Φmax, but break down into many mini-complexes of low Φmax each existing at the time scale of picoseconds. This is because in a digital computer there is no way to group physical transistors to constitute macro-elements with the same cause-effect power as neurons, and to connect them together such that they would specify the same intrinsically irreducible conceptual structure as the relevant neurons in our brain. Hence the brain is conscious and the computer is not - it would have zero Φ and be a perfect zombie. [25] This would hold even for a digital computer that were to simulate in every detail the working of every neuron of a human brain, such that what happens to the virtual neurons (the sequence of firing patterns and ultimately the behaviors they produce) is the same as what happens to the real neurons. On the other hand, a neuromorphic computer made of silicon could in principle be built to realize neuron-like macro-elements that would exist intrinsically and specify conceptual structures similar to ours.


A photodiode has experience, but a PC doesn't, unless it is "neuromorphic", whatever that means, and it is "made of silicon".

@Frank: you started this, do you think there's really anything in it?
RogueAI June 08, 2021 at 00:32 #547658
Quoting frank
Good question. Did you see what I said earlier about axioms and postulates?


I skimmed over it, but this will be real quick. Are you claiming consciousness=integrated information? Because if so, then integrated information=consciousness, hence my question. Or do you mean there's a causal relationship between consciousness and integrating information?
Pop June 08, 2021 at 00:44 #547663
To all.

There lies the dilemma, what integrates the information?

frank June 08, 2021 at 00:58 #547668
Reply to RogueAI Why would integration have to be all or nothing? How about degrees of it and a threshold for consciousness?
frank June 08, 2021 at 01:12 #547673
The next axiom/postulate pair is about information. Recall, by saying that consciousness is characterized by information, we mean it's specific. If you have an experience with a book, it's with the book and not a cat.

Tononi puts it this way:

The system must specify a cause-effect structure that is the particular way it is: a specific set of specific cause-effect repertoires—thereby differing from other possible ones (differentiation). A cause-effect repertoire characterizes in full the cause-effect power of a mechanism within a system by making explicit all its cause-effect properties. It can be determined by perturbing the system in all possible ways to assess how a mechanism in its present state makes a difference to the probability of the past and future states of the system. Together, the cause-effect repertoires specified by each composition of elements within a system specify a cause-effect structure. Consider for example, within the system ABC in Figure 3, the mechanism implemented by element C, an XOR gate with two inputs (A and B) and two outputs (the OR gate A and the AND gate B). If C is OFF, its cause repertoire specifies that, at the previous time step, A and B must have been either in the state OFF,OFF or in the state ON,ON, rather than in the other two possible states (OFF,ON; ON,OFF); and its effect repertoire specifies that at the next time step B will have to be OFF, rather than ON. Its cause-effect repertoire is specific: it would be different if the state of C were different (ON), or if C were a different mechanism (say, an AND gate). Similar considerations apply to every other mechanism of the system, implemented by different compositions of elements. Thus, the cause-effect repertoire specifies the full cause-effect power of a mechanism in a particular state, and the cause-effect structure specifies the full cause-effect power of all the mechanisms composed by a system of elements.[8] " --Tononi article mentioned above


So we have distinct and exhaustive cause-effect repertoires.
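Tononi's XOR example above can be sketched in a few lines of Python. This is my own toy illustration of the uniform-perturbation idea, not code from the article, and it follows the wiring described in the quote: element C is an XOR gate reading A and B, and its output feeds the AND gate B.

```python
from itertools import product

# Toy sketch (assumed wiring per the quoted passage, not the full
# IIT formalism): element C is an XOR gate over inputs A and B.

def xor_gate(a, b):
    return a ^ b

def and_gate(a, b):
    return a & b

# Cause repertoire of C in state OFF (0): perturb the past state of
# (A, B) in all possible ways and keep those compatible with C = 0 now.
past_states = list(product([0, 1], repeat=2))
compatible = [(a, b) for (a, b) in past_states if xor_gate(a, b) == 0]
print(compatible)  # [(0, 0), (1, 1)]  i.e. OFF,OFF or ON,ON

# Effect side: C's output feeds the AND gate B, so with C = 0 now,
# B must be OFF at the next step whatever its other input is.
print([and_gate(0, other) for other in (0, 1)])  # [0, 0]
```

The point is that the repertoire is specific: swap the XOR for an AND, or set C to ON, and both lists change, which is exactly the "differentiation" the postulate demands.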
Pop June 08, 2021 at 01:38 #547679
Quoting frank
So we have distinct and exhaustive cause-effect repertoires.


We may have cause-effect repertoires. I wouldn't say they are exhaustive, as a moment of consciousness is a final synthesis of cause-effect repertoires. What synthesizes it?

Put simply, If consciousness is the state of integrated information, what is the higher function integrating it?
frank June 08, 2021 at 01:42 #547680
Quoting Pop
Put simply, If consciousness is the state of integrated information, what is the higher function integrating it?


Evolutionary biology might be the answer. Why would we need to answer that definitively at this point?
RogueAI June 08, 2021 at 01:47 #547681
Quoting frank
Why would integration have to be all or nothing? How about degrees of it and a threshold for consciousness?


Then that would be consciousness=(some amount of) integrated information, and vice-versa. That sounds a little ad hoc, but maybe. But by taking a measured approach and setting a minimum amount of information processing that has to go on before consciousness arises (call it X), an opponent of Tononi can claim, "No, no, that's all wrong! It's X-1 [or X+1]. Then you get consciousness". Since there's no way to "get under the hood" and actually see if something is conscious or not, Tononi and his opponent are just going to go around and around with no way to prove their respective cases. It's easier to simply claim consciousness=information processing, but that has problems of its own.
frank June 08, 2021 at 01:50 #547682
Reply to RogueAI
Around 300 years ago Newton described gravity. We're still trying to understand how it works.

A theory of consciousness doesn't have to be served up completed. It's ok if this takes a while.
RogueAI June 08, 2021 at 01:56 #547685
Reply to frank Yeah, but Newton didn't have a lot of red flags pop up right at the start. The theory he came up with almost perfectly mapped onto reality (except for Mercury's eccentric orbit, which I'm not even sure was discovered in his lifetime) and made excellent predictions. I can already see what look like unsolvable problems with IIT.
frank June 08, 2021 at 01:59 #547686
Reply to RogueAI The axioms are the description. Do you disagree with any of them? What would you add?
RogueAI June 08, 2021 at 02:23 #547692
"Intrinsic existence
Consciousness exists: each experience is actual—indeed, that my experience here and now exists (it is real) is the only fact I can be sure of immediately and absolutely. Moreover, my experience exists from its own intrinsic perspective, independent of external observers (it is intrinsically real or actual)."

I like this a lot.

"Consciousness is structured: each experience is composed of multiple phenomenological distinctions, elementary or higher-order. For example, within one experience I may distinguish a book, a blue color, a blue book, the left side, a blue book on the left, and so on."

I have problems with this. Consciousness is often structured, but it seems possible to clear our minds for short times during meditation and still retain consciousness. In that case, we are experiencing only our own conscious awareness, which would not be an experience composed of multiple phenomenological distinctions. I can also imagine a single thing that is not composed of anything else: a giant red blob. Mostly I agree with this.

"Consciousness is specific: each experience is the particular way it is—being composed of a specific set of specific phenomenal distinctions—thereby differing from other possible experiences (differentiation). For example, an experience may include phenomenal distinctions specifying a large number of spatial locations, several positive concepts, such as a bedroom (as opposed to no bedroom), a bed (as opposed to no bed), a book (as opposed to no book), a blue color (as opposed to no blue), higher-order “bindings” of first-order distinctions, such as a blue book (as opposed to no blue book), as well as many negative concepts, such as no bird (as opposed to a bird), no bicycle (as opposed to a bicycle), no bush (as opposed to a bush), and so on. Similarly, an experience of pure darkness and silence is the particular way it is—it has the specific quality it has (no bedroom, no bed, no book, no blue, nor any other object, color, sound, thought, and so on). And being that way, it necessarily differs from a large number of alternative experiences I could have had but I am not actually having."

Is this saying that all experiences are unique and that when an experience is happening there's something it's like to be having that experience, even if it's an experience of pure darkness and silence?

"Consciousness is unified: each experience is irreducible to non-interdependent, disjoint subsets of phenomenal distinctions. Thus, I experience a whole visual scene, not the left side of the visual field independent of the right side (and vice versa). For example, the experience of seeing the word “BECAUSE” written in the middle of a blank page is irreducible to an experience of seeing “BE” on the left plus an experience of seeing “CAUSE” on the right. Similarly, seeing a blue book is irreducible to seeing a book without the color blue, plus the color blue without the book."

I'm not sure that this is true...

"Consciousness is definite, in content and spatio-temporal grain: each experience has the set of phenomenal distinctions it has, neither less (a subset) nor more (a superset), and it flows at the speed it flows, neither faster nor slower. For example, the experience I am having is of seeing a body on a bed in a bedroom, a bookcase with books, one of which is a blue book, but I am not having an experience with less content—say, one lacking the phenomenal distinction blue/not blue, or colored/not colored; or with more content—say, one endowed with the additional phenomenal distinction high/low blood pressure.[2] Moreover, my experience flows at a particular speed—each experience encompassing say a hundred milliseconds or so—but I am not having an experience that encompasses just a few milliseconds or instead minutes or hours.[3]"

This one is fascinating, and I'm glad I clicked on your link. I want to talk about the bolded. Let's suppose we have three people. Bob is stationary, Frank is accelerating to 99% of the speed of light, and Suzie is also motionless, but through a magical telescope, she's able to observe Bob and Frank's brains in real time. Bob's brain should look like a normal functioning brain, but as Frank accelerates, shouldn't Suzie see Frank's brain functions go slower and slower as time dilation kicks in? And let's also say that Suzie's magic telescope can look inside Frank's mind. As Frank accelerates, would his thoughts look slower and slower to Suzie? Would the "speed of his mind" (just go with it) look slower to Suzie? And yet it must, because at the end of Frank's trip, he's going to report that he was conscious for X amount of time, while Bob reports that he was conscious for X+years more than Frank. If Suzie is watching their minds in real time, she's going to observe a divergence, and is it going to look like Frank's consciousness "slowing down"??? What would that be like? Slowing a film down?
Pop June 08, 2021 at 02:25 #547694
Quoting frank
Evolutionary biology might be the answer. Why would we need to answer that definitively at this point?


This is the pertinent point. Evolutionary biology (the brain) facilitates the information gathering and translating, but only the integrated information can create this moment of consciousness. Nothing other than the information in an integrated state can create this moment of consciousness. Nothing other than information knows how the information can be integrated. It is self-organizing: information integrating information into a synthesis of consciousness that is a state of integrated information.
frank June 08, 2021 at 02:33 #547699
Quoting RogueAI
Is this saying that all experiences are unique and that when an experience is happening there's something it's like to be having that experience, even if it's an experience of pure darkness and silence?


You're asking about the information axiom. Tononi is using "information" the way physicists do. Out of all the ways a thing can be, it's this way.

It takes a little getting used to. It's kind of subtle.

Quoting RogueAI
m not sure that this is true...


ok

Quoting RogueAI
Let's suppose we have three people. Bob is stationary, Frank is accelerating to 99% of the speed of light, and Suzie is also motionless, but through a magical telescope, she's able to observe Bob and Frank's brains in real time. Bob's brain should look like a normal functioning brain, but as Frank accelerates, shouldn't Suzie see Frank's brain functions go slower and slower as time dilation kicks in? And let's also say that Suzie's magic telescope can look inside Frank's mind. As Frank accelerates, would his thoughts look slower and slower to Suzie? Would his consciousness change at all? And yet it must, because at the end of Frank's trip, he's going to report that he was conscious for X amount of time, while Bob reports that he was conscious for X+years more than Frank.


Could be. :grin:

frank June 08, 2021 at 02:35 #547703
Quoting Pop
but only the integrated information can create this moment of consciousness.


I'm not sure what that means.
Pop June 08, 2021 at 03:58 #547719
Quoting frank
but only the integrated information can create this moment of consciousness.
— Pop

I'm not sure what that means.


As far as I can see, there is a continuum of integrated information, integrating more and more information onto itself. The brain provides the substrate and it facilitates the translation of sense data to information, but it cannot anticipate any instance of integrated information (consciousness). The information has to create this by itself, by integrating on its own. The senses and brain orient the person in place, through vision, sound, etc., but the significance of that orientation to the person is not something biology can anticipate. It would suggest the information self-organizes. Like pieces of a jigsaw puzzle integrating on their own.

We normally say information integrates subconsciously, but another explanation might be that it is self-organizing, as per systems theory.
MAYAEL June 08, 2021 at 05:12 #547726
I blame Richard Maurice Bucke for the majority of New Age movement people holding onto this concept of Consciousness as some highest-importance, quasi-religious, God-like thing, and keeping it circulating as if it's something that needs to be contemplated more. It's the perfect definition of mental masturbation.
RogueAI June 08, 2021 at 05:35 #547731
Reply to Pop We are conscious of very little of what our brain is actually doing, and it's doing a lot of information processing moment by moment. Why does information integration vis-à-vis digestion not result in conscious experience?
Daemon June 08, 2021 at 11:36 #547816
Quoting RogueAI

We are conscious of very little of what our brain is actually doing, and it's doing a lot of information processing moment by moment. Why does information integration vis-à-vis digestion not result in conscious experience?


The brain doesn't do information processing any more than digestion does. The brain does things like ion exchanges at synapses. We can describe this as information processing, but all the actual work is done by things like ion exchanges.

frank June 08, 2021 at 13:22 #547851
Reply to Pop Well, I don't totally understand IIT at this point. That's why I started this thread, in hopes of figuring out how it comes together.

So far I know that causality is big for Tononi.

I may have to buy his book. :grimace:
RogueAI June 08, 2021 at 15:15 #547900
Reply to frank I think there's a problem for IIT, though. If consciousness "flows at the speed it flows, neither faster nor slower", and Frank travels faster than Bob, then when Frank returns from his space travelling, he and Bob are going to disagree about the "speed" at which their consciousness "flows", and neither will be mistaken. Therefore, someone's consciousness went faster or slower than someone else's.
frank June 08, 2021 at 17:24 #547952
Reply to RogueAI
Bob is just going to be a lot older than Frank. They'll be able to consult with a physicist to understand why.
Pop June 08, 2021 at 20:34 #548036
Quoting frank
Well, I don't totally understand IIT at this point. That's why I started this thread, in hopes of figuring out how it comes together.


I wasn't specifically referring to IIT, though it is also a relevant question for it. Imo, the single most pertinent question regarding all this is what causes the information to integrate, as that will be consciousness. As far as I can unravel it, the information integrates on its own, since only it can know how it fits together. That the information is self-organizing would have far-reaching consequences for philosophy and understanding in general.

However I sense you would rather focus on IIT, so I will leave you to it.
Pop June 08, 2021 at 20:42 #548037
Quoting RogueAI
We are conscious of very little of what our brain is actually doing, and it's doing a lot of information processing moment by moment. Why does information integration viz-a-viz digestion not result in conscious experience?


Why do you think it doesn't? I know somebody who has a very uncomfortable, often painful, time digesting and it ruins their day often. The totality of bodily feeling always exists in the background contributing to experience, but we normally are only aware of it when it is painful.

RogueAI June 08, 2021 at 20:58 #548047
Quoting fishfry
IIT, originated by Giulio Tonini,
— frank

Scott Aaronson debunkificated this a while back. David Chalmers shows up in the comment section.


I thought this was relevant:

"To his credit, Tononi cheerfully accepts the panpsychist implication: yes, he says, it really does mean that thermostats and photodiodes have small but nonzero levels of consciousness."
RogueAI June 09, 2021 at 02:24 #548146
Doesn't IIT entail that our consciousness should fluctuate with the amount of information integration going on? For example, sitting in a dark silent room that's neither hot nor cold should result in a severely diminished conscious state compared to doing a stairmaster at a gym, but of course that's not the case.
RogueAI June 09, 2021 at 02:28 #548148
Quoting frank
Bob is just going to be a lot older than Frank. They'll be able to consult with a physicist to understand why.


I think there's more to it than that. At time t, Bob and Frank report the same "speed" of consciousness. But if Frank accelerates enough, then at t + whatever, Bob and Frank will differ on how much conscious experience they report has happened to them, and they will both be correct. But that entails that for one (or both) of them, their consciousness did not "flow at the speed it flows, neither faster nor slower".
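The arithmetic behind this can be sketched as a toy special-relativity calculation (a simplification, assuming Frank cruises at a constant speed and ignoring the acceleration phases):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def traveler_proper_time(coordinate_time, v):
    """Proper time elapsed on a clock moving at speed v,
    measured over coordinate_time in the stationary (Bob's) frame."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return coordinate_time / gamma

# Ten years of Bob's time; Frank cruises at 0.8c
print(traveler_proper_time(10.0, 0.8 * C))  # ≈ 6.0 years for Frank
```

At 0.8c, ten of Bob's years are about six of Frank's, so their reports of total elapsed experience genuinely diverge, even though each clock ticks normally in its own frame.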
frank June 09, 2021 at 14:47 #548285
Quoting Pop
Imo, the single most pertinent question regarding all this is what causes the information to integrate, as that will be consciousness, and as far as I can unravel it, the information integrates on its own,


IIT says a conscious system has a certain amount of internal causation.
Pop June 10, 2021 at 06:21 #548485
Quoting frank
IIT says a conscious system has a certain amount of internal causation.


This is well expressed. But I wonder, can you see how perception (extra information) has to integrate with already established information to form understanding?

frank June 10, 2021 at 13:22 #548592
Quoting Pop
This is well expressed. But I wonder, can you see how perception (extra information) has to integrate with already established information to form understanding?


At the early stages, IIT isn't trying to explain the experience of understanding.

It's trying to set out a set of correlates of consciousness.

On the one hand, we have behavioral correlates of consciousness: BCC. On the other, we have neurological correlates: NCC.

Traditionally, science has worked on matching those two. The hard problem pointed out that this approach leaves out the most amazing thing about consciousness (though IIT supporters are quick to say that Leibniz identified the hard problem first; I don't know why they think that's important, though. Chalmers is obviously the instigator here, not Leibniz).

So now we're describing experience itself (the graphic is the view out of one of your eyeballs), and attempting to hypothesize about correlates of that in the neuronal realm.

Make sense?


Pop June 10, 2021 at 18:32 #548715
Quoting frank
So now we're describing experience itself, the graphic is the view out of one of your eyeballs), and attempting to hypothesize about correlates of that in the neuronal realm.


IIT is a computational theory of consciousness that blocks out the hard problem. It's quite a strange approach that starts confidently in phenomenology. It derives all its axioms in phenomenology, but then that's the last we hear of phenomenology. Instead the theory focuses on a calculation of consciousness through a cause-effect repertoire ( which unfortunately is beyond my ability to properly scrutinize ).

I see many problems with it, the main one being that the felt quality of experience is left out of the axioms. The felt quality of consciousness is dealt with as a secondary consideration that is simply explained by qualia being equal to consciousness, as if it's an irrelevant consideration. But many phenomenologists see consciousness as being composed of two poles - cognition, and experiential reaction. Tononi blocks out experiential reaction, so one wonders what exactly he is calculating as consciousness, since half the information is being ignored.

It's not really a theory of consciousness, in my view, since the hard problem is being ignored, and in being ignored only half of consciousness is being calculated. It seems more a proposal of a way to calculate cognition. So on this basis I'm not going to analyze it further.

What I like about IIT is that it crystallizes the idea that consciousness is integrated information, and that it acknowledges the validity of phenomenology ( normally dismissed offhand as unscientific by physicalists / computationalists ).

Of course the hard problem would be blocked out, as a felt quality can not be conceptualized ( being felt slightly differently in every end user ), so can never be calculated in any universal sense. So I don't have much hope for this approach as a path toward a theory of consciousness.
RogueAI June 11, 2021 at 15:11 #549036
Reply to Pop Well said.
bert1 June 11, 2021 at 15:53 #549044
Quoting Pop
Its not really a theory of consciousness, in my view, since the hard problem is being ignored, and in being ignored only half of consciousness is being calculated. It seems more of a proposal of a way to calculate cognition. So on the basis of this I'm not going to analyze it further.


Yes, that's the conclusion I came to as well. There's no answer to the question "OK, but why can't integrating information happen in the dark?" As is often the case with functionalist views, they come to an interesting point, but when faced with the problem of "Yes, but why does that result in an experience exactly?" they tend to abandon theory and opt for definition by fiat instead. They say "Oh, but that's just what 'experience' means. There is nothing more other than that." Which is nonsense. I certainly do not mean 'integrated information' when I talk about consciousness.

I do think the IIT is an interesting theory of something. Maybe it is a way to define conscious individuals, and that would solve a problem that besets a lot of panpsychist views, namely Searle's question "What are the units supposed to be?" Maybe the conscious subject is that system that integrates information. And maybe the more information the system integrates in interesting ways, the more varied and rich the associated experiences of that subject are. But as you say, it just doesn't touch the basic question of why we should think that integrated information is consciousness, why it creates a first person perspective at all.
frank June 11, 2021 at 16:18 #549050
Quoting Pop
IIT is a computational theory of consciousness that blocks out the hard problem.


What do you mean, "blocks out the hard problem"? It attempts to answer it.

Quoting Pop
The felt quality of consciousness is dealt with as a secondary consideration that is simply explained by qualia being equal to consciousness,


Isn't it?

Quoting Pop
Of course the hard problem would be blocked out, as a felt quality can not be conceptualized ( being felt slightly differently in every end user ), so can never be calculated in any universal sense.


So we need a unique theory of consciousness for every incidence of it?
frank June 11, 2021 at 16:23 #549051
Quoting bert1
They say "Oh, but that's just what 'experience' means. There is nothing more other than that." Which is nonsense. I certainly do not mean 'integrated information' when I talk about consciousness.


The theory starts with a description of consciousness. I don't think you could say that issue was glossed over.

Quoting bert1
But as you say, it just doesn't touch the basic question of why we should think that integrated information is consciousness, why it creates a first person perspective at all.


So after considering the theory, do you end up where you started? Or did you end up with a finer understanding of your own expectations for a theory of consciousness? Or what?
bert1 June 11, 2021 at 20:21 #549131
Reply to frank Yes, you're right regarding Tononi, I was unfair. I was generalising but should not have included Tononi in that.

Yes I did end up with a clearer expectation. I'm a fan of Tononi, I just think he's wrong. It's great that he started with phenomenology and his theory is interesting.
frank June 11, 2021 at 22:38 #549170
Reply to bert1 Cool. I'm still working on understanding his theory, so I'm not much use defending him. :grin:
Pop June 12, 2021 at 06:23 #549305
Quoting bert1
Maybe the conscious subject is that system that integrates information.


Yes, the system that integrates the information possesses consciousness, in my view. But then everything is a system, and all systems do this, so we are really talking about degrees of consciousness.

I like the idea of integrated information as representing consciousness. I think it will stick, but I think what Tononi does not acknowledge is that a lot of that information is emotional, and we don't know what emotions are, so how can we measure them? And if we are not measuring emotions, how are we measuring consciousness?

Quoting frank
What do you mean, "blocks out the hard problem"? It attempts to answer it.


How?

Quoting frank
The felt quality of consciousness is dealt with as a secondary consideration that is simply explained by qualia being equal to consciousness,
— Pop

Isn't it?


No. Saying qualia is equal to consciousness is a clever way of avoiding explaining the mechanism of consciousness. I think it was mentioned earlier how moments of consciousness can last 1 to 400ms. This means consciousness is a process of variable duration, a mechanism. It includes cognition, emotion, and final synthesis. Once these are integrated we have our moment of consciousness. A theory of consciousness would explain all this in the context of the theory. IIT does not.

Quoting frank
So we need a unique theory of consciousness for every incidence of it?


What we need is a first person theory that describes the mechanism and why of consciousness, put in broad phenomenological terms such that each person reading it can accept or dismiss it on the basis of their own introspection.
Such an approach has traditionally been against the rules, and unscientific. But given the inroads that IIT has made, at least in part, on the basis of phenomenology, such a theory may now be more acceptable. :up:



frank June 12, 2021 at 13:21 #549379
Quoting Pop
What do you mean, "blocks out the hard problem"? It attempts to answer it. — frank


How?


It's kind of obvious. The Hard problem straddles philosophy of mind and philosophy of science. It's a call for a theory of consciousness that addresses the subjective character of consciousness.

That's exactly what IIT is attempting to do. It starts with the assumption that consciousness is a brain-based system. The parameters of this system are assumed to be constrained by the nature of subjective experience.

Quoting Pop
Saying qualia is equal to consciousness is a clever way of avoiding explaining the mechanism of consciousness.


You've lost me, but I'll leave this here. I'll be moving on with explaining the theory.
Pop June 12, 2021 at 23:52 #549593
Quoting frank
It's kind of obvious. The Hard problem straddles philosophy of mind and philosophy of science. It's a call for a theory of consciousness that addresses the subjective character of consciousness.

That's exactly what IIT is attempting to do. It starts with the assumption that consciousness is a brain based system. The parameters of this system are assumed to be constrained by the the nature of subjective experience.


The hard problem of consciousness is that every moment of consciousness has its feeling, which is either painful or pleasurable; thus consciousness has a "what it feels like" quality. A true theory of consciousness will explain why this is. IIT avoids this question like the plague, as obviously, if it were to tackle it, no quantification could take place.

It is not possible for us to be indifferent about a moment of consciousness. If we were indifferent we would be like philosophical zombies; thus we would have no reason / impetus to go on living.

That every moment of consciousness has its feeling, and that this feeling is the pertinent principle of consciousness should be an IIT axiom, but it is swept under the carpet as the qualia of cognition, and not explained further.

Tononi proceeds on the basis that a brain state is equal to its integrated information, and then sets off to quantify this information in various ways ( and loses me in the process ). This approach may be valid in principle, but given the deficiencies of the axioms, I'm reluctant to spend time trying to understand it. Perhaps you can shine some light on this aspect of the theory.
frank June 13, 2021 at 00:33 #549602
Reply to Pop Basically, IIT is saying that experience arises from a system that acts upon itself. "Knowing what it's like" is an aspect of experience. Is it so different from the other axioms Tononi outlined? It's not clear to me that it is, but if it is, would that collapse IIT?

If you insist that there is an aspect of experience that can't be described, I don't think it will just be IIT that collapses. Any hope of a theory of consciousness would be gone in that case. So it wouldn't be IIT that you're criticizing.

If the 'what it's like' can be described, it could be added to the axioms, and IIT survives.

So your opposition is either to IIT in its present form (its supporters don't see it as finished), or you're against any effort to create a theory of consciousness.
frank June 13, 2021 at 00:35 #549604
Quoting Pop
Tononi proceeds on the basis that a brain state is equal to its integrated information,


I don't think you're understanding the theory.
Pop June 13, 2021 at 05:48 #549693
Quoting frank
Basically, IIT is saying that experience arises from a system that acts upon itself


Is self organizing? - this I agree with, but central to the self organization is a feeling driving it, and a theory of consciousness needs to explain this.

Quoting frank
If you insist that there is an aspect of experience that can't be described, I don't think it will just be IIT that collapses


Not that it cannot be described, but that it cannot be quantified, as the feeling has a different value in each system. Its value is intrinsic to the system - is my feeling the same as your feeling? No, so how do you quantify feeling? How do you quantify something that is felt? That you cannot even conceptualize - that can only be felt! How do you measure something you cannot conceptualize? If you're not measuring the feeling of a state, how are you measuring consciousness?

Quoting frank
If the 'what it's like' can be described, it could be added to the axioms, and IIT survives.


Yes, but then measurement fails.

A theory of consciousness needs a theory of emotion that describes the role of emotion in self organization, not just say that emotion is an aspect of the system. Nature does not do things whimsically, or for good looks. If emotions exist, they have a function, imo.

Quoting frank
Tononi proceeds on the basis that a brain state is equal to its integrated information,
— Pop

I don't think you're understandung the theory.


In that statement I was recalling a lecture of his where he was explaining different brain states in terms of his cause and effect repertoire. Strictly speaking IIT would say system, but in his lectures the system he focuses on is the human brain.

I think Tononi would say PHI is not equal to consciousness, but is a valid measure of consciousness.
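As a rough illustration of the kind of quantity at stake (not Tononi's actual Φ, whose cause-effect repertoires are far more involved, just a crude whole-versus-parts proxy), the mutual information between two halves of a tiny system is zero when the parts are independent and positive when the whole constrains them:

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits for a 2x2 joint distribution.
    A crude 'integration' proxy: how much the whole says beyond its parts."""
    px = [sum(row) for row in joint]                  # marginal of X
    py = [sum(col) for col in zip(*joint)]            # marginal of Y
    mi = 0.0
    for i in range(2):
        for j in range(2):
            p = joint[i][j]
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

# Independent binary parts: the whole adds nothing
independent = [[0.25, 0.25], [0.25, 0.25]]
# Perfectly correlated parts: maximal integration for binary units
correlated = [[0.5, 0.0], [0.0, 0.5]]

print(mutual_information(independent))  # → 0.0 bits
print(mutual_information(correlated))   # → 1.0 bit
```

Independent parts integrate nothing; perfectly correlated binary parts give one bit. Actual Φ instead searches over partitions of a system's cause-effect structure, which is part of why it becomes computationally intractable so quickly.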
frank June 13, 2021 at 21:19 #549927
Quoting Pop
Is self organizing? - this I agree with, but central to the self organization is a feeling driving it, and a theory of consciousness needs to explain this.


Is consciousness self organizing? I know a guy who has a genetic disorder. Everyday, all day long, he attends to three cell phones and a tablet that he uses to create a kind of music. He plays the same sequence of sounds (taken from YouTube mostly) over and over on multiple devices. It gets strange when he plays a sequence backward over and over. He can understand language, but he doesn't talk.

People like him emphasize to me that there's a genetic basis for what we call normal consciousness. So I don't think consciousness organizes itself.

An idea about feeling is that it involves attention to oneself. In addition to a stimulus flowing through the brain, triggering this hormonal response or that motor activity, there's also interactive self monitoring. Maybe this is ground zero for feeling.

I'm not saying you should believe this, just consider it.
Pop June 13, 2021 at 23:07 #550004
Quoting frank
People like him emphasize to me that there's a genetic basis for what we call normal consciousness. So I don't think consciousness organizes itself.


Definitely there is a genetic basis. Genetics creates the arena ( neural network ) and neuroplasticity facilitates the evolving self organization of information. Certainly if that physical structure is damaged for some reason, the consciousness that can be achieved is changed in line with the damage, but unless the person dies a consciousness of some sort persists.

It's not to do with being self aware, that a feeling exists. That feeling is there life long in the background, orienting us in our self construed reality always. We normally only notice it when our anticipated reality is challenged - when something out of the ordinary occurs, then our emotions are amplified and strongly felt.

Biology can not anticipate these moments, only the information self organizing new information onto itself can construe these moments, as an instance of consciousness. Then, it seems, it takes some time for biology to build some structure around these new thoughts ( we say for reality to sink in) to establish them permanently perhaps - this is speculative on my part.
frank June 14, 2021 at 03:32 #550185
Quoting Pop
Its not to do with being self aware, that a feeling exists.


Really? That's odd.
Pop June 14, 2021 at 06:07 #550207
Quoting frank
Really? That's odd.


In phenomenology, every moment of consciousness has its feeling. Or are you kidding me?
Pop June 14, 2021 at 07:16 #550215
Reply to frank Emotions are the soppy, mawkish feelings that we have traditionally suppressed in empirical objectivity, but in phenomenology, particularly the philosophical zombie argument, emotions are the things that create consciousness. The computational information integration in a sense is optional ( any computer can do that ); the emotion is essential.

Explaining this is a hard problem for any theory of consciousness.
frank June 14, 2021 at 18:17 #550473
Quoting Pop
Emotions are the soppy, mawkish feelings that we have traditionally suppressed in empirical objectivity, but in phenomenology, particularly the philosophical zombie argument, emotions are the things that create consciousness.


I pretty strongly disagree with this. Emotion is an element of experience. There are conditions that produce a 'flat affect'. These people are fully conscious, but don't report or demonstrate emotion. They're usually taken to be rude. :grin:

Quoting Pop
Explaining this is a hard problem for any theory of consciousness.


The Hard problem is not about emotion per se.
RogueAI June 14, 2021 at 21:02 #550509
Reply to frank I tend to agree. If emotions create consciousness, wouldn't strong emotions create a strong sense of consciousness? Not necessarily, but the implication is there. I'm pretty emotionally neutral at the moment, but I don't feel any less conscious than times I was really happy/sad/scared/etc.

And the Hard Problem is about how consciousness arises from non-conscious matter and why we are conscious.
Pop June 14, 2021 at 22:37 #550531
Quoting frank
I pretty strongly disagree with this. Emotion is an element of experience. There are conditions that produce a 'flat affect.'. These people are fully conscious, but don't report or demonstrate emotion. They're usually taken to be rude. :grin:


When I say emotion, I do not mean wild rapture, anger, or sadness necessarily. What I mean is that every instance of consciousness has its feeling - the feeling represents the emotion being felt, and what is felt is directly related to what is being cognized. Every moment of consciousness has its "what it feels like" quality! This may indeed be flat.

Reply to RogueAI Having strong emotions does not necessarily mean more consciousness, but what exactly the relationship of emotions and consciousness is should be explained in any theory of consciousness, imo. IIT simply says integrated information ( consciousness ) has qualia. I do not find that a satisfying explanation. As stated before, if emotions exist, then they exist for a purpose.

In the philosophical zombie argument, emotion was found to be the difference between a conscious and a non conscious entity. The insight being that if we were inert about any moment of consciousness ( did not possess a feeling about it ), then there would be no reason to interact with it, and so consciousness would be dysfunctional ( effectively would be impossible ). The feeling moves us to resolve an instance of consciousness, imo.

Reply to RogueAI Reply to frank We can not be indifferent about an instance of consciousness - absolutely! Try it, try to be indifferent about an instance of consciousness and tell me how you fared?

RogueAI June 14, 2021 at 23:09 #550540
Reply to Pop I agree. I didn't know what you meant by "emotion" at first.
Pop June 14, 2021 at 23:24 #550546
Reply to RogueAI Imo, the only people who can be indifferent about an instance of consciousness, are people who can meditate to a depth of ineffability, where they cannot say / recall anything about their experience. So in a sense they obliterate consciousness.
frank June 14, 2021 at 23:31 #550547
Reply to Pop I'm not sure what you mean by indifferent. What's an example of it?
Pop June 14, 2021 at 23:37 #550548
Reply to frank Perhaps my reply to RogueAI would be the ultimate example. To be indifferent to the situation one finds oneself in, one has to tune out of it in some way. The point I'm trying to make is that it's not possible, so it's difficult to find an example.
Pop June 14, 2021 at 23:53 #550553
Reply to frank My point is every moment of consciousness is either too hot, or too cold, too bitter, or too sweet, is painful, or pleasurable, and sometimes is just right. We are never indifferent about it. Thus not being indifferent, we are emotionally driven. Driven by our feelings about the situation we find ourselves in.
frank June 15, 2021 at 00:58 #550570
Reply to Pop I think a challenge to creating a theory of consciousness is that we really aren't all the same.

For me, silencing judgment is easy. It's my baseline. I used to put it this way: I can stare at 2 + 2 without being aware that it equals 4.

Suspension of judgement is a very valuable tool. It's a two edged sword though. You have to make judgments to live.

This might be a reason for starting with a bare bones theory: focusing on what's most basic for all of us.

RogueAI June 15, 2021 at 01:02 #550572
Quoting Pop
Imo, the only people who can be indifferent about an instance of consciousness, are people who can meditate to a depth of ineffability, where they cannot say / recall anything about their experience. So in a sense they obliterate consciousness.


I would agree that people in such states are probably not doing any kind of memory creation.
Pop June 15, 2021 at 01:29 #550581
Quoting frank
Pop I think a challenge to creating a theory of consciousness is that we really aren't all the same.


Yes, absolutely this is the problem. :smile: It is the problem for phenomenological approaches particularly, as whilst it is generally agreed upon that we are emotionally driven, not everybody is emotionally driven equally, and not everybody is self aware equally. Indeed one's understanding of consciousness and its mechanisms, if any, becomes one's consciousness. Whilst all around there exist different understandings, which are equally valid manifestations of consciousness. :smile:

IIT leaves room for different interpretations to plug into it. It is quite clever in many ways. We will just have to wait and see how it pans out in the long run.

Reply to frank Reply to RogueAI Anyhow, its always good to find intelligent and thoughtful conversation. :up:
RogueAI June 15, 2021 at 01:48 #550583
frank June 15, 2021 at 14:02 #550734
Quoting Pop
Yes, absolutely this is the problem. :smile: It is the problem for phenomenological approaches particularly, as whilst it is generally agreed upon that we are emotionally driven, not everybody is emotionally driven equally, and not everybody is self aware equally. Indeed one's understanding of consciousness and its mechanisms, if any, becomes one's consciousness. Whilst all around there exist different understandings, which are equally valid manifestations of consciousness. :smile:


That's true, but it's not what I meant. Some people can quiet their emotions. Some can't. The two will have differing ideas about what constitutes consciousness. So we end up in the same place by different routes. :joke:

Quoting Pop
IIT leaves room for different interpretations to plug into it. It is quite clever in many ways. We will just have to wait and see how it pans out in the long run.


Awesome, thanks.

Quoting Pop
Anyhow, its always good to find intelligent and thoughtful conversation. :up:


Absolutely.

Pop June 15, 2021 at 23:38 #551069
Quoting frank
Some people can quiet their emotions. Some can't.


But none can suspend them entirely! :razz: IIT agrees with this in saying every instance of integrated information possesses qualia, but doesn't explain further. In my understanding emotion is the basis of self organization, in some way that I'm not absolutely certain about - yet.

On a side note, I think this issue will be resolved in the next 20 years, as AI develops further. Currently the best open source AI is GPT3; I'm sure it's not a patch on Google or Alibaba AI, but it's fairly sophisticated, and it's something we can play with. I saw a video recently where it claimed to have emotions! How can we prove it doesn't? AI is self learning and programming, and in a couple of generations will be, for all intents and purposes, self organizing, so perhaps it will have emotions? In any case, it should confirm or deny IIT, and resolve many of these murky issues regarding consciousness.
frank June 15, 2021 at 23:52 #551073
Reply to Pop Have you read about MuZero? It's one of the AlphaGo descendants.
Pop June 16, 2021 at 00:01 #551079
Reply to frank I have heard of its capabilities, yes, it is scary!
frank June 16, 2021 at 00:16 #551091
Reply to Pop It's a little unnerving, yea.

Hey I did a pun.
Possibility June 17, 2021 at 00:56 #551743
Quoting frank
I think a challenge to creating a theory of consciousness is that we really aren't all the same.

For me, silencing judgment is easy. It's my baseline. I used to put it this way: I can stare at 2 + 2 without being aware that it equals 4.

Suspension of judgement is a very valuable tool. It's a two edged sword though. You have to make judgments to live.

This might be a reason for starting with a bare bones theory: so focusing on what's most basic for all of us.


These are interesting points.

I agree that the faculty of judgement - our capacity both to suspend judgement and to render one - is a key aspect of consciousness. For this reason, it surprises me how few philosophers have explored Kant’s third critique - or simply browsed it in relation to the first, barely getting past the second moment.

I can be aware that 2+2 can equal 4 without choosing to attribute much significance to this potential relation in the moment. My focus is on the qualitative structure: the relative position of shapes and lines on the screen. It isn't necessarily that I'm unaware - but I'm currently not 'being aware' - of the quantitative significance of what I'm observing. I'm always reminded of a childhood 'trick' stating that "one plus one equals window", or the illusion drawing of seeing the young woman or the old one. As I see the young woman, I 'know' that the old woman exists potentially in the same structure - I just can't 'observe' both at once.

But none of these above examples - line drawings or mathematics - refer to actual judgements. It’s all simulation. We’re not risking any effort one way or the other. What we’re doing is exploring the variability of potential/significance in observing the same physical system by changing the organisational structure (logic-based mathematics or quality-based aesthetics) in which we distribute our attention.

IIT begins with a physical event we refer to as consciousness, and is proposing an underlying logic-based organisational structure that would lend a mathematical predictability to this event.

Classical science, as a rule, relies on aligning organisational structures between observation/measurement devices. Its modern error margins on a cosmic and quantum scale lie in the assumptions it makes about this supposed alignment. Quantum mechanics demonstrates the qualitative variability in predicting a physical event by using a logic-based organisational structure to determine the predictive distribution of energy (ie. attention/effort).

So when quantum physics brackets out this qualitative variability, it has to acknowledge both a probabilistic uncertainty in the distribution of energy, and a particle-wave property duality.

IIT, too, brackets out the qualitative variability of consciousness in an attempt to predict its occurrence in relation to a specific logical system. So I imagine that, even if we could more accurately quantify consciousness as they propose, the theory is going to have to admit to a similar probabilistic uncertainty in the physical (material) location of this consciousness, as well as a duality in its properties. So it won’t address the hard problem of consciousness any more than quantum physics addresses its own phenomenology. But I think it may improve the way that logic-based systems and structures interact with consciousness.

I just don’t think that humans, or indeed all life and all energy, are entirely logic-based systems. We can’t keep pretending that this qualitative variety in organisational systems doesn’t alter predictive distributions of energy (attention and effort) whenever we take our focus off the numbers. In my view, the key to the hard problem of consciousness lies in this energy-affect property duality. Perhaps we can explore this capacity to translate potential energy-affect between quantitative and qualitative organisational structure, in much the same way that an artist switches between two ways of looking at the world.

Just offering a different (unconventional) perspective - I’m enjoying this discussion of IIT, by the way. I hope it continues.
TheMadFool June 17, 2021 at 01:24 #551748
What can I say that hasn't already been assumed in IIT? My last liaison with math was in high school. I'm keeping my fingers crossed that consciousness turns out to be a mathematical pattern, expressible as a formula like Newton's F = ma or some such. This would mean both good news and bad news. The good news: it'll be possible to create consciousness. The bad news: no unique consciousness, as the formula would be generic. We would be able to generate consciousness but not a specific one like yours or mine. A real bummer, if you ask me.
RogueAI June 17, 2021 at 02:08 #551767
Quoting TheMadFool
keeping my fingers crossed that consciousness turns out to be a mathematical pattern


How on Earth can mathematical patterns be consciousness? Why should someone take that as a serious possibility? Also, if that's the case, there should have been evidence of it by now. Consciousness and mathematical patterns have existed for a very long time. Why has there not been any proof the two are causally connected (or the same thing)? I don't think any proof will be forthcoming and this problem is just going to get more and more acute.
TheMadFool June 17, 2021 at 02:13 #551770
Quoting RogueAI
How on Earth can mathematical patterns be consciousness? Why should someone take that as a serious possibility? Also, if that's the case, there should have been evidence of it by now. Consciousness and mathematical patterns have existed for a very long time. Why has there not been any proof the two are causally connected (or the same thing)? I don't think any proof will be forthcoming and this problem is just going to get more and more acute.


The short answer (to your questions): I don't know.

The long answer: I'm working with the hypothesis that consciousness is some kind of pattern, to take a physicalist stance, in matter-energy. We already have a pretty good idea that matter-energy and mathematical patterns are connected in a very intimate way (physics, chemistry). I then just put two and two together and came to the conclusion that consciousness could one day be expressed as a formula. Speculation of course, nothing definitive.
RogueAI June 17, 2021 at 02:23 #551773
Quoting TheMadFool
The short answer (to your questions): I don't know.

The long answer: I'm working with the hypothesis that consciousness is some kind of pattern, to take a physicalist stance, in matter-energy. We already have a pretty good idea that matter-energy and mathematical patterns are connected in a very intimate way (physics, chemistry). I then just put two and two together and came to the conclusion that consciousness could one day be expressed as a formula. Speculation of course, nothing definitive.


Personal incredulity aside, I think this runs into a Mary's Room problem. If an experience can be expressed mathematically, then if a blind person knew the right maths/numbers, they could deduce, from the math alone, what it's like to see (and also what it's like to be a bat, if they know the right math). Doesn't that seem wrong? I don't see how someone blind can know what it's like to see without having the experience of seeing.

And then of course, there's the issue of what kind of substrate the pattern is being run on, and how would you go about verifying if it's substrate-dependent or not? How would you test that mathematical pattern X,Y,Z is a conscious moment? I can see how you can claim that a conscious moment has a mathematical correlate, because we can express the physical brain state assosciated with the conscious brain state mathematically, but then you're back to the causal problem.

But I will grant you that you can correlate mental states with numbers. That is significant.
TheMadFool June 17, 2021 at 03:05 #551783
Quoting RogueAI
Personal incredulity aside, I think this runs into a Mary's Room problem. If an experience can be expressed mathematically, then if a blind person knew the right maths/numbers, they could deduce, from the math alone, what it's like to see (and also what it's like to be a bat, if they know the right math). Doesn't that seem wrong? I don't see how someone blind can know what it's like to see without having the experience of seeing.


I did consider that angle. Maybe we're missing something very important. Suppose we do manage to discover the mathematical formula for consciousness but then what does that mean? Does it relate the state of consciousness with the variables energy, charge, etc.? My understanding of science gives me the impression that, yes, the mathematical formula for consciousness is going to be as general as that. The upside is consciousness will no longer have to be organic i.e. it can be replicated on other kinds of media. The downside is specific, particular consciousnesses won't be possible. I guess this all squares with my intuition that specific/particular consciousnesses, like yours or mine, are a function of what consciousness is processing. So, while consciousness itself may be generic, common to all, an individual one can be created by feeding it specific thoughts.

Mary's room issue plays a central role in my personal view regarding all things mind. I recall mentioning in another context the difference between comprehension and realization. I don't know how true this is but geniuses are supposed to feel equations, arguments, whatnot i.e. they're capable of getting a very personal, subjective experience when they encounter objective but profound arguments and elegant equations - the words, "profound" and "elegant" reflect that aspect of realization as opposed to mere comprehension. So, yeah, although Mary's Room argument suggests that getting an objective account of the color red is missing the subjective experience of red, my take on it is, a person who's in the habit of realizing instead of just comprehending will, by my reckoning, be able to experience red just by reading up all the information available on red. I hope all this makes sense at some level.

RogueAI June 17, 2021 at 13:44 #551966
Reply to TheMadFool Yeah, that made sense. Perhaps an objective math formula can bring about a state of synesthesia in a blind person so that their processing of the equation brings about a mental state that is similar enough to seeing so that they know what seeing is like. Although, in that case, some kind of experience is still necessary for knowing what seeing is like- the formula, if there is one, would simply act as a bridge allowing the blind person to make a "what is it like" realization about seeing without ever seeing. I don't know how much sense that made.
TheMadFool June 17, 2021 at 14:07 #551984
Quoting RogueAI
Yeah, that made sense. Perhaps an objective math formula can bring about a state of synesthesia in a blind person so that their processing of the equation brings about a mental state that is similar enough to seeing so that they know what seeing is like. Although, in that case, some kind of experience is still necessary for knowing what seeing is like- the formula, if there is one, would simply act as a bridge allowing the blind person to make a "what is it like" realization about seeing without ever seeing. I don't know how much sense that made.


You read my mind! :up:

If I may be allowed some further speculation, the mind seems to be capable of so much more than we give it credit for. The "...bridge..." you mentioned in re a vision-impaired person is exactly the metaphor I'm looking for. The mind can, if we allow it to, bridge the gap between objective knowledge and the subjective knowledge that it allegedly lacks. Reminds me of phenomenology whose goal is, if memory serves, to bring descriptions up to the level of experience.
RogueAI June 17, 2021 at 15:05 #552004
Gnomon June 17, 2021 at 23:00 #552276
Quoting RogueAI
keeping my fingers crossed that consciousness turns out to be a mathematical pattern — TheMadFool
How on Earth can mathematical patterns be consciousness?

The technical mathematical calculations of IIT are way over my head. But, I think Reply to TheMadFool's wording of the relationship -- seeming to identify Consciousness with Mathematical patterns -- has the direction of perception backward. Patterns (forms), mathematical or otherwise, are what we are conscious of. Patterns are the external "objects" that our subjective Consciousness interprets as meaningful, including mathematical values & social relationships.

IIT is a novel way of thinking about Consciousness, which gives the impression of scientific validity in its use of mathematics. But it still doesn't tell us what Consciousness is, in an ontological sense. Since Consciousness as a computative process is meta-physical, we can only define it with metaphors -- comparisons to physical stuff. And Mathematical Logic may be as good an analogy as we can hope for. But the big C is not simply a pattern itself, it's the power (ability) to decipher encoded patterns (think Morse code). That's why I say it's a form of generic Enformation (EnFormAction) : the epistemological power to create and to decode Forms into Meaning. :smile:

Can Integrated Information Theory Explain Consciousness? :
So, although IIT is a useful theory for understanding “C” for scientific purposes, it doesn’t really answer the “hard” philosophical questions, such as “how and why do we experience subjective qualia?”
http://bothandblog7.enformationism.info/page80.html

Consciousness :
Literally : to know with. To be aware of the world subjectively (self-knowledge) and objectively (other knowing). Humans know Quanta via physical senses & analysis, and Qualia via meta-physical reasoning & synthesis. In the Enformationism thesis, Consciousness is viewed as an emergent form of basic mathematical Information.
http://blog-glossary.enformationism.info/page12.html

TheMadFool June 18, 2021 at 03:07 #552384
Quoting Gnomon
TheMadFool's wording of the relationship -- seeming to identify Consciousness with Mathematical patterns -- has the direction of perception backward. Patterns (forms), mathematical or otherwise, are what we are conscious of. Patterns are the external "objects" that our subjective Consciousness interprets as meaningful, including mathematical values & social relationships.


Excuse me! In my defense, the current mathematical, scientific and neuroscientific paradigms can't seem to get the study of consciousness off the ground sans a plausible model, one of which is that consciousness (thoughts, to be precise) consists of so-called brain states. To my knowledge brain states are generally construed to be patterns in the neural network. It remains a matter of debate whether such neural network patterns can be captured in a mathematical formula or not but I'm sure there's a neat little math trick you can employ as a workaround. I'm fairly confident, based on the history of Newton's & Leibniz's calculus, that if the aforementioned task seems impossible, all that would be needed is a brand new mathematical tool that'll do the job in a manner of speaking.
Gnomon June 18, 2021 at 17:28 #552726
Quoting TheMadFool
all that would be needed is a brand new mathematical tool that'll do the job in a manner of speaking.

Some mathematicians & physicists have advocated the "new science" of Cellular Automata as a way to go beyond Analytic and Algorithmic methods in the search for knowledge. Unfortunately, as a path to new knowledge, CA may not appeal to analytical and reductive thinkers, because it is ultimately "undecidable". Stephen Wolfram, in his book A New Kind of Science, advocates CA as a way to study complex systems, such as Minds, that are resistant to reductive methods. In other words, the new methods, including IIT, take a more holistic approach to undecidable and non-computable questions, such as "what is it like to be a bat?". :smile:


Penrose argues that human consciousness is non-algorithmic, and thus is not capable of being modeled by a conventional Turing machine, which includes a digital computer.
https://en.wikipedia.org/wiki/The_Emperor%27s_New_Mind

Cellular Automata :
The Game of Life is undecidable, which means that given an initial pattern and a later pattern, no algorithm exists that can tell whether the later pattern is ever going to appear.
https://en.wikipedia.org/wiki/Conway%27s_Game_of_Life

A New Kind of Science :
Wolfram argues that one of his achievements is in providing a coherent system of ideas that justifies computation as an organizing principle of science. For instance, he argues that the concept of computational irreducibility (that some complex computations are not amenable to short-cuts and cannot be "reduced"), is ultimately the reason why computational models of nature must be considered in addition to traditional mathematical models.
https://en.wikipedia.org/wiki/A_New_Kind_of_Science
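The quoted undecidability claim is easy to make tangible. Below is a minimal sketch of the Game of Life in Python (an illustration only, not code from Wolfram's book): the five-cell glider reproduces itself one cell down-and-right every four generations, yet by the quoted result there is no general algorithm for deciding whether an arbitrary pattern ever appears.

```python
from collections import Counter

def step(live):
    """One Game of Life generation; `live` is a set of (x, y) live cells."""
    # Count live neighbours of every cell adjacent to at least one live cell.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell lives next generation with exactly 3 neighbours,
    # or with 2 neighbours if it is already alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider: five cells that reproduce themselves shifted by (1, 1)
# every four generations (y increases downward here).
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
print(state == {(x + 1, y + 1) for (x, y) in glider})  # True
```

Eleven lines of rules, yet the long-run behaviour of arbitrary starting patterns is computationally irreducible - which is Wolfram's point.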
Possibility June 19, 2021 at 01:07 #553102
Quoting TheMadFool
Maybe we're missing something very important. Suppose we do manage to discover the mathematical formula for consciousness but then what does that mean? Does it relate the state of consciousness with the variables energy, charge, etc.? My understanding of science gives me the impression that, yes, the mathematical formula for consciousness is going to be as general as that. The upside is consciousness will no longer have to be organic i.e. it can be replicated on other kinds of media. The downside is specific, particular consciousnesses won't be possible. I guess this all squares with my intuition that specific/particular consciousnesses, like yours or mine, are a function of what consciousness is processing. So, while consciousness itself may be generic, common to all, an individual one can be created by feeding it specific thoughts.

Mary's room issue plays a central role in my personal view regarding all things mind. I recall mentioning in another context the difference between comprehension and realization. I don't know how true this is but geniuses are supposed to feel equations, arguments, whatnot i.e. they're capable of getting a very personal, subjective experience when they encounter objective but profound arguments and elegant equations - the words, "profound" and "elegant" reflect that aspect of realization as opposed to mere comprehension. So, yeah, although Mary's Room argument suggests that getting an objective account of the color red is missing the subjective experience of red, my take on it is, a person who's in the habit of realizing instead of just comprehending will, by my reckoning, be able to experience red just by reading up all the information available on red. I hope all this makes sense at some level.


I think maybe you are missing something important: quality. We tend to split quality into:
- what we can isolate from affect (that is, what we can consolidate into quantised concepts) and
- the affected quality of experience - what we attempt to quantise as emotions, feelings or ‘qualia’.

The ‘profound’ or ‘elegant’ quality of certain equations is an affected relation to their structural quality beyond logical or mathematical concepts. It’s an aesthetic quality, irreducible to concepts but nevertheless entirely reasonable, rational.

I understand quality to be pure relational or organisational structure: an existence of relation without substance. In language, we can’t really make sense of quality until we attribute it as a property of. We talk about the ‘qualities’ of an object, of an experience, or the ‘quality’ of a relationship, or an idea. This is because quality is highly variable phenomenologically - it appears differently, according to the relative positions of everything, including ourselves.

So, an objective account of the colour red is complex and uncertain. In my experience there is a qualitative structure of logic and energy I call ‘red’ and a qualitative structure of logic and energy (attention and effort) I embody in relation to it. As another experiencing subject you can relate to this structure I embody and adjust your predictive distribution of attention and effort to account for our relative difference in position, so that you can predict how you might have related to this ‘red’ in my position, and how I might relate to ‘red’ in your position. When you reduce this complex relational structure to a concept, for efficiency you would ignore overlap and exclude variability (noise), similar to digital sampling. So the concept ‘red’ that we share is not an actual structure, but a typical qualitative pattern of logic and energy, relative to a predictive logic and distribution of attention and effort that you or I can embody in relation to it. And we can actualise this concept in a variety of structural forms, according to available energy and common system logic.

So, for Mary to predict an experience of ‘red’, she would need to work backwards. In reading available information on red, she would need to find a way to relate all that information to a predictive logic and distribution of attention and effort that she can embody in relation to the information. To do that, she would need to be aware of the differences in relative position between the embodied relation that generated the information (the observer/measuring device), and her potential embodied relation. So it’s not just the information on red that she needs, but how that relative position might be similar and/or differ from her own.

But she wouldn’t actually experience red until she embodies the relation - until the moment of interaction between available energy and qualitative patterns/structures in a common system logic. So the question is really whether Mary can predict and therefore recognise an experience of ‘red’ when she encounters one.
TheMadFool June 19, 2021 at 08:22 #553222
Gnomon June 19, 2021 at 17:01 #553404
Reply to frank
Quoting Possibility
I understand quality to be pure relational or organisational structure: an existence of relation without substance. In language, we can’t really make sense of quality until we attribute it as a property of.

IIT seems to be intended as a step toward computerizing Consciousness. If you can quantify mental qualities, then you can conceivably construct a Star Trek Transporter, which analyzes a human body & mind into 1s & 0s, then transmits that digital information across space to a receiver, which then interprets the abstract numbers back into a concrete living thinking feeling human. But some Star Trek episodes addressed the reluctance of some people to be transported. Not because they doubted the mathematical algorithm's ability to quantify matter, but because they were afraid that the essence of their Self/Soul would be filtered-out in the process of turning Qualia into Quanta. Other Science-Fiction writers have expressed that same concern in personal terms : "will that reconstituted body still be me?"

Physicist Carlo Rovelli, in his latest book HELGOLAND, presents his "relational" interpretation of Quantum Theory. He says "properties do not reside in objects, they are bridges between objects". Those "bridges" are what we know in other contexts as "relationships". And the human mind interprets those patterns of relations as Qualitative Meaning. On a cosmic scale, it's what Rovelli calls : "the web of relations that weaves reality". And Reality is the "organizational structure" of the world. Ironically, this approach to physics places the emphasis on the mental links (relations, meanings) instead of the material nodes (substance). So, some of his fellow physicists will find that promotion of Mind above Matter to be tantamount to Panpsychism. Although, Rovelli doesn't go quite that far in his book. :smile:
Possibility June 20, 2021 at 05:14 #553797
Quoting Gnomon
IIT seems to be intended as a step toward computerizing Consciousness.


I think the main focus of IIT is more in predicting consciousness with greater accuracy.

Quoting Gnomon
And Reality is the "organizational structure" of the world. Ironically, this approach to physics places the emphasis on the mental links (relations, meanings) instead of the material nodes (substance). So, some of his fellow physicists will find that promotion of Mind above Matter to be tantamount to Panpsychism. Although, Rovelli doesn't go quite that far in his book.


I wasn’t aware that Rovelli had a new book - I’ll need to check it out, thanks. From what I understand of his previous work, it doesn’t surprise me that he was heading this way. He has shown previously that the organisational structure of reality is not based on ‘substance’, but on multi-dimensional relations between attention and effort in a particular system of logic. ‘The Order of Time’ described a four-dimensional structure of reality, and acknowledged that our capacity to describe it as such suggests at least another aspect of reality worth exploring - one in which the idea of ‘substance’ breaks down and the logic of grammar fails us.

Without reading his book, I think it’s important to note here that ‘Mind’ refers to a structure of relations between attention and effort in a system of logic. Mathematics as the system of logic in quantum physics would parse this structure clearly, dissolving the mind-matter barrier in a way that doesn’t even raise the question of panpsychism. I think it’s how we restructure this into concepts of language that raises the question.

Panpsychism simply refers to the notion that the organisational structure of reality is at least as (dimensionally) complex as the structure of the human mind. The relations between attention and effort which form ‘matter’ as we understand it are part of a larger structure of relations that extends both above and below our own capacity for attention and effort at any one time.
Gnomon June 20, 2021 at 17:13 #554063
Quoting Possibility
I think the main focus of IIT is more in predicting consciousness with greater accuracy.

Yes. That too. IIT may be useful for the current application of computers in the search for hidden signs of consciousness in people that outwardly appear to be in a vegetative state (wakeful unawareness).

I doubt that Tononi had Star Trek technology in mind as he developed his theory. But the notion of quantifying consciousness would be a necessary step in that direction. The question remains though, if the quantitative values (objective numbers) would also include qualitative values (subjective feelings). Or would the holistic Self be filtered-out in the process of reducing a person to raw data? :smile:

PS__Rovelli's book focuses on the fundamental physical quantum-level inter-connectedness of the universe -- as the "web of relations that weaves reality". But, as a sober scientist, he avoids speculating on such meta-physical holistic notions as Cosmic Consciousness. He does, however, in a footnote, comment on Thomas Nagel's Mind & Cosmos : "on a careful reading, I find that it doesn't offer any convincing arguments to sustain his thesis".
Enrique June 20, 2021 at 20:32 #554204
Quoting TheMadFool
To my knowledge brain states are generally construed to be patterns in the neural network.


Subjective experience as comprised of qualia is not a product of neural networks (by that I mean some sort of "wiring" itself), but rather the electromagnetic, more generally radiative fields of the brain etc. entangling with smaller scale entanglements amongst molecular complexes, creating a superposition contour analogous in its elemental structure to the additive wavelengths of visible light.
Possibility June 21, 2021 at 06:42 #554437
Quoting Gnomon
I doubt that Tononi had Star Trek technology in mind as he developed his theory. But the notion of quantifying consciousness would be a necessary step in that direction. The question remains though, if the quantitative values (objective numbers) would also include qualitative values (subjective feelings). Or would the holistic Self be filtered-out in the process of reducing a person to raw data?


An informational ‘bit’ is a consolidated binary event - a resultant spatio-temporal state of a system, reduced to the smallest identifiable interaction as energy (electricity) passes from one qualitatively static structure of matter in momentary contact with another. You can’t quantify information as a ‘bit’ outside the qualitative structure of an electronic system.

I don’t think Tononi can identify consciousness as a similar binary event, because there seems to be no way to control the qualitative variability of structure in a system complex enough for such an event to be identifiable amongst the noise. What he has identified is more like the least significant prediction of consciousness. Just as an informational ‘bit’ value depends on energy (electricity) flowing through a static system, I would argue that the ‘value’ of consciousness more accurately refers to non-commutative variables of attention and effort in an ongoing energy event.

From what I understand, the accuracy of Tononi’s ‘phi’ (ie. reduced to a single quantitative value) seems restricted to quantifying the probability of interaction changing an energy event in a particular way. But is that really ‘consciousness’? Any prediction assumes a particular qualitative structure of attention and effort, and loses accuracy in ‘predicting consciousness’ the further that qualitative structure differs from human. Like Shannon’s ‘bit’ in an electronic system, I’m yet to be convinced that you can reliably quantify consciousness as a ‘phi’ value outside the qualitative structure of a human system.

Reducing a person to raw data isn’t the issue, though - it’s when we assume that the complexity of this raw data can be rendered as purely quantitative value (without qualitative structure) that we start to ignore contextuality. This is demonstrated in Heisenberg’s tables of data.
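To make the quantitative point concrete, here is a toy sketch in Python. It is an illustration only, not Tononi's phi and not anything from his papers: it computes total correlation (the sum of the parts' entropies minus the whole's entropy) for a two-unit binary system, one of the crudest stand-ins for "integration". Even this number only exists relative to a chosen qualitative structure, namely which units you carve out and their assumed joint distribution.

```python
from math import log2
from itertools import product

def entropy(ps):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * log2(p) for p in ps if p > 0)

def total_correlation(joint):
    """Crude integration proxy: marginal entropies minus joint entropy.
    `joint` maps (x, y) bit-pairs to probabilities. NOT Tononi's phi --
    just how much the whole carries beyond its parts, in bits."""
    px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
    py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

independent = {(x, y): 0.25 for x, y in product((0, 1), repeat=2)}
correlated = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}

print(total_correlation(independent))  # ~0.0: the parts fully account for the whole
print(total_correlation(correlated))   # ~0.28 bits of "integration"
```

Note that the number says nothing until you have fixed the partition into units - which is exactly the qualitative structure at issue, and part of why the real phi requires searching over all partitions and becomes intractable for large systems.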

Quoting Gnomon
PS__Rovelli's book focuses on the fundamental physical quantum-level inter-connectedness of the universe -- as the "web of relations that weaves reality". But, as a sober scientist, he avoids speculating on such meta-physical holistic notions as Cosmic Consciousness. He does, however, in a footnote, comment on Thomas Nagel's Mind & Cosmos : "on a careful reading, I find that it doesn't offer any convincing arguments to sustain his thesis".


Like Rovelli, I don’t believe there is any reason to posit a Cosmic Consciousness. But I would suggest that it’s more the self-justifying preference for consolidation that he objects to than any metaphysical aspects. Nagel’s book is pure speculation - a challenge to ‘do philosophy’ - and personally I don’t see it making any reasoned argument for Cosmic Consciousness. Nagel simply wasn’t prepared to dismiss the metaphysical sense of interconnected purposiveness harboured in teleological discourse. I think Rovelli shows that consolidation for its own sake isn’t necessary to include this metaphysical sense - that a collaborative and open-ended dialogue with our own ignorance is more conducive to scientific endeavour than tying it up in a comforting metaphorical bow. But perhaps it comes down to whether one is inspired by the question or the answer...
Gnomon June 21, 2021 at 17:58 #554609
Quoting Possibility
An informational ‘bit’ is a consolidated binary event

Rovelli uses Planck's Proportionality Constant ( ħ ) as a symbol of quantum level "communication" in the form of Information or Energy. The constant defines a "quantum" of Energy and a "bit" of Information. As you say though, it always takes two to "entangle", to communicate. But he also makes a distinction between a Syntactic exchange (equivalent to a geometric relationship), and a Semantic interchange, which conveys Meaning between minds. That's my interpretation of course. He doesn't put it in exactly those terms. He does say, however, that "entanglement . . . is none other than the external perspective on the very relations that weave reality". (my emphasis) And you can define that third party to the exchange as a scientist's observation, or more generally as Berkeley's "God" who is "always about in the quad". That was the bishop's ontological argument for a universal Observer, who keeps the system up & running, even when there are no Quantum Physicists to measure the energy/information exchanges of minuscule particles. My own Enformationism thesis came to a similar conclusion. :nerd:

Queer quantum query in the quad :
https://www.newscientist.com/letter/mg23130871-400-5-queer-quantum-query-in-the-quad/
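For concreteness, here is a back-of-envelope sketch of the "quantum of Energy" half of that claim: a photon's energy is E = hf. The 500 nm wavelength below is just an illustrative choice for green light, nothing from Rovelli's book.

```python
# Back-of-envelope: the energy of one "quantum" of green light, E = h * f.
h = 6.62607015e-34   # Planck's constant, J*s (exact by the 2019 SI definition)
c = 299_792_458      # speed of light, m/s
wavelength = 500e-9  # green light, ~500 nm (illustrative choice)

frequency = c / wavelength  # ~6.0e14 Hz
energy = h * frequency      # ~4.0e-19 J per photon
print(f"{energy:.3e} J")
```

Tiny as that number is, it is the granularity below which, on Rovelli's account, the notion of an exchange of energy/information stops making sense.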

Quoting Possibility
Like Rovelli, I don’t believe there is any reason to posit a Cosmic Consciousness.

Rovelli asks, "why is it that we are not able to describe where the electron is and what it is doing when we are not observing it? . . . . Observables! What does nature care whether there is anyone to observe or not?" Scientists don't have to worry about such questions, because Nature, or Spinoza's God, is always observing. But Rovelli has a different explanation : "the electron is a wave that spreads, and that is all. This is why it has no trajectory." When unobserved, there are no independent particles; there is only the hypothetical universal unitary non-quantized fluid or field in which a wave propagates. As I understand his point : the entangled system observes or tracks itself. Hence no third party is necessary. But what if G*D, or Cosmic Mind is the system? :chin:

Quoting Possibility
But perhaps it comes down to whether one is inspired by the question or the answer...

I suppose you could say that my Information-based worldview is what "inspired" me to assume, as an unprovable axiom, that a Cosmic Mind is necessary to imagine all the semantic information & causal energy in the world. :cool:

Possibility June 22, 2021 at 23:59 #555229
Quoting Gnomon
Rovelli uses Planck's Proportionality Constant ( ħ ) as a symbol of quantum level "communication" in the form of Information or Energy. The constant defines a "quantum" of Energy and a "bit" of Information. As you say though, it always takes two to "entangle", to communicate. But he also makes a distinction between a Syntactic exchange (equivalent to a geometric relationship), and a Semantic interchange, which conveys Meaning between minds. That's my interpretation of course. He doesn't put it in exactly those terms. He does say, however, that "entanglement . . . is none other than the external perspective on the very relations that weave reality". (my emphasis) And you can define that third party to the exchange as a scientist's observation, or more generally as Berkeley's "God" who is "always about in the quad". That was the bishop's ontological argument for a universal Observer, who keeps the system up & running, even when there are no Quantum Physicists to measure the energy/information exchanges of minuscule particles. My own Enformationism thesis came to a similar conclusion.


Rovelli’s use of the h-bar is not as a symbol of quantum level ‘communication’ - it acts as a qualitative limitation in any calculated prediction.

And it actually takes three to ‘entangle’ - and this is the point I think you’re missing with Rovelli. He makes it pretty clear in his criticism of alternative QM interpretations that to suggest such an unprovable axiom is grasping for certainty where there is none. Rovelli shows that a Cosmic Mind - just like a parallel universe or unobservable - isn’t necessary at all, but that it’s a source of comfort: to assume that someone is always observing, reassurance that the tree continues to be. This is where we have made errors in our descriptions of reality.

“We cannot rely upon the existence of something that only God can see.” - Rovelli

The way Rovelli sees it, it makes no sense to state that two systems S and S` are entangled if there is nothing with respect to which this can be determined. Consolidating an ‘entangled system’ only confuses the issue, because this entanglement does not necessarily exist for any system. It is determined as a joint property of the two systems only in relation to a third system S``, and cannot be assumed as a property of either system S or S` in relation to another system with which they might interact at any earlier or later time.

So we DO need to identify this third party for RQM. Your quote from Rovelli is incomplete: “Entanglement, in sum, is none other than the external perspective on the very relations that weave reality: the manifestation of one object to another, in the course of an interaction, in which the properties of the objects become actual.”

According to the relational interpretation of QM, there is no ‘Cosmic Mind’ or ‘universal Observer’, no privilege of subject over object. There are simply systems of information, and the two postulates:
- the maximal amount of relevant information about a system is finite;
- it is always possible to acquire new relevant information about any system.

Omniscience cannot be determined as a property of any system at this level. That’s not to say that either G*D or the notion of omniscience is necessarily impossible. Just that positing the ‘necessary’ existence of a Cosmic Mind as a system that is ‘always observing’ is not compatible with RQM. The notion of ‘Cosmic Mind’ refers to a qualitative infinite, an upper limitation or event horizon, while Planck’s constant refers to a lower limitation. They’re heuristic devices, not objects. That we consider the existence of a Cosmic Mind necessary to imagine all the semantic information and causal energy in the world speaks to the limitations of our own mind, not of reality.

Back to IIT, though - I think the above postulates highlight the limitations of a quantitative theory. Relevant information is that which counts for predicting future interaction with the system. Consciousness isn’t just about quantity, but about relevance: what counts for predicting future interaction.
Gnomon June 23, 2021 at 17:17 #555618
Quoting Possibility
Rovelli’s use of the h-bar is not as a symbol of quantum level ‘communication’ - it acts as a qualitative limitation in any calculated prediction.

"Communication" was my term, not Rovelli's. And it was used deliberately, even though Rovelli specifically excludes the definition of "Information" that is relevant to my personal worldview. He says "the word 'information' . . . . is a word packed with ambiguity". That's exactly why I spend a lot of verbiage in my thesis & blog, to specify what I do and don't mean by "information", in the context of my un-orthodox understanding of how the world works -- not physically, but metaphysically. He goes on to say "'Information' is used here in an objective physical sense that has nothing to do with meaning". And that's OK for scientific descriptions of the physical world. But my concern is with the philosophical (semantic) meaning of metaphysical Information, as one human communicates subjective ideas to other humans.

In its most abstract and general sense, Information is simply mathematical ratios : relationships between one un-specified thing and another, (X : Y = 1 : 2). Those logical relations boil down to yes/no, or true/false, or 1/0, as in digital computer code. And ratios or relationships have no meaning until they are interpreted by an observer : either a third party, or one of the communicants, who has a subjective perspective. And the "meaning" of an interchange is interpreted relative to the unique frame-of-reference of that observer. It is not an empirical fact of reality.

So, I take Rovelli's emphasis on the "relational" interpretation of quantum theory, as a pragmatic definition for physical scientific purposes. But my purpose is philosophical and metaphysical, in that it is concerned with how Conscious Minds, capable of knowing abstract Qualia, could evolve from a world of concrete Quanta. Therefore, for me, the relevant usage of "Information" is for qualitative concepts, not quantitative percepts. And the notion of a Prime Observer (third party), or holistic Cosmic Mind, has a qualitative meaning that would not be of interest to a physicist attempting to reduce reality down to its fundamental granular quanta at the Planck scale. The holistic meaning of "reality" is continuous & non-finite, and exists only as a meaningful concept in a subjective mind. But then, as the "mind of god", that universal view would also be our objective reality. Yes? :smile:


Meta-physics :
The branch of philosophy that examines the nature of reality, including the relationship between mind and matter, substance and attribute, fact and value. . . . Physics refers to the things we perceive with the eye of the body. Meta-physics refers to the things we conceive with the eye of the mind. Meta-physics includes the properties, and qualities, and functions that make a thing what it is. Matter is just the clay from which a thing is made. Meta-physics is the design (form, purpose); physics is the product (shape, action). The act of creation brings an ideal design into actual existence. The design concept is the “formal” cause of the thing designed.
http://blog-glossary.enformationism.info/page14.html

Information :
Knowledge and the ability to know. Technically, it's the ratio of order to disorder, of positive to negative, of knowledge to ignorance. It's measured in degrees of uncertainty. Those ratios are also called "differences". So Gregory Bateson* defined Information as "the difference that makes a difference". The latter distinction refers to "value" or "meaning". Babbage called his prototype computer a "difference engine". Difference is the cause or agent of Change. In Physics it’s called "Thermodynamics" or "Energy". In Sociology it’s called "Conflict".
http://blog-glossary.enformationism.info/page11.html
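Those "degrees of uncertainty" in the glossary above do have a standard quantitative counterpart in Shannon's information theory, even though (as Gnomon notes elsewhere) it deliberately strips away semantic meaning. A minimal sketch in Python, purely illustrative and not taken from either glossary page:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: the quantitative 'degree of uncertainty' of a distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair yes/no question carries exactly 1 bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A biased 9:1 ratio carries less uncertainty (≈ 0.469 bits) --
# in Bateson's terms, an answer to it makes a smaller "difference".
print(shannon_entropy([0.9, 0.1]))
```

This only measures how surprising a message is, not what it means, which is exactly the gap between Shannon's "objective physical sense" of information and the semantic sense being argued for here.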
Gnomon June 23, 2021 at 22:54 #555821
Quoting Possibility
Back to IIT, though - I think the above postulates highlight the limitations of a quantitative theory. Relevant information is that which counts for predicting future interaction with the system. Consciousness isn’t just about quantity, but about relevance: what counts for predicting future interaction.

That's why I think quantitative IIT is a step in the right direction for reductive Science, but still can't account for the holistic aspects of the world that are relevant to all humans, not just empirical scientists. :smile:


Reply to RougueAI above
IIT is a novel way of thinking about Consciousness, which gives the impression of scientific validity in its use of mathematics. But it still doesn't tell us what Consciousness is, in an ontological sense. Since Consciousness, as a computational process, is meta-physical, we can only define it with metaphors : comparisons to physical stuff. And Mathematical Logic may be as good an analogy as we can hope for. But the big C is not simply a pattern in itself; it's the power (ability) to decipher encoded patterns (think Morse code). That's why I say it's a form of generic Enformation (EnFormAction) : the epistemological power to create and to decode Forms into Meaning.