You are viewing the historical archive of The Philosophy Forum.

More Is Different

Streetlight March 19, 2018 at 08:01 15300 views 66 comments
A while ago, I wrote this, on the topic of 'reductionism':

"Most definitions of reductionism are terrible, and tend to resolve into some sort of useless tautology: 'reductionism means that everything can be reduced to...'. To treat reductionism rigorously, however, is to recognize that 'reductionism' simply means context invariance. It says: here is an explanation of the thing, and this explanation holds irrespective of context". On this understanding, reductionism is a one-way street: it means explaining the thing from the inside-out, and never the outside-in, which is why reductionist explanations are always accompanied by the phrase 'is only', as in, such and such 'is only...' (...atoms at work, ...God at work, etc. - there are material reductionisms no less than there are idealist and spiritualist reductionisms).

I still think this understanding of reductionism is mostly right, but I think it can also be expanded. P. W. Anderson, the Nobel prize-winning physicist, draws out what I think are the implications of denying this kind of - let's call it - 'one-way street' reductionism: "The main fallacy in this kind of thinking is that the reductionist hypothesis does not by any means imply a 'constructionist' one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. In fact, the more elementary particle physicists tell us about the nature of the fundamental laws, the less relevance they seem to have to the very real problems of the science, much less to those of societies". (Anderson, "More Is Different").

This 'inability to reconstruct the universe' from first principles is, I think, the exact corollary of understanding reductionism as context-invariance: it means that there is no one-way street, and that explanation (of any phenomenon) needs to be (at least) 'two way' - context matters. David Pines and Robert Laughlin, two physicists who have themselves written much on the poverty of 'theories of everything', put it this way: "If you are locked in a room with the system Hamiltonian, you can't figure the rules out in the absence of experiment, and hand-shaking between theory and experiment" (source): that is, if you are locked in a room with an equation that describes the behaviour of any system, you can't reconstruct that system without actually getting your hands dirty and doing the experiments.

There are a few directions one can take this. One is that science itself does not - despite popular misconceptions, often spread by philosophically inept scientists themselves - sanction any kind of reductionist metaphysics. As Anderson himself understood, there are no 'base levels' of reality any more fundamental than any other, insofar as "at each new level of complexity entirely new properties appear, and the understanding of the new behaviours requires research which I think is as fundamental in its nature as any other" (my emphasis). For Anderson, even science itself is internally differentiated such that principles used at one level of scientific investigation cannot be used to explain phenomena at another level - which is of course obvious to anyone who has even a casual acquaintance with science, insofar as one doesn't use the laws of solid state physics to explain ecosystem evolution: hence Anderson's slogan - 'More Is Different'.

Another implication here is that science is always specific: 'is only' ought never to figure in the vocabulary of anyone who claims fidelity to the scientific method. Instead, explanation in the sciences always amounts to, in the words of Isabelle Stengers, "this..., but in other circumstances that ... or yet again that..." (Stengers, Power and Invention).

Comments (66)

Galuchat March 19, 2018 at 09:52 #163722
Reply to StreetlightX
This agrees with:

1) Lower levels of description always underdetermine higher levels. Newell, A. (1990). Unified Theories of Cognition. Cambridge, MA: Harvard University Press, and

2) Data are not accessed and elaborated independently of a level of abstraction. Floridi, L. (2010). Information: A Very Short Introduction. Oxford: Oxford University Press.
Streetlight March 19, 2018 at 10:33 #163724
Hah, I've read that Floridi book - pamphlet, really - but unfortunately found it so painfully average that I think that connection would have escaped me entirely. I can't speak for Newell, but the idea that "lower levels of description always underdetermine higher levels" sounds about right.
Galuchat March 19, 2018 at 11:04 #163731
StreetlightX:Hah, I've read that Floridi book - pamphlet, really - but unfortunately found it so painfully average that I think that connection would have escaped me entirely.


Perhaps you could clarify/answer the symbol grounding problem Floridi raises (i.e., "how data can come to have an assigned meaning and function in a semiotic system like a natural language")?

I currently see the notion of data as foundational to scientific theories and the systems they explain, being inclined to view natural laws as natural syntax (i.e., encoding/decoding principles), and natural codes as transformed, translated, or converted natural data.
Streetlight March 19, 2018 at 12:01 #163745
Quoting Galuchat
Perhaps you could clarify/answer the symbol grounding problem Floridi raises (i.e., "how data can come to have an assigned meaning and function in a semiotic system like a natural language")?


I have vague intuitions about this question, but I'm still lacking the conceptual clarity I need to really address it properly. There's a whole nexus of terms - around embodiment, gesture, sense, and asymmetry - that I need to do more research on, and am planning to in the future.
T Clark March 19, 2018 at 15:20 #163838
Quoting StreetlightX
I still think this understanding of reductionism is mostly right, but I think it can also be expanded. P. W. Anderson, the Nobel prize-winning physicist, draws out what I think are the implications of denying this kind of - let's call it - 'one-way street' reductionism: "The main fallacy in this kind of thinking is that the reductionist hypothesis does not by any means imply a 'constructionist' one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. In fact, the more elementary particle physicists tell us about the nature of the fundamental laws, the less relevance they seem to have to the very real problems of the science, much less to those of societies". (Anderson, "More Is Different").


I was looking through Wikipedia trying to figure out what this "symmetry breaking" StreetlightX and Apokrisis are always talking about. I found a reference to the Anderson paper. It's wonderful. He writes clearly and conversationally without oversimplifying. I especially liked the passage you quoted above.

I still haven't figured out what "symmetry breaking" means or how it relates to emergence. I'll keep trying.
Streetlight March 19, 2018 at 15:47 #163881
Reply to T Clark Here is one of the clearest primers I know, although it explains it through reference to Merleau-Ponty.

But yeah, the Anderson paper is awesome. I couldn't not talk about it.

I think I should have named this thread something like: Reductionism is Bad Science, or something.
T Clark March 20, 2018 at 02:17 #164156
Quoting StreetlightX
Here is one of the clearest primers I know, although it explains it through reference to Merleau-Ponty.


Read it. Knocked my socks off. I only understood about 1/3 of it. Need to read it again. This changes everything. It puts words to things I've felt, but in the process requires me to change my entire understanding of how the universe works. I can't imagine anything more radical. Quantum mechanics is easier because I don't have to understand it, I only have to believe that things behave the way scientists say they do.

Thank you.
Caldwell March 20, 2018 at 02:59 #164161
Quoting StreetlightX
This 'inability to reconstruct the universe' from first principles is, I think, the exact corollary of understanding reductionism as context-invariance: it means that there is no one-way street, and that explanation (of any phenomenon) needs to be (at least) 'two way' - context matters.

Excellent post!
Phenomena and processes are 'complex' in the philosophical sense (of course, that is also true in a strictly scientific sense). Reductionism, in the true sense, denies the complex, and what's left is ultimately the indivisible something -- say, an atom. I don't suppose the (old) traditional reductionists would want to change the entire meaning of their endeavor. Reductionism is about what ultimately cannot be denied.
Caldwell March 20, 2018 at 03:18 #164166
I realize I am sort of defending traditional reductionism even though I am not a follower of this school of thought.
Streetlight March 20, 2018 at 04:23 #164183
Quoting Caldwell
Phenomena and processes are 'complex' in the philosophical sense (of course, that is also true in a strictly scientific sense). Reductionism, in the true sense, denies the complex, and what's left is ultimately the indivisible something -- say, an atom. I don't suppose the (old) traditional reductionists would want to change the entire meaning of their endeavor. Reductionism is about what ultimately cannot be denied.

I realize I am sort of defending traditional reductionism even though I am not a follower of this school of thought.


Yeah, it's actually a really hard mindset to shed, and it inevitably creeps back when one isn't paying attention. It doesn't help when people talk about genes as 'the secret to life' or atoms as 'the building blocks of the universe', and so on, as is often done in pop-sci presentations of these topics. It makes for good, bold headlines, but for horrible philosophy - not to mention science! In fact, the oddest thing about such reductionist programs is that, taken to their logical conclusion, the ability to reconstruct the universe from first principles is idealism in its most extreme form; they literally 'vacate the world of its content', as it were, giving up empiricism - the very lodestone of science - for ideality. Yet this almost entirely antiscientific POV is what is almost universally associated with so-called 'hard-core science'. It's both bizarre and saddening.

'Complexity' - a different, but related, topic - is also one of those concepts that is often defined in vague, imprecise ways (usually circularly as well: 'what is complex is what cannot be broken down into simple parts... which is to say that it is complex'; or the even worse and fuzzier 'the whole is greater than the sum of its parts'). The only definition of complexity with rigour that I know of is Robert Rosen's, which 'relativizes' complexity to our ability to model a particular system, and stipulates that a complex system is one with no 'largest model' - no single model that can capture all the dynamics of such a system. There's a nice summary of it here.

Quoting T Clark
I can't imagine anything more radical. Quantum mechanics is easier because I don't have to understand it, I only have to believe that things behave the way scientists say they do.
T Clark March 20, 2018 at 04:44 #164190
Quoting T Clark
I can't imagine anything more radical. Quantum mechanics is easier because I don't have to understand it, I only have to believe that things behave the way scientists say they do.


Serious. As I said, QM is just the way things are. I don't feel any ontological agita. Why would you expect things to behave the same at atomic scale as they do at human scale? I hate it when people, even physicists, get all excited and talk about "quantum weirdness" as if they're Neil DeGrasse Tyson, that son of a bitch. This stuff makes me rethink how the universe works.
Pierre-Normand March 20, 2018 at 05:05 #164196
Quoting StreetlightX
A while ago, I wrote this, on the topic of 'reductionism':


Very nice OP, and thanks for the links to the texts by P. W. Anderson; by Noah Moss Brender; and by R. B. Laughlin and David Pines. I now know what I'm going to be reading tomorrow.

Regarding the issue of context, here is a relevant quote from Patrick Aidan Heelan's own preface* to his book The Observable: Heisenberg's Philosophy of Quantum Mechanics, which I was reading earlier tonight:

"I had found that phenomenology and hermeneutics were helpful in making sense of the distinction between classical physics and post-classical physics of relativity and quantum mechanics because these new philosophies had the capacity to explore the latent significance and function of context in both scientific traditions; ‘context’ was arguably the central innovative component of these physical theories that had revolutionized 20th century physics.

Specifically, the notion of context can be thought of as having two parts: a part internal to human consciousness, comprising the functions of meaning-making, meaning-using, and meaning-testing; and a part external to human consciousness, comprising the physical processes associated in human life with meaning-making, meaning-using, and meaning-testing. The internal part draws on the hermeneutic resources of intentionality, which is a technical term for the making, using, and testing of meanings. These hermeneutic resources include not only the habitual practices of categorizing what is represented in the sensory flux, but also habits of relating groups of categories to one another by higher-order explanatory laws (or theories). The external part of context acknowledges the physical aspects of the embodied practices of meaning-making, using, and testing, such as the organized conditions of the space and time of the laboratory bench and engagement with the ‘world’ through acts of measurement performed by a qualified embodied observer who, in his/her community of practice, has become skilled in ‘interpreting’ the measured scientific phenomenon as a datum, present as described within the context of the relevant categories and theories." (All italics in the original)

* There is also a preface by Michel Bitbol and another by Babette Babich, who edited the book after Heelan's passing. Bitbol's preface is outstanding.
T Clark March 20, 2018 at 05:14 #164197
Reply to StreetlightX

Don't the processes discussed in the paper you referenced take all the mystery out of QM? Don't they explain how quantum behavior at atomic scale can emerge as classical behavior at human scale? Isn't that all that matters from an ontological point of view?
Pierre-Normand March 20, 2018 at 05:17 #164201
Quoting T Clark
Serious. As I said, QM is just the way things are. I don't feel any ontological agita. Why would you expect things to behave the same at atomic scale as they do at human scale? I hate it when people, even physicists, get all excited and talk about "quantum weirdness" as if they're Neil DeGrasse Tyson, that son of a bitch. This stuff makes me rethink how the universe works.


Until just a few years ago I tended to share this judgement about the (lack of) philosophical significance of quantum mechanics. There is rather more to it than just a description of the way microphysical objects happen to behave, though. Maybe Bitbol's preface to Heelan's book would lead you to change your judgement about this topic. It is very short and entirely devoid of weirdness or woo. It's been published separately from the book, in case you have trouble locating it. (Search for 'Heelan' on this page).
Streetlight March 20, 2018 at 06:19 #164226
Quoting T Clark
Serious. As I said, QM is just the way things are. I don't feel any ontological agita. Why would you expect things to behave the same at atomic scale as they do at human scale? I hate it when people, even physicists, get all excited and talk about "quantum weirdness" as if they're Neil DeGrasse Tyson, that son of a bitch. This stuff makes me rethink how the universe works.


Ooh, I see what you mean. But yes, all investigation ought to be scale-specific - which is not to say that it isn't interesting or important to understand QM! In fact, even a minimal understanding of QM helps to explain why QM isn't super important at macro scales, on the basis of QM itself. You don't need any fancy philosophy here at all. The first and most important point to note is that QM does in fact apply at all scales. The question is over the effects of QM at macro scales, and it's those effects that are negligible. Why? First, because Planck's constant ('h') is so, so tiny: 6.626176 x 10^-34 J·s (as Karen Barad puts it, if you convert Planck's constant into a length, "this length is so small that if you proposed to measure the diameter of an atom in Planck lengths and you counted one Planck length per second, it would take you ten billion times the current age of the universe"). And second, because quantum effects take place at the order of the ratio between Planck's constant and mass (h/m): for objects with tiny mass - like an atom - that ratio is large. For objects with large mass, that ratio becomes negligible, though never zero.

Barad: "There is a common misconception (shared by some physicists as well as the general public) that quantum considerations apply only to the micro world. Some people think that the fact that h is very small means that the world is just as Newton says on a macroscopic scale. But this is to confuse practical considerations with more fundamental issues of principle. ... The fact that h (Planck’s constant) is small relative to the mass of large objects does not mean that Bohr’s insights apply only to microscopic objects. It does mean that the effects of the essential discontinuity may be less evident for relatively large objects, but they are not zero. To put it another way, no evidence exists to support the belief that the physical world is divided into two separate domains, each with its own set of physical laws: a microscopic domain governed by the laws of quantum physics, and a macroscopic domain governed by the laws of Newtonian physics". (Barad, Meeting the Universe Halfway)
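The h/m point above can be made concrete with a quick back-of-the-envelope calculation. The sketch below compares the de Broglie wavelength λ = h/mv of an electron with that of a thrown baseball; the masses and speeds are round illustrative figures of my own, not numbers from the thread:

```python
# Rough illustration of why quantum effects fade with mass:
# the de Broglie wavelength lambda = h / (m * v) shrinks as mass grows.
h = 6.626e-34  # Planck's constant, J*s

def de_broglie_wavelength(mass_kg, speed_m_s):
    """Wavelength associated with a moving body: lambda = h / (m * v)."""
    return h / (mass_kg * speed_m_s)

# An electron at a typical atomic-scale speed (~1e6 m/s): ~7e-10 m,
# on the order of an atom's size, so interference is readily observable.
electron = de_broglie_wavelength(9.109e-31, 1e6)

# A baseball (~0.145 kg at 40 m/s): ~1e-34 m, absurdly far below any
# scale at which interference could ever show up.
baseball = de_broglie_wavelength(0.145, 40)

print(f"electron: {electron:.2e} m")
print(f"baseball: {baseball:.2e} m")
```

The ratio between the two wavelengths is some twenty-four orders of magnitude, which is the arithmetic behind "negligible, though never zero".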
Streetlight March 20, 2018 at 06:37 #164230
Quoting Pierre-Normand
"I had found that phenomenology and hermeneutics were helpful in making sense of the distinction between classical physics and post-classical physics of relativity and quantum mechanics because these new philosophies had the capacity to explore the latent significance and function of context in both scientific traditions; ‘context’ was arguably the central innovative component of these physical theories that had revolutionized 20th century physics."


It's odd, isn't it? I mean, the idea that context matters is such a simple idea, yet it is routinely ignored. And it provides such a simple retort to those who believe in atoms or genes or what-have-you as constituting any kind of 'fundamental ground' for the rest of the world. Yet the only thing that's 'fundamental' is that everything can function differentially, depending on the context which explicates it:

"What counts is the question: of what is a body capable? And thereby he [Spinoza] sets out one of the most fundamental questions in his whole philosophy (before him there had been Hobbes and others) by saying that the only question is that we don't even know what a body is capable of, we prattle on about the soul and the mind and we don't know what a body can do. But a body must be defined by the ensemble of relations which compose it, or, what amounts to exactly the same thing, by its power of being affected." (Deleuze, Lecture on Spinoza).
Pierre-Normand March 20, 2018 at 06:57 #164233
Quoting StreetlightX
(Barad, Meeting the Universe Halfway)


I only read half of Meeting the Universe Halfway, a few years ago, not because it's not good -- it's excellent -- but because of time constraints. I'll come back to it eventually.

It's worth noting that although the smallness of Planck's constant entails that, for mesoscale bodies very much larger than electrons, Heisenberg's inequality (which relates the product of the uncertainties, or indeterminations, of position and momentum to Planck's constant) has little practical significance, measurements of observables that relate to individual photons or electrons *do* have immediate practical significance for the behaviours of the macroscopic measurement apparatuses set up for measuring them. The main reason is that the mutuality relations holding between conjugate variables (complementary observables such as position and momentum) do not just affect how microphysical entities behave, but limit what sorts of measurement apparatuses can be jointly implemented to probe the very same phenomena that they are measuring without destroying their very conditions of existence. So, just because microphysical phenomena can be effectively amplified by macroscopic apparatuses that interact with them, the finiteness of Planck's constant (i.e. the fact that it's larger than zero) has direct consequences for the structure of the phenomena that we can observe at the macroscopic scale, such as interference patterns. This constraint also undercuts the idea that the microscopic events being probed have their determinations independently of the instrumental contexts in which they are measured, or so Bohr and Heisenberg argued.
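The contrast between mesoscale bodies and electrons here can be sketched numerically. The calculation below uses Heisenberg's relation Δx·Δp ≥ ħ/2 to get the minimum velocity spread for a body confined to a region Δx; the masses and confinement lengths are hypothetical round numbers chosen purely for illustration:

```python
# Minimum velocity spread implied by Heisenberg's relation
# delta_x * delta_p >= hbar / 2, i.e. delta_v >= hbar / (2 * m * delta_x),
# for a body of mass m confined to a region of size delta_x.
hbar = 1.0546e-34  # reduced Planck constant, J*s

def min_velocity_spread(mass_kg, delta_x_m):
    """Smallest velocity uncertainty compatible with confinement to delta_x."""
    return hbar / (2 * mass_kg * delta_x_m)

# Electron confined to an atom (~1e-10 m): spread of ~6e5 m/s - enormous,
# comparable to the speeds at which atomic electrons actually move.
print(f"electron: {min_velocity_spread(9.109e-31, 1e-10):.2e} m/s")

# A 1 mg bead localized to a micron: ~5e-23 m/s - far below anything measurable.
print(f"bead:     {min_velocity_spread(1e-6, 1e-6):.2e} m/s")
```

The inequality never switches off for the bead; its consequences just fall below any observable threshold, which is the "little practical significance" in question.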
T Clark March 20, 2018 at 07:12 #164235
Quoting StreetlightX
It's odd, isn't it? I mean, the idea that context matters is such a simple idea, yet it is routinely ignored. And it provides such a simple retort to those who believe in atoms or genes or what-have-you as constituting any kind of 'fundamental ground' for the rest of the world. Yet the only thing that's 'fundamental' is that everything can function differentially, depending on the context which explicates it:


It doesn't seem as though this would be controversial, so how can any smart, competent physicist claim that physics can be reduced to particles spinning around in isolation from the rest of the world? I haven't taken physics in 30 years. Do they teach this now?
Pierre-Normand March 20, 2018 at 07:32 #164238
Quoting T Clark
It doesn't seem as though this would be controversial, so how can any smart, competent physicist claim that physics can be reduced to particles spinning around in isolation from the rest of the world?


You can read the third chapter -- Two Cheers for Reductionism -- in Steven Weinberg's book Dreams of a Final Theory, for an instance of such an argument. Such reductionist authors don't reject the idea that the particles are interacting with the rest of the world, where the rest of the world is being characterized as just more particles (and fields), of course. They are saying that whatever complex phenomena "emerge" from such elementary interactions are "nothing over and above" the elementary particles that they are being constituted of, and the interactions between them. Weinberg attempts to cash out such ideas of ontological and explanatory reducibility in terms of "convergence of arrows of explanation" to a lower level of fundamental physics while relying entirely on a strikingly impoverished notion of what an explanation is.
Streetlight March 20, 2018 at 08:06 #164254
Reply to T Clark Yeah, having someone like Weinberg around doesn't help, but I suspect that the basic answer is that it's not as 'pretty'. To say that everything is just 'atoms in motion' is an incredibly attractive thesis: a powerful-looking, parsimonious 'explanation' for things that absolves one from going out there and doing the hard work. It's good PR ('the God particle', 'Grand Unified Theory', etc.), and moreover it has a long and rich history, which used to culminate in 'God' instead of 'atom'. But both are idealist claptrap.

The other reason is that science is, in general, methodologically reductionist: an experiment has value precisely to the extent that 'context' is, as much as possible, controlled for, such that we can track one variable while holding equal an entire background of other variables. 'Context' is exactly what you exclude in experimentation, all the better for experimental success. This is less a vice than a virtue, however, and is one of the reasons science is so very powerful. In other words, reductionism works. The problem is when this necessary methodological reductionism is translated into, as it were, an ontological premise.

Quoting Pierre-Normand
So, just because microphysical phenomena can be effectively amplified by macroscopic apparatuses that interact with them, the finiteness of Planck's constant (i.e. the fact that it's larger than zero) has direct consequences for the structure of the phenomena that we can observe at the macroscopic scale, such as interference patterns. This constraint also undercuts the idea that the microscopic events being probed have their determinations independently of the instrumental contexts in which they are measured, or so Bohr and Heisenberg argued.


Yeah, this makes a lot of sense, and the implications of this kind of thinking are something I'm always keen to try and tease out. Also, Weinberg's 'arrows of explanation' are pretty much exactly the 'one-way street' explanations I was aiming at in the OP.
Streetlight March 20, 2018 at 14:20 #164435
Reply to Galuchat Thinking a little about this in terms of information, part of what it means to subscribe to reductionism is to say that context contains no information, or rather, cannot function informationally. For example, in biology, people have had to work hard to demonstrate that information about gene expression (what DNA codes for) and heredity (what gets passed down from one generation to the next) is not contained in the genes alone: DNA alone is not a sufficient mechanism to constitute an organism. This is the case both ontogenetically (the development of a single organism through its life) and phylogenetically (evolution at the level of a species, from one organism to its offspring).

Ontogenetically, the exact protein a DNA sequence will code for depends not only on its so-called 'primary structure' (the precise order of amino acids coded for by DNA sequences), but also on its 'secondary' and 'tertiary' structure: that is, the local geometries of any one protein - the spacing between amino acids (are they folded into helices or pleated into sheets?), as well as the timings and temporal sequences in which the protein is 'folded'. Space and time literally carry information regarding the 'end result' of what DNA codes for, in a way that is not itself contained in the DNA sequence (usually regulated by chemical gradients, electrical differentials, and perhaps other mechanisms).

Phylogenetically, what is 'passed down' from parent to child is also not 'only' contained in the DNA. Instead, information is carried by the entire 'developmental system', i.e. the entire environment in which the DNA is passed down, such that it too carries the information necessary for the development of the offspring.

In both cases, reductionism would require denying the informational role that context plays - 'context' being the space, time and other mechanisms of heredity which specify how an organism will develop through its lifetime. And again, to be against reductionism here is just to be for science, not against it; at least, it is to hew closer to the discoveries of science than any extra-scientific metaphysics foisted onto it from the outside. This obviously doesn't answer the symbol-grounding problem which you asked about, but it does imply thinking about 'symbols' differently: not as carriers of information in their own right, but as resources that need to be thought about in terms of wider, context-bearing processes.
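The claim that context "functions informationally" can be given a toy information-theoretic reading: if the same genotype yields different outcomes in different environments, then the environment carries nonzero mutual information about the outcome. The sketch below uses invented counts purely for illustration; nothing in it reflects real gene-expression data:

```python
from collections import Counter
from math import log2

# Toy joint sample of (context, outcome) pairs for one FIXED genotype.
# Invented numbers: the same sequence tends toward trait A in env1, B in env2.
samples = [
    ("env1", "A"), ("env1", "A"), ("env1", "A"), ("env1", "B"),
    ("env2", "B"), ("env2", "B"), ("env2", "B"), ("env2", "A"),
]

def mutual_information(pairs):
    """Empirical I(X;Y) = sum p(x,y) * log2( p(x,y) / (p(x) * p(y)) )."""
    n = len(pairs)
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(
        (c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in joint.items()
    )

# If the genotype alone fixed the outcome, this would be 0 bits; here it is
# positive (~0.19 bits), i.e. the context itself carries information.
print(f"I(context; outcome) = {mutual_information(samples):.3f} bits")
```

The point of the sketch is only the inequality, not the numbers: whenever outcomes covary with context for a fixed genotype, the mutual information between context and outcome is strictly positive, which is one precise sense in which context "carries information".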
Galuchat March 20, 2018 at 15:43 #164469
StreetlightX:Thinking a little about this in terms of information, part of what it means to subscribe to reductionism is to say that context contains no information, or rather, cannot function informationally.


That would appear to be the case. Thanks for the gene expression example. Mention of developmental factors brought gene switching to mind. Since I'm currently focused on cognitive psychology, I tend to be more annoyed by attempts to explain mind solely in terms of brain anatomy and/or neurophysiology.

StreetlightX:And again, to be against reductionism here is just to be for science, not against it; at least, it is to hew closer to the discoveries of science than any extra-scientific metaphysics which is foisted onto it from the outside.


I agree. The problem is not one of restricting empirical investigation to a single level of abstraction, but of reaching inappropriate conclusions and deriving incoherent concepts from the results of those investigations.

StreetlightX:This obviously doesn't answer the symbol-grounding problem which you asked about, but it does imply thinking about 'symbols' differently: not as carriers of information in their own right, but as resources that need to be thought about in terms of wider, context-bearing processes.


Hence, the difference between data and information.
I'm in the process of reading the Brender paper to see if there are any insights worth pursuing.
T Clark March 20, 2018 at 16:26 #164486
Quoting StreetlightX
'Context' is exactly what you exclude in experimentation, all the better for experimental success. This is less a vice than a virtue, however, and is one of the reasons science is so very powerful. In other words, reductionism works. The problem is when this necessary methodological reductionism is translated into, as it were, an ontological premise.


Isn't it more than just philosophical ontology? I've tried to pay attention to discussions over the past months that you and Apokrisis participated in relating to abiogenesis and the development of consciousness. There's no way those can be understood using a reductionist approach.

This is a pain in the ass. It's making me re-evaluate my understanding about the difference between facts as a matter of truth and ontology and epistemology as matters of choice.
T Clark March 20, 2018 at 16:28 #164487
Quoting Pierre-Normand
You can read the third chapter -- Two Cheers for Reductionism -- in Steven Weinberg's book Dreams of a Final Theory, for an instance of such an argument.


Thanks. I'll take a look.

T Clark March 20, 2018 at 16:41 #164488
Quoting Galuchat
That would appear to be the case. Thanks for the gene expression example. Mention of developmental factors brought gene switching to mind.


Here's a link to a discussion on gene expression that SLX started a few months ago. It really opened my eyes:

https://thephilosophyforum.com/discussion/2235/networks-evolution-and-the-question-of-life/p1
Streetlight March 20, 2018 at 17:20 #164491
Reply to T Clark I'm glad you're finding some of these threads useful, or at least provocative! Note that the thread on gene expression is basically an example or a 'case' of the more generalized principles I've tried to outline in this thread: context always modifies the operation of the elements so contextualized - genes will express differently depending on 'context' (where 'context' here serves to cover a shit ton of interesting biology, only some of which I broached in that previous thread). And note also that the stuff I did speak of in that thread - re: gene networks - didn't even begin to broach the question of protein folding and the secondary and tertiary structures (along with their regulation) that also influence the production of a protein. The path from DNA to trait is basically an entire biological adventure (i.e. DNA = organism = wrong!)

Quoting T Clark
Isn't it more than just philosophical ontology?


As in?
T Clark March 20, 2018 at 18:19 #164499
Quoting StreetlightX
Isn't it more than just philosophical ontology?
— T Clark

As in?


A couple of weeks ago I started a discussion - "An attempt to clarify my thoughts about metaphysics." I wanted to lay out my thoughts about the difference between questions of fact and questions of what I called "metaphysics." One of the upshots of the discussion is that I think calling it metaphysics is probably not right. At least it's misleading. The questions I was interested in were those that are not matters of fact, but are more a matter of choice about how you want to look at things, e.g. is there such a thing as objective reality? is there an objective morality? Is there free will? I have always said that type of question does not have a yes or no answer. It's a matter of usefulness, not truth.

The discussion we are having now is making me rethink that.
Caldwell March 21, 2018 at 01:47 #164776
Quoting StreetlightX
In fact, the oddest thing about such reductionist programs is that, taken to their logical conclusion, the ability to reconstruct the universe from first principles is idealism in its most extreme form; they literally 'vacate the world of its content' as it were, giving up empiricism - the very lodestone of science - for ideality. Yet this almost entirely antiscientific POV is what is almost universally associated with so-called 'hard core science'. It's both bizarre and saddening.

The only definition of complexity with rigour that I know of is Robert Rosen's, which 'relativizes' complexity to our ability to model a particular system,


(Good reference. I like Rosen's description of complex and simple).
Yes, reductionists could easily be read as idealists. After all, the exercise of their intellect is of an a priori kind -- what we see is what we don't understand.
Reductionists are simply purists -- remove the clutter to get to the neat stuff. The point is not to reconstruct the universe, it is to see it as it really is.
To put it simply, to the reductionist, the universe is complete.
Streetlight March 21, 2018 at 04:28 #164813
Quoting T Clark
A couple of weeks ago I started a discussion - "An attempt to clarify my thoughts about metaphysics." I wanted to lay out my thoughts about the difference between questions of fact and questions of what I called "metaphysics." One of the upshots of the discussion is that I think calling it metaphysics is probably not right. At least it's misleading. The questions I was interested in were those that are not matters of fact, but are more a matter of choice about how you want to look at things, e.g. is there such a thing as objective reality? is there an objective morality? Is there free will? I have always said that type of question does not have a yes or no answer. It's a matter of usefulness, not truth.

The discussion we are having now is making me rethink that.


Excellent :D This is, obviously, a different topic, but I think you're on exactly the right track; I don't think truth has ever been an index of philosophy, nor do I think it ought to be. As Deleuze says, philosophy lives and breathes not on truth, but on the Remarkable, the Interesting, and the Important: categories of sense, of significance. I would quibble about the idea that it's a matter of 'choice' - philosophy or 'metaphysics' always arises, for me anyway, out of the necessity of responding to a problem, where the problem - whatever it is - is immanently defined by the solution which addresses it. But this is somewhat off-topic.
Streetlight March 21, 2018 at 04:33 #164814
Quoting Caldwell
The point is not to reconstruct the universe, it is to see it as it really is.


Yes, but who doesn't claim to 'see things as they really are'? This is why I insisted, in the OP, on the rhetorical trope of the 'is only...' when it comes to reductionism. 'Is only' excludes, it denies, as I said, the informational capacity of context, it rules things out so as all the better to rule (one) thing(s) in - atom, mind, God, etc. It is simplicity bought at the price of simplification, in the most pejorative sense of that word.
T Clark March 21, 2018 at 07:46 #164832
Quoting StreetlightX
I would quibble about the idea that it's a matter of 'choice' - philosophy or 'metaphysics' always arises, for me anyway, out of the necessity of responding to a problem, where the problem - whatever it is - is immanently defined by the solution which addresses it.


I don't think we disagree. When I say "choice" I mean we get to choose what works best.
T Clark March 22, 2018 at 19:03 #165457
Quoting StreetlightX
...even a minimal understanding of QM helps to explain why QM isn't super important at macro scales on the basis of QM itself. You don't need any fancy philosophy here at all. The first and most important point to note is that QM does in fact apply at all scales. The question is over the effects of QM at macro scales, and it's those effects that are negligible.


I read this when you wrote it and thought I understood what you meant. I thought your point was that a situation like QM, which applies at all scales but whose effects are only significant at atomic scales, is not a true example of emergence.

I am currently working through " Decoupling emergence and reduction in physics" by Karen Crowther, which @Pierre-Normand sent me. I didn't want to wait till I'm finished to ask you this question. It may take me a while. When she says "EFT" she is using an alternative term for emergence. Here is a quote:

A quick example should give some feel for what is meant. Consider the BCS theory of superconductivity as an example, typical of EFT, where the low-energy theory severely underdetermines the high-energy physics. The BCS theory is an approximation which ignores most of the mutual interactions of the electrons of the system and focuses only on a particular interaction due to phonon exchange. In spite of this, it works surprisingly well in many situations, with its predictions often in agreement with experiment to within several percent. The success of the BCS theory can be explained by the framework of EFT: it can be shown that only the specific interactions used by the BCS theory are relevant at low energies. All the rest of the interactions—important in the high-energy theory—are suppressed at low energies by powers of a small energy ratio (Burgess 2004, p. 9). This example demonstrates how little the low-energy physics (of interest) depends on the high-energy (micro-) physics: most of the high-energy interactions can simply be ignored. Given the low-energy theory alone, it would be impossible to come up with the high-energy theory without the aid of experimental results or some other external source of input. In this example, we can thus say that the low-energy physics is relatively autonomous from the high-energy theory.

Seems like either I misunderstood you or I misunderstand her. Or both.
Pierre-Normand March 22, 2018 at 20:39 #165487
Quoting T Clark
Seems like either I misunderstood you or I misunderstand her. Or both.


What you quoted Crowther to be saying and what @StreetlightX had said seem to be broadly compatible ideas applied to different contexts.

StreetlightX stressed that many macroscopic objects and phenomena can be understood and explained while abstracting away from the laws of quantum mechanics that govern interactions between the micro-constituents of those objects.

Likewise, Crowther provides an example of relative independence between pairs of domains that are both being governed by quantum mechanics. In the case of superconductivity, the electrons behave collectively in a rigid fashion, in a way that is explained by the exchanges of 'sound particles' (phonons) that can only exist in the context of low energy (below some definite threshold). When that occurs, the random interactions that electrons normally have with each other and with the lattice cancel out and hence cease to have any effect on the collective behavior of the electrons. One striking fact, though, is that the high-level emergent laws that govern equally emergent objects (the phonons), although independent of many features of their material 'constituents' (the electrons), still are governed by the general principles of quantum mechanics.

In the same vein, in their paper The Theory of Everything, R. B. Laughlin and David Pines commented that:

"The Josephson quantum is exact because of the principle of continuous symmetry breaking. The quantum Hall effect is exact because of localization (17). Neither of these things can be deduced from microscopics, and both are transcendent, in that they would continue to be true and to lead to exact results even if the Theory of Everything were changed."

The "Theory of Everything" here is conceived as the ultimate high-energy (and hence the ultimate micro-physical) unification theory of matter and gravity, assuming that there might be one.
apokrisis March 22, 2018 at 23:01 #165547
Quoting Pierre-Normand
Likewise, Crowther provides an example of relative independence between pairs of domains that are both being governed by quantum mechanics.


[Apologies. This is a bit roundabout as an actual response, but I started so I finished...]

The philosophical tension here would seem to be the issue of how closely do our models of the world match the actuality of the world. And - in the name of pragmatic efficiency - would they even want to mirror that actuality? Does the map need to be like the territory, or is the whole point that it is not, which is why you can fold it up and stick it in your back pocket?

But I would argue that our hierarchical approach to scientific modelling - starting off with the "fundamental" and building our way up to the messy complexity of actual reality - does deeply mirror reality in being about symmetries (a generalised lack of constraints) and symmetry-breaking (the addition or emergence of constraints).

And then a further wrinkle, symmetry-breaking itself halts - and forms a hierarchical level - when it reaches an equilibrium condition. A new symmetry emerges when - as in an ideal gas - differences no longer make a difference. Macroscale order rules once the internals of the system are reduced to a statistical noise so far as the world is concerned.

This is why more can be ontically different. The internal parts that construct the system have lost any power to disturb the general state of the system. As an equilibrium balance, its fluctuations don't count. And so now - when nature goes searching for symmetries to break, and we go searching for breakings to describe - it is the macrostate that forms the new hierarchical level, the new platform, for any breaking.

That is a roundabout way of getting at the fundamentality of quantum mechanics. QM is a highly general view that includes "everything" by removing every symmetry-breaking and just talking nakedly about the statistics of fluctuations or individuations. It forms a ground zero at the point where indeterminism itself is constrained to produce determinism.

So it arises from considering what could be the barest kind of intelligibility imposed on an unorganised possibility. And it discovers that any definite answer, any act of particularisation, must have the dialectical or complementary form of a dichotomy. To ask a symmetry-breaking question of nature, it must be posed as the either/or of two polar limits.

This is just a principle of logic. To be A with complete definiteness, you have to be not not-A. And the quantum issue is you can't ask both questions - the questions being diametrically opposed - at the same time. Hence you can ask about location, but then you must lose sight of the complementary thing of momentum. The duality is baked in by the complementary nature of the questions you must ask to pin something down in a particular fashion when its existence is itself relative to the context of these two limits that must be "far out of each other's sight" to be real themselves.

So quantumness itself arises due to the requirements of contextuality or holism. For even a fluctuation to have counterfactual definiteness, it must be placed relative to a context. And the barest context is already an asymmetry - a broken symmetry - in being formed by dichotomous boundary conditions. It has to be "world" that is described by logically complementary limits to being.

So while reductionism/constructionism is based on the maths of bottom-up addition or summation, at the heart of holism is the maths of reciprocal or inverse relations. A-ness is defined in terms of its lack of not-A-ness. And not-A-ness in turn is defined by its lack of A-ness.

To be located is to lack momentum. To have momentum is to lack location. Each stands as the other's ground of measurement. And as a reciprocal relation, that yields the uncertainty curve. Shrinking the uncertainty concerning one leads to a matching increase in the uncertainty of the other.
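In standard notation, that reciprocal relation is just the familiar Heisenberg bound (a textbook formula, not something argued for in this thread):

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

The product of the two uncertainties has a floor, so the trade-off traces out exactly this kind of reciprocal curve: squeeze \(\Delta x\) and \(\Delta p\) must swell to compensate, and vice versa.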

Anyway, our notion of the quantum arises by asking the deep question of what would the most general symmetry-breaking look like? And logic tells us that it is the bare reciprocal. It would have to be a world where the question was being posed in terms of the most fundamental complementary pair of properties. Dialectics would rule.

So quantum mechanics describes that bare state. Well, it is not completely bare, as QM has to presume a backdrop dimension of time which can make the simultaneity of the question-asking a thing. If the most global logical constraint is that we don't get to ask two opposed questions in the one act of measurement, then there has to be a passage of time to underpin that.

A theory of everything, a theory of quantum gravity, of course hopes to get past that and show that time too is an emergent phenomenon of some kind.

But that is what quantum theory does. It sets a baseline on intelligible existence. It is the minimal view that arises when we start not with some reductionist view of existence as a brute state, but a view of existence as a self-organising semiotic process of inquiry. If anything is to have definite existence, it would have to start with the dichotomy that is its boundaries. The symmetry-breaking kind of has to exist before the symmetry it breaks.

This is an argument for strong holism, then. Or ontic structural realism. It is pretty Platonic in fact. A bootstrapping ontology. It says that it is the possibility of being organised that produces the material needed to construct the organisation. So not just "more is different", but it is the more that differentiates.

Finally getting back around to the examples like phonons and superconductivity, science then adds further mathematical or Platonic structure to this fundamental quantum generality.

What condensed matter phenomena, like phonons, demonstrate is that nature, at the quantum level, does not distinguish between solitary and collective excitations. It is not a fundamental fact that matter has to be atomistic. It is a more fundamental fact that matter has to have organised form. What makes the difference is the "shape" of the excitation, not the "number of parts" it seems to be composed of.

So collective excitations of matter can have exactly the same quantum weirdness as individual particles. Nature sees no essential difference - as the complementarity of any fundamental "question asking" is the thing. Individuation is individuation. And QM is the general rules of the most primal possible acts of individuation.

To break nature's indifference to fundamental level "excitations of a field", the world has to cool and expand. Bosons must become actually different to fermions in their statistics because something happens to break the symmetry of their spin and expose a handedness or chirality that adds a new level of anti-symmetric possibility. Once left looks different from right, another level of structure can arise as there is now a solid basis for a constraint that differentiates one vanilla excitation from another.

Getting back to how closely our models of reality actually match that reality, I think what I have shown is that the bottom-up constructive view is pretty much exactly back to front. It works because it is the formal inverse of "what is really going on".

Holism is then more correct as it says form conjures matter into being. Reality bootstraps because there are mathematical-strength regularities it can't escape. Quantum mechanics is our picture of the first step at which this inevitable organisation - the one that is forced on indeterminate possibility by the very nature of a determining question - arises. And once that game has started, more specific constraining questions can follow. You get the cascade of further symmetry breakings which produce all the quantum particles of the Standard Model for a start.

This is the thesis of Ontic Structural Realism. It was in vogue as the bootstrap model - Chew's S-matrix model - of the 1960s. John Wheeler said it nicely in his "it from bit" papers. The quantum reconstruction and quantum information approaches are reviving it currently.

But still, I admit there is a problem. Even a completely formalist and logicist approach - one that says it is constraints all the way down to the bottom - has to grant some reality to materiality somewhere.

Ontic Structural Realism or a Theory of Everything can shrink the notion of a bare action - a primal material/efficient cause - to practical invisibility. But a complete metaphysics still would want to say something about this complementary aspect of reality.

Once holism has explained everything in terms of the constraints on fluctuations, there will still be the fluctuation itself to be explained.

(Even if - as in the theory of spontaneous symmetry breaking - "anything" would start the tipping. A pencil will not stay balanced on its tip forever because anything and everything could shake it. And so a primal fluctuation would have no particular nature that had a "constructive" impact on the world it happened to produce. The mystery of the first fluctuation would become the least kind of possible mystery on that argument.)

Pierre-Normand March 22, 2018 at 23:33 #165568
Quoting apokrisis
[Apologies. This is a bit roundabout as an actual response, but I started so I finished...]


No apologies needed. As usual, your posts need to be given quite a bit of thought before one can reply to them meaningfully. Have you read the paper by Noah Moss Brender that StreetlightX linked to recently (Sense-Making and Symmetry-Breaking, Merleau-Ponty, Cognitive Science, and Dynamic System Theory)? I just finished reading it today.

Quoting apokrisis
That is a roundabout way of getting at the fundamentality of quantum mechanics. QM is a highly general view that includes "everything" by removing every symmetry-breaking and just talking nakedly about the statistics of fluctuations or individuations. It forms a ground zero at the point where indeterminism itself is constrained to produce determinism.


I must give more thought to that too but it rings similar to Bitbol's thesis in his paper Quantum Mechanics as a Generalized Theory of Probabilities.

I'll comment more substantively at a later time.
apokrisis March 22, 2018 at 23:44 #165594
Quoting Pierre-Normand
I must give more thought to that too but it rings similar to Bitbol's thesis in his paper Quantum Mechanics as a Generalized Theory of Probabilities.


Hah, Bitbol's paper is one of those lined up in a crowd of browser tabs waiting to get read.

I certainly liked his earlier downward causality paper - http://michel.bitbol.pagesperso-orange.fr/DownwardCausationDraft.pdf

Caldwell March 23, 2018 at 01:39 #165638
Quoting StreetlightX
This 'inability to reconstruct the universe' from first principles is, I think, the exact corollary of understanding reductionism as context-invarience: it means that there is no one-way street, and that explanation (of any phenomenon) needs to be (at least) 'two way' - context matters.


Okay, I read your last response to me, but I'd rather respond to this quote instead. I am sympathetic with your idea.
lol. I am now officially defending reductionism. :smile:
First, I think the definition of existence given by reductionism is one that denies context, complexity, processes and manifolds as real. (You can correct me on this.)
And then what's left? Absolutism. Reductionism is a true absolutism.
I think when we start talking about context, reconstruction, and phenomena, we want to explain why relativism isn't being mentioned here. Is this not relativism? Better yet, why is this not a Schrodinger's cat?
Streetlight March 23, 2018 at 02:11 #165651
Reply to Caldwell But words like 'Absolutism' and 'Relativism' are just words, nominations. What does it matter if you call something 'absolutism' or 'relativism'? You haven't specified the difference these differences make. As for the cat, what about it? Again, what's the relevance? I think it would be more helpful if you elaborated the stakes involved in invoking these things.
Streetlight March 23, 2018 at 05:22 #165672
Quoting T Clark
I read this when you wrote it and thought I understood what you meant. I thought your point was that a situation like QM, which applies at all scales but whose effects are only significant at atomic scales, is not a true example of emergence.


One thing to note is that I've been quite careful to avoid the word 'emergence' when talking about a lot of this stuff (take a look, I don't even use the word at all!). All I was doing with the QM comment was insisting that one be careful about the kinds of conclusions that one draws from the insistence on context: just because everything is context-bound does not mean that one can entirely disregard the functioning of things at a 'lower level' in the context of a higher one: with QM, just because quantum effects are no longer as manifest at macro scales does not mean that QM 'stops functioning' at macro scales, only that those effects are so small as to be rendered (for the most part) inconsequential. I say 'for the most part' because as P-N pointed out, if you look for quantum effects at macro scales, you will still find them.

Crowther's point, as I read it, is similar: at low energies, BCS theory captures the dynamics of superconductive systems even though it doesn't take into account high-energy interactions that take place all throughout the system. This doesn't mean that those high-energy interactions - interactions between electrons - somehow 'cease to exist' or whathaveyou, only that, at the level of explanation we are interested in, those interactions are not relevant. Crowther's point goes further than mine though because she actually provides a reason why this is so: at low energies, we can simply measure interactions due to phonon exchanges in order to capture the dynamics, and importantly, phonons only exist at low energy levels. This is where questions of emergence might start to become relevant, but it wasn't necessarily what I was concerned with in my own comments prior.
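(As a back-of-envelope illustration of "effects so small as to be inconsequential": the de Broglie wavelength λ = h/mv sets the length scale at which quantum interference shows up. A toy sketch - the particular masses and speeds below are just illustrative values, not anything from the thread:

```python
# Rough comparison of quantum length scales at micro vs macro masses.
# de Broglie wavelength: lambda = h / (m * v). All values illustrative.
h = 6.626e-34  # Planck's constant, J*s

def de_broglie(mass_kg, speed_ms):
    """Return the de Broglie wavelength in metres."""
    return h / (mass_kg * speed_ms)

electron = de_broglie(9.11e-31, 1.0e6)  # an electron at ~10^6 m/s
baseball = de_broglie(0.145, 40.0)      # a thrown baseball

print(electron)  # ~7e-10 m: atomic scale, so interference is observable
print(baseball)  # ~1e-34 m: far below anything measurable
```

Same law at both scales; it never 'stops functioning', its effects just become unobservably small.)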
Streetlight March 23, 2018 at 08:51 #165693
@Pierre-Normand, So I just finished reading the Crowther paper and damn it's excellent. It vindicates, I think, my avoidance of talking about emergence in a thread largely dedicated to reductionism, and makes me really want to read the Butterfield papers she referenced. I've actually come across both Batterman's and Morrison's papers before (the ones she critiques for giving too negative a definition of emergence as 'not-reduction'; Massimo Pigliucci has discussed both on his blog), and I really like her angle of critique. Her discussion of EFT was also excellent and blissfully clear, and the whole thing just helped me clarify a lot of the conceptual issues I had with thinking about these issues. Awesome.
Dominic Osborn March 23, 2018 at 10:29 #165719
Reply to StreetlightX

It's not that I want to be confrontational, but I can't help but come at all this from the opposite direction from you.

I think the philosophical pursuit of truth is the drive for a single, all-encompassing explanation. I think to explain is to reduce. I almost feel—if each of the very last two explanations cannot be understood in terms of the other—then the whole project has been worthless. If there’s one mystery in the universe left—then nothing has been understood.

Quoting StreetlightX
"The main fallacy in this kind of thinking is that the reductionist hypothesis does not by any means imply a 'constructionist' one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. In fact, the more elementary particle physicists tell us about the nature of the fundamental laws, the less relevance they seem to have to the very real problems of the science, much less to those of societies". (Anderson, "More Is Different").


But that’s because the explanation is the wrong one. Reductionism is the claim that the one thing at the bottom can explain the upper layers. Saying a see-saw is “just” or “only” a fulcrum and a plank and a certain spatial relationship between them is saying: give me a fulcrum and a plank, specify their relationship, and I’ll make a see-saw for you. The right reductionist theory is the one that enables you to construct back up again.

Quoting StreetlightX
One is that science itself does not - despite popular misconceptions, often spread by philosophically inept scientists themselves - sanction any kind of reductionist metaphysics.


No, not consciously, but there is a form that science expresses, that every scientific endeavour reduces to, a form that is simple, and that the realist world is built from, namely: there is a scientist, and a world to be found out about—

Quoting StreetlightX
In fact, the oddest thing about such reductionist programs is that, taken to their logical conclusion, the ability to reconstruct the universe from first principles is idealism in its most extreme form; they literally 'vacate the world of its content' as it were, giving up empiricism - the very lodestone of science - for ideality.


You say that like it’s a bad thing.

Quoting StreetlightX
To say that everything is just 'atoms in motion' is an incredibly attractive thesis, a powerful-looking, parsimonious 'explanation' for things that absolves one from going out there and doing the hard work.


Trying to get out of doing hard work sounds like a good idea to me. (I'm serious.)

Quoting StreetlightX
But both are idealist claptrap.


Well... I know you feel this way. Still, I'd love to be able to communicate in some way with physicalists.

Quoting StreetlightX
science is so very powerful


Powerful, yes, but for what? It’s like a Weimar Republic bank-note printing press. Churning out worthlessness. Powerful for creating illusions, or things that cause as much harm as good. Utterly without power to produce happiness, peace, justice, beauty, etc.---or indeed, understanding.

Quoting Caldwell
The point is not to reconstruct the universe, it is to see it as it really is.


To a reductionist, or an idealist, these are the same thing.

Quoting StreetlightX
As Deleuze says, philosophy lives and breathes not on truth, but on the Remarkable, the Interesting, and the Important: categories of sense, of significance.


I think the Truth on the one hand and the Remarkable, Interesting and Important on the other—are the same thing.

Quoting StreetlightX
It is simplicity bought at the price of simplification, in the most pejorative sense of that word.


I think simplification is always good.




Streetlight March 23, 2018 at 10:53 #165729
Quoting Dominic Osborn
I think the philosophical pursuit of truth is the drive for a single, all-encompassing explanation.


No, that would be theology, not philosophy. Explanation follows the explanandum wherever it goes; it does not subordinate it to prior stipulations.

In any case, give me arguments, not aphorisms. The latter are not worth much.
T Clark March 23, 2018 at 16:30 #165862
Quoting StreetlightX
One thing to note is that I've been quite careful to avoid the word 'emergence' when talking about a lot of this stuff


Yes, I understand that. Crowther doesn't use the word either. She actually says "EFT" in order to avoid that. I've been thinking about that. Let me lay out how I'm currently thinking about what I won't avoid calling "emergence," to give you guys a chance to help me clarify my thinking.

This is really fun.

Ok, I think we agree that a reductionist approach to phenomena can be misleading when you're trying to understand the relationships between what Crowther calls different "length (or energy) scales." That's because the results of reductionist analysis at one level are generally not useful to construct the properties and phenomena on another level. So, here are three situations I'm thinking about:

1) Situations where you can build up an understanding of higher levels from reductionist bricks. I guess this is statistical mechanics and the most commonly used example is the behavior of an ideal gas.

2) Situations where lower-level processes apply to higher levels, but have negligible effects. The example you used was QM. The one I usually think of is special relativity. Newton is fine except when you get close to the speed of light. Crowther talks about superconductivity.

3) Situations where higher levels "emerge" from lower levels in ways that are fundamentally unpredictable. These are what I've always thought of when I think of "complexity."
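As an aside, situation 1 can be made concrete with a toy calculation (the numbers are made-up illustrative values, and this is only a sketch of the standard kinetic-theory argument): sample micro-level velocities and check that the macro-level ideal gas law falls out of their statistics.

```python
# Toy kinetic theory: recover the macroscopic ideal gas law P = N*k_B*T/V
# from microscopic velocity statistics. All values are illustrative.
import random

random.seed(0)
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # temperature, K
m = 6.6e-27         # roughly the mass of a helium atom, kg
N = 100_000         # number of sampled particles
V = 1.0e-3          # container volume, m^3

# Each velocity component is Gaussian with variance k_B*T/m.
sigma = (k_B * T / m) ** 0.5
vx_squared = [random.gauss(0.0, sigma) ** 2 for _ in range(N)]

# Micro route: pressure from momentum transfer, P = N*m*<vx^2>/V.
p_micro = N * m * (sum(vx_squared) / N) / V
# Macro route: the ideal gas law itself.
p_macro = N * k_B * T / V

print(p_micro / p_macro)  # close to 1, up to sampling noise
```

That the two routes agree is precisely what makes situation 1 the "easy" case: here the macro property really is just a statistical summary of the micro bricks.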

The question I was trying to answer in my previous post when I quoted Crowther is whether 2 and 3 are the same or maybe 2 is an example of 3. I thought you and she were saying that the QM and superconductivity phenomena are examples of EFT, emergence, complexity, or whatever you want to call it. Seems to me an understanding as described in 2 would allow me to construct a higher level from what I know about a lower one. That's inconsistent with my understanding of situation 3.

I hope you understand the question I'm trying to ask. Is my understanding of situation 3 wrong?

Did I mention this is fun?
T Clark March 23, 2018 at 23:20 #166017
@StreetlightX & @Pierre-Normand

Well, since you guys left me holding the bag, I had to go ahead and finish the Crowther paper. I got very lost in the physics, but for the first two thirds of the paper I was able to follow well enough to understand the kinds of issues that are involved in the idea of emergence. It's a lot more nuanced than I was expecting.

Quoting T Clark
3) Situations where higher levels "emerge" from lower levels in ways that are fundamentally unpredictable. These are what I've always thought of when I think of "complexity."


What I get is that what I wrote above is right except when it's not or when you look at what emergence means a little differently. Epistemological vs. ontological emergence. Universality, autonomy, novelty, underdetermination. Holy smokes. I really like how clearly Crowther writes even though it was so hard for me to understand. I enjoyed reading it. I think I'll read it again after I've done some more homework.

And I still don't understand what symmetry breaking means. No, please. Don't explain. Like I said - I'll do some more homework.

I appreciate you walking me through this.
apokrisis March 24, 2018 at 00:37 #166040
Reply to T Clark I find Crowther’s presentation very confused - upside down indeed.

But anyway, the essential point about superconductivity is that QM gets restored as having an effect at low temperature.

Electrons themselves emerge as the universe cools enough for them to pop out of the quantum foam as classical point particles with individual momenta or incoherent kinetic energy. They are fermionic and disentangled. So in EFT fashion, symmetries are broken and new universalised properties are established. Electrons become a universal thing described by their own emergent EFT - as in classical electrodynamics.

But cool down their world even further - and make it the world of a confined metallic lattice, another pretty arbitrary and non-universal or non-holonomic constraint - and you get a recovery of a quantum possibility. You can get point particles behaving like collective phonons with bosonic statistics. The asymmetry, or broken chiral symmetry, of being a fermion can be reversed by teaming up as "Cooper pairs".

So a general rule of the quantum level - Pauli's exclusion principle - becomes again explanatory of what is going on at a "super-classical" level. Electrons could be melted back to pre-fermionic matter - effectively - by travelling back in time to the white heat of the Big Bang. And they can also lose their fermionic identity at the end of time, when things get so cool that pairs of electrons can find it easy to entangle as symmetry-increasing combos. They may still have some individual kinetic momenta, but that is now so weak it can't combat the stronger urge to unite as a way to meet the energetic constraints imposed by a motion through a metallic lattice.

This all makes BCS a bad example of a "tower of EFTs". It kind of combines two different things at once. And that is confusing.

Let's step back a bit.

The key thing about EFTs is they are about a collective mode of organisation being spread out absolutely everywhere and so becoming a “law”. It is about a phase transition that occurs when a limit is "crossed".

For example, in a world of gaseous H2O, it is the law that the molecules behave everywhere and at all times like a liquid when the temperature and density crosses a critical threshold. The collective behaviour - due to weak van der Waals forces - overcomes the individual kinetic energies of the molecules. And so a new state of matter arises - one that itself has new relevant properties, like being a reasonably universal solvent, and having an unusual latent heat capacity, that are very relevant to explaining chemistry and even life itself.

So emergence is all about a collective mode of organisation becoming a universalised property. And it is "a property" - something that justifies "more is different" - from the point of view of the further complexity of which it becomes the generalised platform.

In a weakly emergent way - a supervenient way - water is just water. It is nothing more interesting than a form of organisation where an integrative force has taken over from a previously differentiating one. I mean water doesn't even seem an unpredictable surprise if you know that the same H2O molecules have both a kinetic energy and a van der Waals attraction as properties. We ought to be able to calculate "wateriness" from first principles knowledge of what was already lurking in the background - a suppressed integrative urge.

But then water, once it forms, becomes a generalised substrate that reveals itself to have unpredictable properties - like being just right as a medium for complex organic chemistry and life. Higher level theories discover these properties in water. It takes further levels of context to give water these new measurable qualities. The emergence is essentially semiotic.

This gets to one of my discomforts with Crowther's presentation (which I only skimmed through, I admit). As a hierarchy of emergence, it doesn't start from the simple and move towards the complex - the natural or systems way of thinking about emergence. Instead it confuses things by trying to stick to the emergence of the simple from the simple - the "simple" meaning whatever is cosmically and universally the most simple condition given the time since the Big Bang.

Now this is a useful view. But it depends on accepting the reality of a thermodynamic direction to time. And of course, standard reductionist or mechanical physical models deal in laws that are fundamentally reversible or time-symmetric. They thus hardwire in a presumption of linearity. Going forwards and going backwards are two views of the same thing. There just is no room for the non-linearities that are these abrupt phase transitions, these sudden changes in state, when a collective mode spreads exponentially across a system to become a new universalised property.

Mechanical models that wire in time-symmetry cannot see emergence for this reason. Which is why collective modes of behaviour seem so spooky or epiphenomenal - not properly physical, causal and real.

In the big view, physics is working towards a directional understanding of a temporal cosmos. The Universe begins in a "pure" quantum state at the Big Bang. And it is headed towards a "pure" inverse of that state at the Heat Death. Everything will become quantum again - an undifferentiated sea of fluctuations - but the other way round. All that was as hot as possible will be as cold as possible. All that was as small as possible will be as large as possible.

Complexity and classicality are thus something that arise in the middle of it all. It is rather like a Benard cell.

What many don't tell you is that a Benard cell only appears briefly. You have to keep the oil in the pan at precisely the right "boiling" point to see a simple pattern of convection currents. Keep heating and the hexagons break up into turbulence.

So simple global order is what you get at the edge of chaos. The starting of a flip, the beginnings of an inversion, the middle of the change. It is the first big fluctuation that heralds the descent into that kind of fluctuation erupting fractally over all available physical scales.

Again, EFTs don't really capture this. They do describe fluctuations in the infinite limit. So they do describe the world that has gone to maximum fractal turbulence and now looks - if we could stand outside it - to be a flat and simple surface. A platform with universal properties that is now suitable for the next level of more complex hierarchical organisation, based on the further use of those properties.

So EFT thinking is good for accounting for how the quantum realm becomes thermodynamically cohered at a certain physical scale and takes on the universal properties of the classical realm. And then in turn, biophysics is now investigating the nanoscale quasi-classical scale (of water as a solvent matrix) and showing how its hidden non-linearity is a resource of the semiotic properties that life needed to exist.

So you have the sweeping view of physics where all the quantumness of the cosmos can get brushed under the carpet when things get large and cold enough for electrons and protons to be modelled in terms of individual particles with particular momenta and forces.

The quasi-classical nature of the electron only becomes an issue again when electron behaviour has to be explained in the highly atypical scenario of a chilled metal lattice. Or when we roll forward to the end of time when quantum effects find a way to decay all matter back to cosmic radiation.

And then biology has to get interested in the particular and atypical scenario that is the quasi-classical scale of water. Life depends on the fact that it can insert itself into the zone of criticality, or the edge of chaos, where things are not generally stable but generally poised on the edge of instability. The whole point now is that the EFT outcome - the hitting of a universalised limit - hasn't happened. Instead, life is playing around at the point where it is in-between - like a Benard cell. The collective mode is only being expressed in a fractured or fractal fashion. It is a bit quantum and a bit classical. And so the emergence of a property is this very in-betweenness - this lability, this instability, this duality, indeed this liveliness.

So again, an important distinction that Crowther's presentation seems to brush over.

There is EFT emergence - where a system has just gone to the limit and universalised a collective mode as a system-wide homogenous property. A phase transition has happened and there is no going back.

Well, reheating or depressurising the system can reverse its state. Water can re-evaporate. Particle accelerators can melt individual particles. But generally - if we are talking about the fundamental laws of a universe - then the direction of time itself locks in the transitions. EFTs seem the rule because they can safely presume symmetries have been broken, lines have been crossed, and there is no general way of ever going back. The general change is now dead and buried, safe to encode as an emergent law.

Then there is this other kind of emergence that is now becoming really important for understanding complexity. Instead of wanting a world that is fundamentally dead and stable, it needs a world poised on the brink of change. It needs instability - as that then becomes something that it itself can exist to control. If the world is poised for bifurcation - ready to go either way towards being fully a gas or fully a liquid - then there is room to insert "intelligence" or semiotic mechanism in that gap and mine it in useful fashion.

This was the dream of Maxwell's demon, by the way. And life - as negentropic or dissipative structure - can behave in that fashion, adding Benard cell like order that manages the transition from smooth flows to turbulent ones.
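To make the "poised for bifurcation" picture concrete, here is a toy sketch - just the textbook pitchfork normal form dx/dt = rx - x³, my own illustration rather than anything from Crowther or the thread - showing how, past the critical point, a vanishingly small nudge gets amplified into a decisive choice of branch:

```python
# Minimal sketch of symmetry breaking at a pitchfork bifurcation:
# dx/dt = r*x - x**3. For r < 0 the symmetric state x = 0 is stable;
# for r > 0 it goes unstable, and a tiny nudge decides which of the two
# mirror-image branches (x = +sqrt(r) or x = -sqrt(r)) the system falls into.

def settle(r, x0, dt=0.01, steps=20000):
    """Integrate dx/dt = r*x - x**3 by forward Euler; return the final state."""
    x = x0
    for _ in range(steps):
        x += dt * (r * x - x ** 3)
    return x

# Below the critical point, perturbations die away: the symmetry survives.
print(settle(r=-1.0, x0=0.001))    # decays back to ~0.0

# Above it, the same tiny nudge is amplified until it hits a branch.
print(settle(r=1.0, x0=+0.001))    # settles near +1.0
print(settle(r=1.0, x0=-0.001))    # settles near -1.0
```

The point of the toy: below criticality the symmetric state absorbs every perturbation, while above it the same microscopic nudge is what picks the macroscopic outcome - which is exactly the gap where "information" could insert itself.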

So yes, emergence is a tricky subject as Crowther says. But I am against the idea that it is essentially pluralistic. Let's not leap straight from the one to the many.

For me, a pan-semiotic metaphysics does bring all the strands of the story together. A triadic or hierarchical ontology makes complexity itself irreducible. And spontaneity or instability is part of that irreducible triad.

So between the one and the many is the three-ness of actual hierarchical causality - the true systems or holistic view. This is where it is all leading as you muddle through "more is different" and the move away from an essentially dead and timeless, reductionist or constructionist, physics.

T Clark March 24, 2018 at 00:59 #166044
Quoting apokrisis
So yes, emergence is a tricky subject as Crowther says. But I am against the idea that it is essentially pluralistic. Let's not leap straight from the one to the many.


Thanks as always. Keep in mind where I am in my understanding. When I read Crowther's descriptions, or yours, I'm picking out pieces that I understand or think I may understand, almost like condensation nuclei, where the concepts that are new to me can come together. When that happens enough, I can start to put together a more comprehensive understanding of what is going on. Then I go out and test that, find out where I'm wrong, and go at it again. I'm still at an early stage in that process.
apokrisis March 24, 2018 at 01:12 #166050
Reply to T Clark Yeah. But how do you not understand symmetry breaking? :)

You are an engineer. You must be familiar with classic examples like the magnetisation of a bar magnet, or an Ising model.

What about a buckling beam? Or the onset of turbulence? These are very concrete examples.
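For the Ising case, here is a toy simulation - my own sketch with the usual conventions J = 1, k_B = 1, not taken from any of the papers discussed here - that shows the two regimes concretely: start the lattice fully magnetised and watch whether the collective order holds itself together or melts.

```python
import math
import random

def magnetisation(T, L=16, sweeps=300, seed=1):
    """Metropolis dynamics for a 2D Ising model on an L x L periodic lattice,
    started from the fully magnetised (broken-symmetry) state.
    Returns |magnetisation per spin| after the given number of sweeps."""
    rng = random.Random(seed)
    s = [[1] * L for _ in range(L)]          # all spins up to begin with
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        # Sum over the four nearest neighbours (periodic boundaries).
        nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
              + s[i][(j + 1) % L] + s[i][(j - 1) % L])
        dE = 2 * s[i][j] * nb                # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            s[i][j] *= -1
    return abs(sum(map(sum, s))) / (L * L)

# Far below Tc (about 2.269 for the 2D square lattice) the collective
# order is self-maintaining despite thermal kicks:
print(magnetisation(T=1.0))   # stays close to 1
# Far above Tc the individual thermal flips win and the order melts:
print(magnetisation(T=5.0))   # drops close to 0
```

The Hamiltonian treats up and down spins perfectly symmetrically at both temperatures; the asymmetry of the low-temperature state is purely a matter of the collective mode having locked in.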
T Clark March 24, 2018 at 01:16 #166053
Quoting apokrisis
But how do you not understand symmetry breaking?


We don't talk about it in those terms and no one ever described it that way back in my physics classes. Don't try to explain it. I need to do more homework.
apokrisis March 24, 2018 at 01:49 #166062
Quoting Pierre-Normand
Have you read the paper by Noah Moss Brender that StreetlightX linked to recently (Sense-Making and Symmetry-Breaking, Merleau-Ponty, Cognitive Science, and Dynamic System Theory)?


Just read it quickly. I like the phenomenological slant. And I like the nice summary of symmetry-breaking as differentiation or individuation.

The point of my last long post is that symmetry-breaking is really a dynamical process. Traditional physics has only focused on the EFT view - where the dynamics have gone all the way to equilibrium or heat death. Asymmetry has been achieved for good in that the locally emergent forms have now become the universalised rule - a habit to be taken homogenously for granted.

But a biological or semiotic view sees the larger reality - the one in which the dynamism is still in play for the world. Now contextuality is a living parameter. The rule of some collective form applies only in some relative degree. And it is the hidden non-linearity of that phase transition which makes it eminently "switchable".

If change was always boringly linear and gradual - adiabatic - the Cosmos would never have been anything other than a cooling~expanding bath of thermal radiation. Complexity - with all its own emergent properties and laws - depends on the orthogonality of abrupt ruptures.

Again, emergence seems mysterious to reductionists as its essential story is hidden off at right angles to the universal gradualism that bottom-up summation or construction can imagine. But sudden transitions are completely natural. Non-linearity is more fundamental than linearity - a dynamicist would say.

So once instability (as in quantum indeterminism or Peircean tychism) is understood as the more fundamental condition of nature, then the best explanation of nature is the pan-semiotic one - the one that speaks directly to the context that controls the abrupt switches in global state.

The cosmic issue is not how does change happen in an essentially timeless and changeless world, but how can unbound change become so firmly regulated? It is all about the emergence of downward acting constraints. The meaningless fluctuations can pretty much be taken for granted.

On Brender's paper, what made me think was the contrast between the phenomenology of Merleau-Ponty and Peirce.

One tries to assimilate ontology to the fundamentals of sense-knowledge. The other instead tries to assimilate it to the fundamentals of logic or rational inquiry.

I think Peirce goes directly to the issue of form - the structure of intelligibility itself. That is why he shines for me. It is phenomenology. But it aims for that general structural logic.

While the Merleau-Ponty school is headed towards a sense-data view - one that risks taking qualia too seriously as drops of conscious experience.

So yes, you can pull back from that and steer the phenomenology towards a modelling relations view. You get back towards a semiotic story. You can understand how it fits with an enactive approach to cognition, or Rosen's relational biology.

But that makes this route - veering towards the senses and away from logic - a rather indirect way to get to the ultimate destination of a process view of structure and form, a metaphysics based on existence as intelligible organisation.

Still a nicely argued paper though.
Streetlight March 24, 2018 at 02:21 #166071
Quoting T Clark
Yes, I understand that. Crowther doesn't use the word either. She actually says "EFT" in order to avoid that.


Just to clear up some terminology, Crowther does not use EFT as a synonym for 'emergence'; indeed, the whole question is whether or not EFTs admit emergence. EFT - effective field theory - is just a way of describing the behavior of a system at a particular scale, while ignoring parameters that, for the most part, are not relevant - parameters that might become more relevant at scales other than the one you're interested in. Thus, as you said, we ignore QM at macro scales, and we ignore relativity at velocities much smaller than the speed of light. And Crowther absolutely does use the word emergence: the crux of the paper is its attempt to provide nothing less than 'a positive conception of emergence', which is literally the bolded subheading of one of the paper's subsections.

Another important point to make is that Crowther's conception of emergence is theory-relative: it has to do with emergence between our descriptions of systems. She is explicit about this: "The conception of emergence espoused in this paper is presented as being one that is appropriate and important for understanding the relation between ‘levels’ of theories in physics". So it's not so straightforward as speaking about - as you put it - 'situations where higher levels "emerge" from lower levels in ways that are fundamentally unpredictable'. As I mentioned earlier to Caldwell, this is similar to Rosen's understanding of 'complexity', where what is complex is relative to our ability to model a system - specifically our ability to arrive at a 'largest model' capturing all the dynamics of a system.

Streetlight March 24, 2018 at 02:38 #166079
Quoting apokrisis
While the Merleau-Ponty school is headed towards a sense-data view - one that risks takes qualia too seriously as drops of conscious experience.


It's important to note that this is entirely implausible on any reasonable reading of M-P, who spends page after page in the Phenomenology arguing against such a view.
Pierre-Normand March 24, 2018 at 03:00 #166085
Quoting StreetlightX
Massimo Pigliucci has discussed both on his blog)


I had recommended those four emergence-themed blog posts when I had had my extended debate regarding reduction and emergence on this forum last year. Worthy of mention is George Ellis's three enlightening interventions in the comment section where he responds to Sean Carroll's objection to Pigliucci's strong emergentism. Ellis also supplies an extended quote from R. B. Laughlin. This is all very relevant to what we've been discussing here. (Here, here and here)
Streetlight March 24, 2018 at 03:39 #166089
Reply to Pierre-Normand Ahh, the Ellis comments are wonderful! This in particular:

"Whether we agree on causation or not depends on the weight you put on the words “nothing but” in the phrase “are nothing but constrained, structured, microphysical causal features”. I think the explanation above says it’s a mistake to use the phrase “nothing but”, because significant other causal effects are at work. Part of the problem for a true reductionist is that the electronic gate states are “nothing but” specific states of quarks and electrons; and these again are “nothing but” excitations of superstrings – if they exist, which may or may not be the case. Which level are you claiming is the true microphysical level? The embarrassment for a true reductionist is that we don’t know what the lowest level structure is – we don’t have any viable theory for the bottom-most physical states. Thus if we take your phrase at face value, all physics is “nothing but” we know not what. Why not leave those words out?"

At the very least, I like it because it chimes nicely with what I said regarding the use of '... is only' in reductionist discourse. And of course, the point that "the essential nature of the lower level entities is altered by the local context" - just is the point of this thread. Although the interesting question is then - just how 'essential' is the so-called 'essential nature' if such 'essence' is altered by local context? What kind of plastic essence is really at stake here? The explanation regarding Cooper pairs and their literal non-existence before the formation of the corresponding lattice structure is also really cool. I'm going to have to keep that one in my back-pocket.
frank March 24, 2018 at 04:02 #166097
Reply to StreetlightX Some methodological reductionism continues to be our baseline whether we know what's down at the lowest level or not. I think that's the answer to "why not leave the words out?"

I can't see that changing without a dramatic reason. I wonder what such a reason would look like.
Streetlight March 24, 2018 at 04:10 #166099
Reply to frank As I said earlier, one can accept - one should accept, as a necessary condition of conducting science at all - methodological reductionism without at the same time acceding to any kind of ontological reductionism. Mostly because the latter is implicitly, if not explicitly anti-science.
frank March 24, 2018 at 04:22 #166103
apokrisis March 24, 2018 at 23:03 #166294
Quoting StreetlightX
It's important to note that this is entirely implausible on any reasonable reading of M-P, who spends page after page in the Phenomenology arguing against such a view.


Hah, yes. I admit that saying M-P was veering right towards a sense-data view was a wild exaggeration on my part. M-P's basic gestalt approach is of course very good - exactly the systems causality I would argue. And Brender gives a wonderfully clear account of that.

So I am thinking more of the ambiguity that Brender identifies....

Merleau-Ponty distinguishes his position from transcendental idealism by insisting that form does not require a consciousness to constitute it. But in order to distinguish his position from materialism, Merleau-Ponty argues that physical form is a perceptual being, “conceivable only as an object of perception.” (SB, 144) Even if we understand perception as a bodily rather than an intellectual activity, this formulation seems to reinscribe the logic of transcendental idealism at the level of vital behavior, placing us right back in the old antinomies:

https://philpapers.org/archive/MOSSAS-2.pdf


...so the issue is there according to the paper.

And Brender argues that Thompson rescues M-P from this veering into an idealist grounding. Although I then agree that M-P is always aiming at an essentially logicist account, a structural account, in intuitively understanding that the forms of nature are emergently self-organising. They will snap into place holistically in gestalt fashion.

Really, the paper is very good. And rereading it with more time, I can see better now how it addresses a weakness in the Peircean story. It allows one to place a sign in the mediating zone that is behaviour.

The relation between consciousness and nature is split in two by the appearance of behaviour as a mediating term...


From a more strongly logicist view, a sign can seem like a token, a static symbol. It is hard to see how "a mark" can stand for an actual lived connection with the world. It sounds too much like still being an idealist and representational approach to the issue.

But behaviour puts the sign out in the gap between the mind and the world. It can be a two-way connection.

And so we can understand the sign as in fact a switch or gestalt-driven bifurcation. This is what Brender makes clear in discussing dynamical systems theory and autopoiesis. What we call the sign is really the mind putting out there a binary choice - a suggested instability that represents a dialectical choice as to which way to jump. And then the world bumps up against this neurally-encoded suggestion and tips the balance one way or other with gestalt counterfactual definiteness.

So a sign is not something already an object in a single definite state waiting to be read. A sign is a binary choice being suggested - an instability that must have its poised symmetry broken in one direction or the other.

Do I see a cat? The image of whatever it is must be resolved decisively by some set of binary perceptual judgements. Its ears are pointy and perky enough not to be a dog or a rat. Its face is flat enough not to be a fox. Its tail is not too bushy nor too hairless.

So the signs aren't representational in themselves. They represent dialectical switches going off - logically-sharp hypotheticals about the world that then get nudged just enough to trip them decisively and result in a stable interpretation.

The larger story is that the world itself is formed - by its own much longer run history - of a whole concatenation of such symmetry-breakings. So the mind is only reflecting the dynamics of the world in a modelling relation based on this rapid fire Bayesianism. We are seeing in a flash - by throwing out there a whole set of super-sensitive switches - the history that produces "a world of objects, a world of medium-sized dry goods".

So what on one view seems like mere data processing - imposing a conceptual framework on arriving sense data - would be the more profound thing of recreating in a flash the entire causal history of a process that must lie behind the steady objects of experience. A cat is seen as a cat in a way that truly reflects the ontology of nature. The context, the constraints, that would be needed to conjure up a cat in a place at a time are part of what we perceive.
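That "rapid fire Bayesianism" can be caricatured in a few lines of code - a toy of my own, with made-up feature likelihoods, not anything from Brender's paper. Start from an evenly poised prior over "cat" versus "fox", and let each observed feature nudge the log-odds until the switch trips one way:

```python
import math

def tip_the_switch(prior_cat=0.5, evidence=()):
    """Sequential Bayesian update of P(cat) vs P(fox) from observed features.
    `evidence` is a sequence of (P(feature | cat), P(feature | fox)) pairs.
    Returns the posterior P(cat)."""
    log_odds = math.log(prior_cat / (1 - prior_cat))  # 0.0 means poised, symmetric
    for p_cat, p_fox in evidence:
        log_odds += math.log(p_cat / p_fox)           # each feature nudges the balance
    return 1 / (1 + math.exp(-log_odds))

# Made-up likelihoods: pointy ears, a flat face and a modest tail are each
# somewhat more probable under "cat" than under "fox".
features = [(0.9, 0.6),   # pointy, perky ears
            (0.8, 0.3),   # flat face
            (0.7, 0.4)]   # tail neither too bushy nor too hairless
print(tip_the_switch(evidence=features))  # the poised 0.5 tips decisively toward "cat"
```

With no evidence the switch sits at exactly its symmetric 0.5; each feature is a small asymmetry, and it is only their accumulation that breaks the poised state decisively.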

And of course the enactive turn in psychology captures that probing nature of signs very well. Unless there is "behaviour" - a move that reveals an effect - then there is nothing to connect the mental to the physical, the model to the world.

This in itself is hardly an original thought. It was prominent in Russian neuropsychology - the post-Pavlovian work on the orienting reflex. And also in cybernetics. See perceptual control theory in particular - https://en.wikipedia.org/wiki/Perceptual_control_theory

But it does give a better understanding of a semiotic sign as being a behavioural switch. It is not a sense-datum - a passive element already inscribed in some mentalese - but a choice of which way to jump put out there in the world as an active uncertainty. Only the reality of an interaction can nudge matters one way or the other. So the sign represents a dichotomy, a possible symmetry-breaking, rather than a point of reality, a bit of information, that somehow - magically - finds itself crossing the gap between the world and the mind.

SophistiCat March 25, 2018 at 08:45 #166368
Thanks for this discussion. I wasn't really impressed by the title paper, which trades in anecdotes that won't be very informative to non-experts, but in the end doesn't seem to express any clear vision; however, I'll be reading some of the linked materials.

By the way, Batterman has a topical article in the SEP: Intertheory Relations in Physics.

Other relevant overviews: Supervenience and Emergent Properties.

Perhaps some of you here wonder who these terrible reductionists are who hold such absurd and obviously untenable notions (not counting amateurs like Weinberg). Jaegwon Kim would be one such formidable foil (his work is discussed in above articles). His "Supervenience and Mind" is a locus classicus on the topic. But, to add to the general reviews cited above, he also has this short primer:

Emergence: Core ideas and issues

One point to note about Kim and some of the other authors who have written on the topic: Many of them come to it from the perspective of the philosophy of mind, and among their principal concerns are mental causation, epiphenomenalism, eliminativism (about the mental), and so forth. I find it frustrating that in their exclusive focus on the mental they often don't seem to see the larger context of such questions: science (not limited to cognitive science) and other explanatory projects. I am glad that this wider context is the focus of attention here, rather than the parochial questions of the "mental" vs. the "physical."
Dominic Osborn March 25, 2018 at 09:17 #166369
Quoting StreetlightX
No, that would be theology, not philosophy. Explanation follows the explananda wherever it goes, it does not subordinate it to prior stipulations.

In any case, give me arguments, not aphorisms. The latter are not worth much.


What’s theological about “Atoms and the void”?

Your “following the explananda wherever they lead” sounds like “following the Lord wherever He leads”. It’s your attitude—reverence and humility—that defines the object. All those nineteenth century scientists who were sons of pastors—but not just biological sons.

As to the aphoristic style (thanks for the compliment): I am reductionist about writing too.
Dominic Osborn March 25, 2018 at 09:19 #166370
Quoting SophistiCat
parochial


Gorgeous
Streetlight March 25, 2018 at 09:25 #166371
Quoting SophistiCat
One point to note about Kim and some of the other authors who have written on the topic: Many of them come to it from the perspective of the philosophy of mind, and among their principle concerns are mental causation, epiphenomenalism, eliminativism (about the mental), and so forth. I find it frustrating that in their exclusive focus on the mental they often don't seem to see the larger context of such questions: science (not limited to cognitive science) and other explanatory projects. I am glad that this wider context is the focus of attention here, rather than the parochial questions of the "mental" vs. the "physical."


I agree entirely. The focus on mind has always been a bit of a sideshow I think - albeit a deeply interesting one - but a sideshow nonetheless.
apokrisis March 25, 2018 at 20:55 #166504
Quoting SophistiCat
Jaegwon Kim would be one such formidable foil (his work is discussed in above articles).


Kim represents the view that emergence is "nothing but" the sum of the microphysics. So he stands at the other end of the spectrum to folk who think emergence is real and wholes can shape their own parts by downward causality.
Caldwell March 25, 2018 at 22:59 #166543
Quoting StreetlightX
But words like 'Absolutism' and 'Relativism' are just words, nominations. What does it matter if you call something 'absolutism' or 'relativism'? You haven't specified the difference these differences make.

Those concepts matter. We are having philosophical arguments, after all. We need to use certain concepts to explain what's going on here.
Relativism is context-driven, absolutism is fundamental-driven (you can also say here, foundation-driven). The reductionist explanation doesn't need context as a crutch. As you say, their explanation is "context invarience". Okay. But I do not find the criticism "context invariant" as damaging to a method of explanation, such as the reductionist's. This does not weaken their theory. Why not? It's because context-driven explanation must necessarily use some form of reference point to relativize. And this point of reference must necessarily come from the fundamental laws themselves, which the reductionist had already set out.

Quoting StreetlightX
As for the cat, what about it? Again, what's the relavence? I think it would be more helpful if you elaborated the stakes involved in invoking these things.

The significance of Schrodinger's cat, if we follow the context-driven explanation, is that it is necessarily observer-driven. We could not make the judgment unless we look inside the box. The mechanism inside the box is designed so as to leave us guessing -- it could go one way, or another. Reconstruction is observer-driven.

Quoting StreetlightX
"The main fallacy in this kind of thinking is that the reductionist hypothesis does not by any means imply a 'constructionist' one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. In fact, the more elementary particle physicists tell us about the nature of the fundamental laws, the less relevance they seem to have to the very real problems of the science, much less to those of societies". (Anderson, "More Is Different").

But I think the mistake here is misunderstanding the reductionist's account of reality. If they want to solve the problems of science, do not look at reductionism. The above quote sounds like they want to talk about the ethical treatment of knowledge as it relates to science. That's fine. They should develop a theory on how best to explain the universe, given the scientist and a context, without having to appeal to reductionism. We should know that reductionism is unforgiving. The very first principles they developed came prior to the modern scientific method. They earned their salt.
Caldwell March 25, 2018 at 23:01 #166544
Quoting Dominic Osborn
To a reductionist, or an idealist, these are the same thing.

Yes, I realize that they are.
apokrisis March 25, 2018 at 23:46 #166552
Quoting Caldwell
But I do not find the criticism "context invariant" as damaging to a method of explanation, such as the reductionist's. This does not weaken their theory. Why not? It's because context-driven explanation must necessarily use some form of reference point to relativize. And this point of reference must necessarily come from the fundamental laws themselves, which the reductionist had already set out.


The problem is that regular reductionist physics targets equilibrium descriptions of nature - nature that has "emerged" in the sense of crossing a critical threshold and now seeming utterly stable. The dynamics are so dead that any source of change or individuation has to be imposed as a further cost or effort.

And that is fine. The contextual or the relational drops out of the picture in terms of accounting for the proximate causes of change. It simply becomes an inert or a-causal backdrop, the taken-for-granted reference frame.

But the physics that is interesting to emergentists is the physics of the boundaries, the critical thresholds, where the dynamics are still poised or unstable - and hence, switchable. The zone of non-linear instability is where interesting things can happen because that is where information can start to insert itself into the process and control the bifurcations or symmetry breakings for its own reasons.

So there is a whole physics of instability - and the distal causes that can regulate that. And that kind of holistic physics is what can be made large enough to include complexity in general, and also in particular, as the complexity that has the autonomy of life and mind.

In other words, it becomes a physics that includes both information and matter. Where things are materially wobbling on the cusp, that is where information - or semiosis - can insert choices about which way to wobble things.
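The "wobbling on the cusp" picture can be made concrete with a standard toy model (my illustration, not from the thread): a particle at the unstable equilibrium of a double-well potential. At the cusp, an arbitrarily small nudge, standing in for the inserted "information", decides which stable state the system falls into; the fundamental law alone does not.

```python
# Toy sketch (hypothetical example): overdamped motion in the double-well
# potential V(x) = x**4/4 - x**2/2, i.e. dx/dt = -V'(x) = x - x**3.
# x = 0 is an unstable equilibrium ("poised on the cusp"); x = +1 and
# x = -1 are the stable wells. A vanishingly small initial nudge
# determines which well the particle settles into.

def settle(x, steps=1000, dt=0.1):
    """Integrate dx/dt = x - x**3 with simple Euler steps."""
    for _ in range(steps):
        x += dt * (x - x**3)
    return x

print(settle(+1e-9))  # a tiny positive nudge: settles near +1
print(settle(-1e-9))  # a tiny negative nudge: settles near -1
```

The dynamics are fully deterministic, yet which outcome occurs is fixed entirely by the perturbation at the unstable point, which is the sense in which instability is where "choices" can be inserted.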

And it is not as if quantum theory - as our most fundamental theory of materiality - isn't already telling us just this. It is all about the instabilities that get regulated by a mysterious "collapse" - the insertion of an "observer" asking questions that stand for some particular point of view.

So all our best physical theories are completely mechanical and observerless - right until we get to the point where the fundamental instability and contextuality of nature can no longer be ignored in our theory building.

Holism has already beaten reductionism at the level of metaphysical generality.

Reductionism is the most efficient, or least information-requiring, way of modelling a material reality where all the symmetries have been broken to the point that the system has gone to stable equilibrium and all the contextuality can be summed up by a simple macro-state number.

But science keeps developing. Over the past 50 years, it has started to get its head around the more general case of modelling a non-linear and relational world.

It's kind of like how the Euclidean presumptions of Newtonian cosmology turned out to be a highly particular view of the total physics that was possible. Non-Euclidean geometries were the more generic physical model in fact.

So yes, we must impose a stable reference frame to allow some system of measurement - a firm base on which we can construct a story of local deterministic causes acting in an a-causal void.

But it will always be revealed that this in itself is a choice made by an observer. And so it can't be the largest model of the physics if it doesn't also include that observer.

Which is why we need the kind of holism that is about information or semiosis regulating the inherent instability of nature. Biology, for one, is on to it.

Caldwell March 26, 2018 at 04:42 #166577
Reply to apokrisis All in good spirit. Like I said earlier, if this discussion is about the normative function of an account of reality, there is no need to bother with reductionism. The reductionists are situated where they want to be.

Quoting apokrisis
Holism has already beaten reductionism at the level of metaphysical generality.

I don't know anything about this. Reductionism is not about generality.