Determinism vs 'Intelligent Design'
As I see it, the problem with 'hard determinism' is that, taken to its logical conclusion, it would imply that everything that exists now and will exist in the future was somehow present, in a program-like form, at the moment of creation. Such an assumption is uncomfortably close to the discredited Intelligent Design in terms of inevitability.
The following analogy is worth considering:
Suppose you throw a stone into an infinite body of water (thus eliminating the effect of the shoreline shape). The pattern of the radiating waves will depend entirely on the shape of the stone. You could therefore consider the shape of the stone as being a 'program'. Such a process would be purely deterministic. However, things would change dramatically if the body of water had a fractal shoreline.
On hitting the shore, the waves would regenerate into secondary, tertiary, and further wavelets through effects of reflection and interference that could carry on indefinitely.
One might still say that this process of iteration is caused by the throw of the stone, but it nevertheless acquires a life of its own that is detached from the original plunge of the stone.
Of course, my analogy does not stand up: there were no equivalents of the pond and the shore at the moment of creation. The point I am trying to make is that at a certain level of complexity there could be a break in the linearity and inevitability of the causal chain originated by the Prime Cause. One cannot rule out some mechanism, be it quanta, chaos, or consciousness, that acts in a remotely similar way to my analogy.
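The "chaos" mechanism mentioned above can be illustrated concretely. The following sketch is my own illustration, not from the thread: the logistic map is a fully deterministic rule, yet two starting points differing by one part in a billion soon end up far apart, so long-range prediction fails in practice even though each step is law-governed.

```python
def logistic_orbit(x, r=4.0, steps=50):
    """Iterate the deterministic logistic map x -> r*x*(1-x)."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

# Two initial states differing by 1e-9: the same "law", almost the
# same "initial plunge of the stone", yet the orbits diverge.
a = logistic_orbit(0.300000000)
b = logistic_orbit(0.300000001)
print(abs(a - b))  # the gap has grown enormously after 50 steps
```

This is only epistemic unpredictability, not a genuine break in the causal chain, but it shows how a deterministic process can "acquire a life of its own" for any finite-precision observer.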
Comments (18)
:razz:
That would be an unusual way to use the term 'intelligent design'.
There have been a lot of discussions around this type of issue. They often come down to a question of the validity of a reductionist approach to science. The behavior of a complex, dynamic physical system will be consistent with the so-called laws of physics. That does not mean that the behavior of the system is predictable, even in theory, by those laws. It works top down, but it doesn't work bottom up.
During one of those previous discussions, someone, I forget whom, suggested "More is Different" by PW Anderson. I found it very helpful. Here's a link:
http://robotics.cs.tamu.edu/dshell/cs689/papers/anderson72more_is_different.pdf
Here's some of what Anderson says:
[i]The reductionist hypothesis may still be a topic for controversy among philosophers, but among the great majority of active scientists I think it is accepted without question. The workings of our
minds and bodies, and of all the animate or inanimate matter of which we have any detailed knowledge, are assumed to be controlled by the same set of fundamental laws, which except under certain extreme conditions we feel we know pretty well.....
....The main fallacy in this kind of thinking is that the reductionist hypothesis does not by any means imply a "constructionist" one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. In fact, the more the elementary particle physicists tell us about the nature of the fundamental laws, the less relevance they seem to have to the very real problems of the rest of science, much less to those of society. The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. The behavior of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of a simple extrapolation of the properties of a few particles. Instead, at each level of complexity entirely new properties appear, and the understanding of the new behaviors requires research which I think is as fundamental in its nature as any other.[/i]
Actually it works bottom up, too, once one realizes that you need two things in your knowledge before you can construct: all the laws, and the initial state (including movement).
The initial state does not need to be some sort of moment of creation; any time point in history or in a future time is sufficient if one can fathom the state then, and the laws that govern movement.
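The claim that any time point can serve as the "initial state" can be made concrete. A minimal sketch (my illustration, with a toy law of motion I made up): given the update rule and the state at any instant, predicting forward from mid-history gives the same result as predicting from the start.

```python
def step(state):
    # toy "law of motion": position advances by velocity each tick
    pos, vel = state
    return (pos + vel, vel)

def evolve(state, steps):
    """Apply the deterministic law repeatedly."""
    for _ in range(steps):
        state = step(state)
    return state

from_start = evolve((0, 2), 10)     # predict t=10 from the t=0 state
mid_state = evolve((0, 2), 4)       # state observed at t=4
from_mid = evolve(mid_state, 6)     # predict t=10 from the t=4 state
print(from_start == from_mid)       # True: any time point suffices
```

Under determinism the two predictions necessarily agree, which is the point being made: no privileged "moment of creation" is required, only a complete state and the laws.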
If, by this, you mean that higher levels of organization can be predicted from the laws of lower levels, Anderson and I disagree.
Yes, higher-level organizations can be predicted, but their laws cannot be derived from knowing how lower-level organizations work. However, it was given as a premise that all the laws are known, including those of lower, higher, and intermediate levels of organization. Therefore determinism (given the state at any one instant and all the laws) rules.
I don't know if they still use the terms "higher" and "lower" level organizations... that's a sort of matterism, or substantism. PC is big-time at the forefront of current academic philosophical practice -- this is the first reason I only know about philosophy what I can intuit, because I was thrown out of the university the day before classes started.
If reductionism is true, then indeed it must work bottom up. That's not to say that scientific research should be directed toward (say) fully accounting for biology with quantum field theory - that's impractical, but it is in principle possible to do so - or at least would be if our knowledge of fundamental physics were complete. If it is not possible, this implies there are some higher level properties that are ontologically emergent (and thus truly unpredictable), which contradicts reductionism.
Reductionism does imply that the current state of the universe was in principle predictable at the big bang. Quantum indeterminacy means that such a prediction would actually have been of a huge number of possible states of the universe, of which the current state is but one.
You and I seem to agree. If you can't predict the behavior of complex systems from the bottom up, then a reductionist approach is misleading. Since bottom up prediction is not possible, therefore reductionism is wrong. QED.
Well, not wrong really - as I said, misleading. Reductionism is a metaphysical approach. It isn't right or wrong, it's more or less useful in specific situations. It's very useful when dealing with subatomic particles. It gets less so very quickly when you leave that size scale.
Reductionism could be an approach, but I've only seen it used as an ontological commitment- so from that perspective it is either consistent with reality, or it isn't. Its converse is ontological emergentism, which is the claim that some higher level properties are not a product of lower level properties. Consciousness is cited as the most likely example of ontological emergence.
In my opinion, those who think of reductionism as a yes or no thing are misguided. Anyway, now you've seen it used as a metaphysical approach. I'm not the only one who sees it that way. It's a mainstream, not to say predominant, view.
Quoting Relativist
My understanding of what you call emergentism is not that higher level properties are not the product of lower level ones, only that they are not predictable from them. Those are not the same thing.
There are two flavors of emergentism: ontological and epistemological. I think you're referring to epistemological emergence, since you're accepting that higher level properties are the product of lower level properties, but not predictable from them. Ontological emergence is stronger: it entails the emergence of novel properties that exist exclusively at the higher level and that cannot, in principle, be reduced to fundamental physics. Consider mental causation: our minds have causal effects on substances in the world; is this mental activity reducible to particle behavior (reductionism is true), or does the mental activity entail ontological emergence from the material in our brains (reductionism is false)? If you're interested, the Stanford Encyclopedia of Philosophy has an article on this (here).
I took a look at the SEP article, in particular in relation to the distinction between epistemological vs. ontological emergence. I must admit, I don't get it. It seemed like reshuffling words to complicate something that should be much simpler. Anyway, I don't see how the distinctions discussed make any difference to the argument I was making.
I'll spend some more time with the SEP article.
In the sense that both share inevitability and predestination.
One couldn't believe in intelligent design where there's some indeterminism and free will?