How can chance be non-deterministic?
If I throw a die, the chance of throwing any of the six numbers is 1/6. Yet the die rolls, determined, towards its destined number.
Our lack of knowledge gives rise to chances. If we somehow could know the initial state of the die and its exact interactions with the environment, then the final number could be known.
Well, that's the naive argument. In fact it can't be known, even in principle. But that doesn't mean the process is not determined. It is.
You can say that chance is a subjective feature that we project onto, for example, the world of dice. An ideal die has a 1/6 chance of showing each of its six faces.
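The classical 1/6 claim can be checked numerically: simulated throws of an ideal die converge on equal relative frequencies. A minimal Python sketch (the seed and trial count are arbitrary choices of mine):

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the run is reproducible
N = 60_000
counts = Counter(random.randint(1, 6) for _ in range(N))

for face in range(1, 7):
    freq = counts[face] / N
    print(f"face {face}: {freq:.4f}")  # each hovers near 1/6 ≈ 0.1667
```

The frequencies approach 1/6 only in the long run; any single throw remains, on the classical view, fully determined by its initial conditions.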
Then quantum mechanics shows its face. There, nothing is determined except the evolution of the wavefunction. If we throw a quantum die, each outcome still has a 1/6 chance of appearing. But the process leading to an outcome simply isn't there. We don't project chances out of a lack of knowledge; the chances are objective properties. That's the main difference from the classical view.
But how can this be? Einstein asked this question by means of a statement: "Der liebe Herr Gott würfelt nicht" ("The good Lord does not play dice"). How can there be pure chance, without a deterministic substrate giving rise to our ignorance? The usual reply is that the quantum world is simply very different from the everyday world we experience. But why should that be? That's just a presupposition. The wavefunction may be weird, like its collapse. That's true. This gives rise to the "shut up and calculate" attitude. Only the math, the predictions, and the observed values and their distributions matter. This attitude doesn't look at the heart of the matter, and sometimes even denies its existence.
What's the heart of the matter? The heart is what is actually going on. How can there be no determined process going on behind the chances, giving rise to a new kind of chance?
This is if we look at the quantum dice as an isolated system. There are no isolated systems in reality though (except maybe the universe itself - we don't know).
So when we throw the dice in reality, what we're looking at is not just the wavefunction of the dice - but the wavefunction of our throw, the wavefunction of the air, the wavefunction of the table, etc. interacting with each other, is it not?
Now, I'm a donkey when it comes to math but I'd imagine if we were to add up all those wavefunctions that interact with our dice, the result would be something like the determination of our dice throw, no?
I was talking about a metaphorical quantum dice. Sorry if I wasn't clear. Thanks anyway! :smile:
What rule?
Also at the micro-level, more likely observations appear more often. Statistical ensembles of measurements show you the form of the wavefunction, if all states are prepared the same. This doesn't address my question though.
I think that goes along rather well with what I said.
Quoting Hermeticus
It's obvious then that a higher number of influencing factors (more particles interacting at macro-level) is more determinable than very few, or no influencing factors (micro-level, singular particles).
In fact, we can know. An experiment has been planned to discern between hidden variables and pure chance. It's a thought experiment and not yet possible to actually perform, but the idea is to measure arrival times.
The subjective interpretation of probabilities as representing ignorance, which you appear to be assuming, is a logical fallacy in my understanding. Lack of knowledge should give rise to possibilities only. Moreover, it is impossible for anyone to distinguish ignorance from objective uncertainty before the fact. Such distinctions can only be drawn after the fact.
To represent uncertainty in terms of probabilities isn't to assign a particular distribution, but to assign a set of distributions, which is sometimes referred to as assigning "imprecise probabilities". In the case of complete ignorance of a thrown die, one should assign the entire set of distributions with six possible outcomes, which amounts to saying that one only knows that there are six possible outcomes.
Furthermore, having described one's state of ignorance by using a set of distributions, one shouldn't then average over the set, as Bayesians often do, to obtain a precise "ignorance prior", for this fallacious practice amounts to an attempt to extract information from ignorance.
What do you mean? Perform the question? Until now it's just a thought experiment, like all experiments are at first.
It's doable but not yet. Work in progress.
Dunno. I read it on a physics forum.
I'll give it a try. A deterministic reality behind QM (as 't Hooft and Bohm suggest, and I do too) would lead to definite particle trajectories. These result in different arrival times than pure chance gives you. If you calculate arrival times from pure chance, and different times are actually measured, then there is a process behind the chances, which seems the more reasonable view.
It is imprecise because probability intervals are assigned to outcomes, rather than numbers.
e.g. P(die throw = six) = (0, 1)
Which only expresses the fact that throwing a six is possible, but not certain.
What does this mean? In words?
It is saying that the physical propensity for obtaining the respective possible outcome, is above zero and less than 1. Without additional information, or assumptions, a more precise set cannot be assigned to the outcome.
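The interval idea can be made concrete. Here is a minimal Python sketch of my own (an illustration, not a standard library), representing a state of ignorance as an open interval per outcome and checking whether a precise distribution is consistent with it. Note that both the fair and a loaded die fit total ignorance, which is the point made above about not conflating ignorance with a uniform prior:

```python
# Total ignorance about a six-sided die beyond "six outcomes are possible":
# every outcome gets the open probability interval (0, 1).
ignorance = {face: (0.0, 1.0) for face in range(1, 7)}

def consistent(dist, intervals, tol=1e-9):
    """A precise distribution is consistent with an interval assignment
    if it sums to 1 and each outcome's probability lies in its interval."""
    if abs(sum(dist.values()) - 1.0) > tol:
        return False
    return all(intervals[o][0] < p < intervals[o][1] for o, p in dist.items())

fair = {face: 1 / 6 for face in range(1, 7)}
loaded = {face: (0.5 if face == 6 else 0.1) for face in range(1, 7)}

print(consistent(fair, ignorance))    # True: the fair die fits...
print(consistent(loaded, ignorance))  # True: ...but so does a loaded one
```

Since both distributions are consistent with the intervals, ignorance alone cannot single out the uniform distribution.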
Quantum interactions are physical, therefore, they affect the physical world, therefore they affect you and me, in turn making us ultimately not truly predictable.
Again, being able to guess that someone is going to do something and they actually do it at about the same time you thought they would and in the same manner, that's just a good estimation. Some things are eventualities because nature has laws and it works a certain way. You can't change the fundamental laws. So yes, you can "predict" if you will, that eventually this and that will happen, because that's natural law. But you CANNOT predict EXACTLY how and EXACTLY when it will occur. Beyond that though, lies the abyss of uncertainty and chance.
It works like this. If logic says A + B = C, then we must first figure out how we got A and then B in the first place. And so on, and so on. A never-ending chain of questions, which leads us back to the only answer: you cannot predict the future. Yes, logic can help you understand things, but you cannot perfectly control or predict how or when you are going to use it.
The Achilles' heel of determinism is the problem of induction.
Determinism seems wedded for all eternity to the so-called laws of nature (go to the science section) but, for better or worse, all causal patterns are, it seems, severely undermined by Hume's coup de grâce, delivered mid-18th century. I don't know what the fuss is all about! Determinism is predicated on the laws of nature, but the laws of nature have no leg to stand on.
The pragmatist's rebuttal to this is that if a specific outcome is not predictable, even in principle, then it is meaningless to call the phenomenon determined. I think this is a good way of looking at things. This argument can be extended to include phenomena that are completely impractical to predict accurately, e.g. the winner of the 2024 Super Bowl. I think the dice example is somewhere in the middle - highly impractical in most cases, but not impossible.
And this is all before quantum mechanics and chaos theory are taken into account.
Apparently you haven't gotten the word. Hume was full of it. Of course induction works. This is probably not the right thread to discuss it further.
My understanding is that chance entails lots of brute contingencies. Why does A happen and not B? It just does and it isn’t possible for there to be an explanation, since this would remove the chance.
I was responding to the common belief that chance represents ignorance. We know from experience that the odds for a "fair die" landing on any side, if tossed in a typical fashion, are roughly 1/6 in the sense of relative frequencies. This is essentially the definition of what a "fair die" is.
But a priori, we don't even know that. For in the case of an unknown die that isn't necessarily fair, all we know a priori is that the odds for any outcome are between 0 and 1. Nevertheless, there persists a convention which assigns a uniform distribution in the case of an unknown die. But this is misleading, for it conflates knowledge of a fair die with ignorance of an unknown die, and it also leads to unwarranted inferential biases in the case of an unknown die.
If all you know is that an event has n possible outcomes, there is nothing more that can be said, and chance cannot be quantified.
I think the determinist's response is that each of the "lots of brute contingencies" is determined, even if we don't know what they are or how they are caused. In this view, chance is just another word for our ignorance of what determines what. I don't buy that.
So it looks like the choice is between a view that forgoes further explanation and one that claims but can’t demonstrate its explanation.
So, the probability that a law of nature will break down is nil?
Prove it!
If each state is determined by its anterior state the first state wouldn’t exist because there was no anterior state to determine it.
In any case, we couldn’t know the initial state of the dice and the exact interactions with its environment because by the time we did the initial state and exact interactions would be different. This is not because we are too slow or inadequate at examining states, but because there are no states.
It doesn't have to be perfect. It only has to work well enough to be useful and understandable enough so we can figure the uncertainties. You use induction all the time.
I don't consider the idea of determinism very useful in any but the simplest situations. The scientific generalizations we develop, including laws, describe how the world happens to work, not how it has to work. The law of conservation of matter and energy does not cause matter and energy to be conserved.
This seems fair. I’m inclined to accept that chance outcomes exist and have no explanation; explanation being something that appears to run out regardless of the world view held.
Is even the world of a choose-your-own adventure story, where alternative courses of action can be chosen by the reader, describable with either of these adjectives?
And that is metaphysics.
If there is an explanation for why A happens rather than B, in what meaningful sense is A a chance outcome as opposed to a determined one?
You’re referring to it as chance, but it isn’t really chance; it’s just ignorance of a determined outcome. If true chance outcomes exist then they necessarily lack an explanation.
Sure, but ignorance of an outcome is not what I’ve been referring to as chance. If someone believes only that sort of chance exists then I don’t see what distinguishes their view from a deterministic one.
Quoting CasaNostra
I’ve contended that true chance outcomes are brute contingencies—they don’t have an explanation because they necessarily can’t have one.
Can you demonstrate that this is always (or ever?) the case in any event?
Quoting CasaNostra
Because then they wouldn’t be chance outcomes, but determined ones we only call chance because of our ignorance.
Quoting CasaNostra
Unless you can demonstrate this I’m fine believing there are.
I accept that there’s physical stuff governing the roll of the dice; I don’t accept that you could predict the outcome even if you knew everything.
What I should have said is that I don’t accept that everything can be known such that the outcome of the roll could be predicted.
Fair, guess we’re not substantially in disagreement then.
Well, we still disagree here. I don’t mind believing that pure chance causes those outcomes; that they’re random and have no explanation.
The Galton board is a good example. But doesn’t it illustrate the way that micro chance and macro determinism are yoked together?
The board engineers things so that every peg gives a 50-50 probability of deflecting a falling ball to its left or right. The randomness is deliberately maximised at this level - or else it is a loaded board. We can argue that no board could ever be so perfectly engineered. Each peg might be infinitesimally biased. But the point of the exercise is to approach the limit of pure randomness at this level.
Then given a perfect board, it will produce a perfectly determined probability distribution. At the macro level, you can be absolutely certain of a nice and tidy Gaussian distribution emerging from enough trials.
Each ball hits 7 pegs on the way down. Each deflection is a 50-50 split. There is only one way to hit the outside bin - 7 left or right deflections in a row. And then 70 ways to land in the central two bins as an even mix of left and right deflections.
So the individual pegs provide the pure chance. But the board as a whole imposes a sequential history on what actually happens - a certainty about the number of 50-50 events and the number of different histories, or paths through the maze, that describe the one final outcome.
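The path counting above can be checked directly: with 7 independent 50-50 deflections there are 2^7 = 128 equally likely paths, and the number of paths into each of the 8 bins is a binomial coefficient. A minimal Python sketch:

```python
from math import comb

pegs = 7
# Paths into bin k = number of ways to choose k rightward deflections out of 7.
paths = [comb(pegs, k) for k in range(pegs + 1)]

print(paths)                 # [1, 7, 21, 35, 35, 21, 7, 1]
print(sum(paths))            # 128 = 2**7 equally likely histories
print(paths[3] + paths[4])   # 70 ways into the central two bins
```

Pure chance at each peg, yet a fixed combinatorial structure over the whole board: exactly the micro/macro yoking being described.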
So in a Platonically perfect world, the micro and the macro scale are engineered to represent the opposing ideals or the accidental and the determined. The system isn’t either the one or the other in some deeper metaphysical sense. It is designed to represent the dialectic of accidental versus determined as being the proper model of a reality that is probabilistic.
Chance and determinism are yoked in a reciprocal relation as the opposing limits of nature. Micro-level chance and macro-level determinism are how we get a system that has a stochastic character.
Then of course, the problem is that the real world may not be amenable to such perfect engineering. This is where chaos and quantum effects impact on things.
Chaos is about non-linearity. It is written into our assumptions about the pegs and the board that we can keep any imprecision in our engineering within linear bounds. Any bias or error in the construction will itself be averaged away in Gaussian fashion. But if there is non-linearity of some kind - maybe the pegs are springy in a way that reverberations are set up - then errors of prediction will compound at an exponential rate. The attempt to engineer a perfect distinction between local randomness and global determinism will go off course because of the emergence of non-linear divergences that lead to new kinds of internal correlations, or synchronised behaviour.
Then quantum uncertainty also affects our perfect engineering. If the Galton board is very small or very hot, then it is going to start to misbehave. Everything from the balls, to the pegs, to the board as a whole, will be fluctuating in ways that introduce an indeterminism about both the randomness of each deflection event and the determinism about the countable ensemble of paths as a whole.
Again the classical picture of a world cleanly split between absolute chance and absolute constraint will lose its linearity and become subject to an excess of divergence and/or an excess of correlation.
We will arrive at the quantum weirdness of a physical system that either diverges at every event to create a many world ensemble of separate histories, or we have to accept the other available interpretation - that there are spooky non-local correlations limiting the chaos.
So what I am arguing is that the classical picture demands some kind of monistic commitment - either reality is fundamentally based on determinism or chance. But our best models of randomness or probability are intrinsically dichotomistic. It is essential to construct a system - whether it is a die, a coin, a Galton board, a random number generating algorithm - that exemplifies indifferent chance at the micro scale and constraining history on the macro scale.
Then we learn in fact that physical reality can’t be so perfectly engineered. We can approach linearity, but only by suppressing non-linearity. To achieve our Platonic image of the ideal gaming device, we have to do work to eliminate both its potential for divergence - too much local independence in terms of accumulating history - as well as the opposite peril of a system with too much internal correlation, or too many emergent intermediate-scale interactions.
Yep. Your post got me thinking that this is another way into the interpretation issues.
The first step is to drop the monistic demand that something is something "all the way down". Even a classical view of a system exhibiting perfect randomness achieves its goal by imposing a strong dichotomy on nature. Perfect randomness at the local scale of the independent events has to be matched by perfect determinacy in terms of the macroscopic boundary conditions. A die has to be precision machined so that, as a six-sided shape, it rolls fair.
And then having understood classical indeterminacy in that fashion, that opens the way to understanding probabilistic systems where this essential dichotomy itself has a larger story.
From a mathematical perspective, you get chaos and its non-linearity - a dichotomy where the opposing limits are about divergence vs coherence. You get every trajectory able to bend away from the straight line in unpredictable fashion, but also the opposite thing of all trajectories being bent towards a common goal - the correlations that produce attractors.
Quantum theory seems to say much the same thing about material reality. You have both more convergence and more divergence than linear classicality would make you suspect. The uncertainties are more uncertain, and the certainties also more certain.
So the monistic ground becomes a fundamental dialectic. And this dialectic in turn is revealed to have a more generic form. Classicality emerges as the perfectly engineered limit of a more basic dichotomy where the non-linearities have yet to be tamed. You get both more divergence in the parts and more coherence in the whole.
The Galton board pegs can dance and so be even more chaotic, but they also can dance in synchrony, and so deliver a more ordered result.
Quoting tim wood
Sure, linearity does get achieved to a useful degree. Otherwise we wouldn't be here to discuss quantum weirdness or chaotic non-linearity. The anthropic principle applies there.
But the OP did raise the question of how classical randomness can square with quantum indeterminacy.
My answer is that these are not two incompatible models. They may be the one model, but with constraints added. Quantum reality is the non-linear version (as a broad brush statement) and classical reality is the linearised version of that - the thermally decohered limit.
Quoting tim wood
But I am arguing that QM is the larger dichotomy in which the classical dichotomy is embedded. So QM is its own less constrained, less linear, version of the micro-macro dichotomy under discussion here.
QM uncertainty is both about the smallest spacetime scales and the greatest energy densities. It is about non-linear fluctuations – endless quantum corrections to any classical particle value - yet also the constraining holism of non-locality. If we take the Feynman path integral literally, a particle explores every possible path to discover the path that delivers on the constraint of the least action principle.
It breaks the rules in both their directions. It is more extreme in terms of its individuated chance, and more extreme in terms of its contextual determinism.
So same old divided world, but less linearised in both regards.
It just is. I don’t see how it’s amenable to explanation. I’m in the camp that quantum randomness is real, ontological, not just an epistemological problem, though there are interpretations that have it be like the dice example. Pilot wave theory for one if I understand correctly.
Then again, I don’t know shit about QM.
Are you talking about statistical mechanics, e.g. pressure arising out of the random behavior of molecules, or something else?
Why shouldn't there be? What prima facie case is there that there ought not to be chance?
Quoting apokrisis
What about Peirce's 'tychism'? Didn't he see chance as basic? And does it have to be one or the other - all chance, or totally determined? What about the strange attractors in chaos theory - they produce patterns arising from apparently minute fluctuations - which seems a way of conceptualising something which is both a product of chance but also subject to laws?
Systems with pressures and temperatures are examples of this general way of thinking.
But one of the other things I would point out here is the “weirdness” of the situation where the random kinetics of the particles of an ideal gas is seen as the deterministic part of the story, and macro properties like pressure and temperature become the emergent accidents.
Again, that is the consequence of a backwards metaphysics that wants to make Newtonian dynamics the generic case and statistical systems, and quantum systems, the special cases.
This is like thinking everything is Euclidean geometry - flat and infinite - and that non-Euclidean geometry is some weird extra. We had to flip that around once we realised - as with relativity - that it is in fact linear Euclideanism which is the special case here.
Yep. But unfortunately a further dichotomy is built into that - one that Peirce was still working on.
Just as there is an Aristotelian distinction between potentials and possibilities, there is a distinction between vagueness and fluctuation.
So there is “chance” that is basic in terms of being a logical vagueness - anything might be the case. And then there is “chance” in the sense of some definite spontaneous event - a tychic “sporting”.
One is about the generality of potential being. The other is about the particularity of some accident of being - a definite possibility that is logically crisp in the sense of being a counterfactual.
Again, this speaks to a holistic systems view of nature as concrete chance only exists by virtue of the counterfactuality of some matchingly definite context. It is a local-global deal. The radioactive particle is still there or it just spontaneously decayed.
But the quantum vacuum is a much vaguer beast - a generic indeterminacy. It is both full of fluctuations, and yet they are “virtual”. The vacuum needs a constraining context to make its zero-point uncertainty manifest. You need an apparatus like two Casimir plates to turn a vague potential into definite possibilities.
Quoting Wayfarer
Attractors are produced by correlated interactions. So rather than trajectories exploring the world with complete freedom, they become entrained to emergent patterns.
Draining water forms a spiral. A vortex is a simple point attractor. Rather than every molecule having to find its own random path to the plug hole exit - which could take for bloody ever - they get sucked into the most efficient possible path that solves the collective problem.
Order out of chaos, as they say. All the minute individual fluctuations are overwhelmed by mob forces.
The butterfly effect is then the widely misunderstood converse of the story. If we try to figure out which minute fluctuation began the general plug hole spiral, we might pick one wee fellow that seemed to mark the right angle of attack first. It was the spontaneous fluctuation that broke the symmetry and so set up the giant “tropical storm in a distant land” that became the gurgling vortex.
But really, the cause of the vortex was the general shape of the system - the boundary conditions that set up a bath full of water where the plug had suddenly been pulled. After that, any old fluctuation could have been the first panicked lurch that set the whole crowd stampede off.
Then, by extension, determinism isn't perfect! In other words, chance and free will are a possibility.
I wrote this in response to a thread on another forum, in respect of a discussion of this paper:
'Heisenberg had in mind Aristotle’s ‘potentia’ which is more like ’the realm of possibilities’. There is a real ‘realm of possibilities’ which is defined by the wave function, very precisely, as a distribution of probabilities of possible outcomes. (The possibility of the electron being, say, 'a rabbit', is not included in that realm, because that result is not ‘in the realm of possibility'.) When the observation is taken then the realm of possibilities collapses into a single actuality, which is the so-called ‘collapse of the wave function’. That’s what the Everett formulation is seeking to avoid, hence its proliferation of worlds.
Surely if I ask you the whereabouts of some unobserved particle, the answer is given by the Schrödinger equation, right? You can't say 'oh, that's it, right there'. The only reply you can give to that question is a distribution of possibilities, which is the likelihood of a particular result when it's measured. So put another way, the answer to the question 'does the object exist?' is the equation, isn't it? You can't say 'yes it exists' until the measurement has been taken. So the object is not unambiguously real until it's measured. All there are until that point are patterns of probabilities. The wave function describes degrees of reality, which is the same as degrees of likelihood.'
Probability theory hints at what those terms describe; they describe semantic relationships between data-sets or theories. For example, the sequence {1, 2, 3} is "determined" in relation to the sequence {a,b,c} under the assignment a --> 1, b --> 2, c-->3. But it is undetermined with respect to the set of sequences beginning {1,2,...} which include it as a special case.
Consider historical counterfactuals: must Hitler have invaded Poland? In spite of appearances, the meaning of this question isn't about the literal existence of a possibility available to the German government in the year 1939; rather it concerns the relation of a model of the actual event to a hypothetical set of "similar" circumstances, such as that defined by a historical simulator, where it is the notion of "similarity" that is actually the focus of the question.
Yep. That is the creation of concrete possibilities by the preparation of a system. It is like carving a die with six sides. You constrain things so that outcomes are limited to a particular range of choices.
Vagueness would be a deeper state of indeterminacy. The wavefunction of the universe would be so broad as not to either rule in or rule out the existence of any particular electron and its history.
Quoting Wayfarer
The problem is that the collapse isn’t part of the formalism. So there isn’t a good ground for claiming some kind of definite transition that promotes the particle from some kind of existence as a probability to a state of being real.
I don’t have a hard position on the issue for that reason. But decoherence at least lets the thermal environment be the “observer”. We can do without an actual collapse because the uncertainty reduces asymptotically towards a definable limit.
I like the term, almost surely, in probability theory - https://en.wikipedia.org/wiki/Almost_surely
A probability of 1 isn’t absolutely certain. But close enough for all practical purposes.
The demand for a collapse is another example of the backwards metaphysics that infects the quantum vs classicality discussions. The holistic view says uncertainty is merely being constrained. Reality doesn’t actually have to be made certain to exist. Being highly constrained gives it enough of a definite counterfactuality to amount to the same thing.
Your question reveals your implicit materialism.
( ;) It can happen to the best of Buddhists that a hint of materialism sneaks into their thinking.)
This is wrong view:
What's wrong with materialism? Matter's true nature is unknown. It stays mystique, even if it's matter "only".
This is something I've thought about. I don't find the idea of determinism very convincing. Too much of the world is too complicated to make that a useful way of thinking, e.g. the molecules bouncing around with a wide range of kinetic energies. On the other hand, the macro behavior, the pressure and temperature in the boiler, behaves in a very predictable way, at least as long as we keep it fairly simple.
Here's one of my familiar refrains. Determinism vs. free will is a metaphysical distinction. Neither is true. Neither is false. Either may be useful in different situations.
What's wrong with materialism? That materialists typically _don't_ believe things like "Matter's true nature is unknown. It stays mystique".
I have no idea why people cry for determinism/deity. I guess it is the assurance that they have no say in the direction of their life.
Nothing is determined. There are learned habits (memory of the Mind) as well as choices in intention that the Mind makes.
Right up my alley,
[quote=Bhartṛhari]Sarvaṁ mithyā bravīmi (Everything I'm saying is false).[/quote]