You are viewing the historical archive of The Philosophy Forum.
For current discussions, visit the live forum.

Explaining probabilities in quantum mechanics

Andrew M September 01, 2017 at 04:05 17500 views 68 comments
The Copenhagen Interpretation of quantum mechanics postulates the Born rule which gives the probability that an observer will measure a quantum system in a particular state. For example, if a photon is sent through a beam splitter, the Born rule says that the photon will be detected on the reflection path half the time and on the transmission path the other half of the time.

The Born rule gives the correct probabilistic predictions. But what explains the Born rule? And why are there probabilities at all? According to the Copenhagen interpretation (which rejects causality) there is no explanation - the probabilities are just a brute fact of the universe. Hence the Born rule must be postulated under that interpretation.

However, for causal interpretations (such as Everett and de Broglie-Bohm), the probabilities must have an underlying causal explanation. So, in the case of Everett, the Born rule should be derivable from unitary quantum mechanics, not merely postulated. I'm going to outline a derivation below - note that it draws on Carroll and Sebens's derivation.

The first issue is why we should expect probabilities at all. With the beam splitter example above, the Everett interpretation says that a photon is detected on both paths (as described by the wave function). But an observer only reports detecting a photon on one path. Therefore, on the Everett interpretation, the probabilities describe the self-locating uncertainty of the observer (or the observing system), not the probabilities that the photon will be detected exclusively on one path or the other.

The second issue is why the probability is that given by the Born rule and not by some other rule.

The Born rule states that the probability is given by squaring the magnitude of the amplitude for a particular state. In the beam splitter example above, the amplitude for each relative state is √(1/2). So the probability for each state is 1/2.

It appears that in these scenarios we should be indifferent about which relative state we will end up observing. So the simple rule here is to assign an equal probability to each state. If there are two states, then the probability of each is 1/2. If there are three states, then the probability of each is 1/3. And so on. This is often called branch-counting and it makes the correct predictions in cases where the amplitudes for each state are equal.
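As a rough numerical sketch (my own illustration in Python, not part of the original post), branch counting and the Born rule agree whenever the amplitudes are equal:

```python
import math

# Two branches of a 50/50 beam splitter, each with amplitude sqrt(1/2).
amplitudes = [math.sqrt(1 / 2), math.sqrt(1 / 2)]

# Born rule: probability = squared magnitude of the amplitude.
born_probs = [a ** 2 for a in amplitudes]

# Branch counting (indifference): assign 1/N to each of the N branches.
branch_probs = [1 / len(amplitudes)] * len(amplitudes)

print(born_probs)    # both entries approximately 0.5
print(branch_probs)  # [0.5, 0.5]
```

For three equal branches the same comparison gives 1/3 each, and so on.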

So far so good. But this approach doesn't work if the amplitudes are not equal. For example, suppose we have a beam splitter where the probability of reflection is 1/3 and the probability of transmission is 2/3.

In this scenario there are two states with amplitudes √(1/3) and √(2/3). If we are indifferent about which state we will end up observing, then we will wrongly assign a probability of 1/2 to each state. Branch counting doesn't work in this case.

So we have a simple indifference rule that only works in specific circumstances and seems inapplicable here. What can be done? One thing we can do is to transform the setup so that the rule does become applicable.

To do this, we can add a second beam splitter (with 1/2 probability of reflection and transmission) to the transmission path of the first beam splitter. When we send a photon through this setup, there are now three final states, all with equal amplitudes of √(1/3) (i.e., √(2/3) × √(1/2) = √(1/3)).

Now the indifference rule can be applied to get a probability of 1/3 for each final state. Since two of the final states lie on the first beam splitter's transmission path, their probabilities add to give a total probability of 2/3 for that path. This is the correct prediction as per the Born rule, and it only assumes the indifference rule as required.
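The two-splitter construction can be checked numerically. This is a hedged sketch of my own (the splitter values are the ones given in the post):

```python
import math

# First beam splitter: reflection amplitude sqrt(1/3), transmission sqrt(2/3).
reflect = math.sqrt(1 / 3)
transmit = math.sqrt(2 / 3)

# A 50/50 splitter on the transmission path splits it into two branches,
# each with amplitude sqrt(2/3) * sqrt(1/2) = sqrt(1/3).
t1 = transmit * math.sqrt(1 / 2)
t2 = transmit * math.sqrt(1 / 2)

final_amplitudes = [reflect, t1, t2]  # three equal-amplitude final states

# Indifference over the three equal branches gives 1/3 each; the two
# transmission branches sum to 2/3, matching the Born value |sqrt(2/3)|^2.
indifference_prob = 1 / len(final_amplitudes)
transmission_prob = 2 * indifference_prob
print(transmission_prob)  # approximately 2/3
print(transmit ** 2)      # Born rule prediction, also approximately 2/3
```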

This works for any scenario where states of unequal amplitudes are factorable into states of equal amplitudes.

Does anyone see any problems with this?

Comments (68)

Wayfarer September 01, 2017 at 04:32 #101518
Only that it perhaps ought to have been posted on Physics Forum.
Andrew M September 01, 2017 at 04:49 #101519
Reply to Wayfarer It's about the connection between causality and probability which is really a philosophical issue not a physics issue.
Wayfarer September 01, 2017 at 05:49 #101525
Reply to Andrew M Fair enough. Could you just unpack a bit more what the problem is that you're trying to solve? Why is the role of 'probability' such that it requires such an elaborate response? Is that related to Einstein's 'god playing dice' remark?
Andrew M September 01, 2017 at 07:40 #101535
Reply to Wayfarer Yes. Before quantum mechanics came along, it was assumed that probability reflected a lack of knowledge about the world (i.e., it was an epistemic issue). But quantum mechanics suggested that probability was baked into the universe at the most fundamental level, violating causality. Which irked Einstein and prompted him to say that "God does not play dice with the universe".

So the problem for a causal interpretation (as Everett and de Broglie-Bohm are) is to explain how probability arises in a causal universe. In particular, if the wave function describes a state evolving into a superposition of two states, where one state has an amplitude of 1 and the other has an amplitude of 2, then why should the probabilities of observing them be in the ratio 1:4? That amplitude-squared rule is the Born rule.

If that can be explained in a causal framework then it restores the idea that probability reflects a lack of knowledge about the world, it's not fundamental.
Wayfarer September 01, 2017 at 08:28 #101541
Reply to Andrew M well if you accept that the universe is not causally closed, the whole problem goes away. But apparently that is too high a price.
Rich September 01, 2017 at 11:20 #101573
Quoting Andrew M
If that can be explained in a causal framework then it restores the idea that probability reflects a lack of knowledge about the world, it's not fundamental.


Probability can still be baked in the universe even with a causal interpretation. The cause may be inherently probabilistic, which is one of the possible interpretations of the Bohm quantum potential initial conditions. Hence, the reason Bohm suggested that his interpretation is causal yet non-deterministic.
Andrew M September 01, 2017 at 21:57 #101643
Quoting Wayfarer
well if you accept that the universe is not causally closed, the whole problem goes away. But apparently that is too high a price.


That's a possible response. But if you make a distinction between the universe and reality, then it just pushes the issue back a level. That is, is reality causally closed (recast as the Principle of Sufficient Reason rather than in physical terms)?
Andrew M September 01, 2017 at 21:59 #101644
Quoting Rich
Probability can still be baked in the universe even with a causal interpretation. The cause may be inherently probabilistic, which is one of the possible interpretations of the Bohm quantum potential initial conditions. Hence, the reason Bohm suggested that his interpretation is causal yet non-deterministic.


That raises the question of the status of the Born rule under such interpretations. It would seem that the Born rule could only be postulated, not explained or derived.
Rich September 01, 2017 at 22:08 #101646
Reply to Andrew M I agree. I certainly have no love for the Copenhagen Interpretation nor the way the Copenhagen group successfully rammed it down everyone's throat.
apokrisis September 01, 2017 at 22:46 #101651
Quoting Andrew M
If that can be explained in a causal framework then it restores the idea that probability reflects a lack of knowledge about the world, it's not fundamental.


I like the quantum information approach where the view is that uncertainty is irreducible because you can't ask two orthogonal questions of reality at once. Location and momentum are opposite kinds of questions and so an observer can only determine one value in any particular act of measurement.

So this view accepts reality is fundamentally indeterministic. But also that acts of measurement are physically real constraints on that indeterminism. Collapse of the wavefunction happens. It is only that you can't collapse both poles of a complementary/orthogonal pair of variables at once. Maximising certainty about one maximises uncertainty about the other, in good Heisenberg fashion.

What this interpretation brings out sharply is the contextual or holistic nature of quantum reality. And the role played by complementarity. Eventually you are forced to a logical fork when trying to eliminate measurement uncertainty. You can't physically pin down two completely opposite quantities in a single act of constraint.

Grab the snake by the head, or by the tail. The choice is yours. But pin one down and the other now squirms with maximum freedom and unpredictability.

Of course, when talking of observers and measurements, we then have to grant that it is the Universe itself which is doing this - exerting the constraints that fix a history of events. So it becomes a thermal decoherence interpretation with the second law of thermodynamics being the party interested in reducing the information entropy of the Universe as a physical system.
Wayfarer September 01, 2017 at 23:19 #101658
Quoting Andrew M
That is, is reality causally closed (recast as the Principle of Sufficient Reason rather than in physical terms)?


Things happen for reasons, but there is also serendipity, chance or hazard. There is an element of spontaneity involved.

Quoting apokrisis
when talking of observers and measurements, we then have to grant that it is the Universe itself which is doing this


'A scientist is just an atom's way of looking at itself' ~ Niels Bohr.

Note that 'the Copenhagen interpretation' is NOT a scientific hypothesis. It is simply the kinds of things that Heisenberg, Bohr and Pauli would say about the philosophical implications of quantum mechanics. The term wasn't even coined until the 1950s. But it is an epistemologically modest attitude, in my view, and one generally in keeping with the tradition of Western natural philosophy. Heisenberg's essays on Physics and Philosophy are very interesting in that regard.
apokrisis September 01, 2017 at 23:43 #101665
Reply to Wayfarer Modest or radical? The Copenhagen Interpretation is metaphysically radical in paving the ground to acknowledge that there must be an epistemic cut in nature.

The "modest" understanding of that has been the good old dualistic story that it is all in the individual mind of a human observer. All we can say is what we personally experience. Which then leads to folk thinking that consciousness is what must cause wavefunction collapse. So epistemic modesty quickly becomes transcendental confusion. We have the divorce in nature which is two worlds - mental and physical - in completely mysterious interaction.

I, of course, am taking the other holistic and semiotic tack. The epistemic cut is now made a fundamental feature of nature itself. We have the two worlds of the it and the bit. Matter and information. Or local degrees of freedom and global states of constraint.

So CI, in recognising information complementarity, can go three ways.

The actually modest version is simple scientific instrumentalism. We just don't attempt to go further with the metaphysics. (But then that is also giving up hope on improving on the science.)

Then CI became popular as a confirmation of hard dualism. The mind created reality by its observation.

But the third route is the scientific one which various information theoretic and thermodynamically inspired interpretations are working towards. The Universe is a system that comes to definitely exist by dissipating its own uncertainty. It is a self constraining system with emergent global order. A sum over histories that takes time and space to develop into its most concrete condition.
Wayfarer September 01, 2017 at 23:48 #101666
Reply to apokrisis what do you make of the Everett-many worlds hypothesis?

Oh, and I'm sure there is an 'epistemic cut'.
Wayfarer September 01, 2017 at 23:57 #101669
Quoting apokrisis
The "modest" understanding of that has been the good old dualistic story that it is all in the individual mind of a human observer.


My take would certainly not be that it's ALL in the mind of the observer, but that (1) the mind makes a contribution without which we would perceive nothing and (2) this contribution is not itself amongst the objects of perception as it is in the domain of the subjective. Scientific realism would like to say that we can see a reality as if we're not even there at all, 'as it is in itself', in other words that what we're seeing is truly 'observer independent'. But it is precisely the undermining of that which has caused all of the angst over the 'observer problem', because that shows that the observer has an inextricable role in what is being observed. The observer problem makes that an unavoidable conclusion, which Everett's 'many worlds' seeks to avoid.
apokrisis September 02, 2017 at 00:04 #101672
Reply to Wayfarer Many worlds is used by many to avoid the physical reality of wavefunction collapse or an actual epistemic cut. Or rather, to argue that rather than local variable collapse, there is branching that creates complementary global worlds.

So as maths, many worlds is fine. It has to be as it is just ordinary quantum formalism with the addition of thermodynamical constraint - exactly the decoherent informational view I advocate.

But it gets squirmy when Interpretation tries to speak about the metaphysics. If people start thinking of literal new worlds arising, that's crazy.

If they say they only mean branching world lines, that usually turns out to mean they want to have their metaphysical cake and eat it. There is intellectual dishonesty because now we do have the observer being split across the world lines in ways that beg the question of how this can be metaphysically real. The observer is turned back into a mystic being that gets freely multiplied.

So I prefer decoherence thinking that keeps observers and observables together in the one universe. The epistemic cut itself is a real thing happening and not something that gets pushed out of sight via the free creation of other parallel worlds or other parallel observers.
Wayfarer September 02, 2017 at 00:12 #101674
Quoting apokrisis
as maths, many worlds is fine. It has to be as it is just ordinary quantum formalism with the addition of thermodynamical constraint - exactly the decoherent informational view I advocate.

But it gets squirmy when Interpretation tries to speak about the metaphysics. If people start thinking of literal new worlds arising, that's crazy.


What would you say if someone were to assert that quantum computers rely on the actual reality of many worlds in order to operate?
apokrisis September 02, 2017 at 00:26 #101675
Reply to Wayfarer They are being metaphysically extravagant in a way the mathematics of decoherence doesn't require.

To build an actual quantum computer will require a lot of technical ingenuity to sort practical problems like keeping the circuits in a cold enough, and isolated enough, condition for states of entanglement to be controllable.

Do you think those basic engineering problems - that may be insurmountable if we want to scale up a circuit design in any reasonable fashion - are going to be helped by a metaphysical claim about the existence of many worlds?

Unless MWI is also new science, new formalism, it is irrelevant to what is another engineering application of good old quantum mechanics.
Wayfarer September 02, 2017 at 00:35 #101680
Quoting apokrisis
Do you think those basic engineering problems - that may be insurmountable if we want to scale up a circuit design in any reasonable fashion - are going to be helped by a metaphysical claim about the existence of many worlds?


I'm only going on the jacket blurb of the first book by the guy that invented quantum computing:

The multiplicity of universes, according to Deutsch, turns out to be the key to achieving a new worldview, one which synthesizes the theories of evolution, computation, and knowledge with quantum physics. Considered jointly, these four strands of explanation reveal a unified fabric of reality that is both objective and comprehensible, the subject of this daring, challenging book. The Fabric of Reality explains and connects many topics at the leading edge of current research and thinking, such as quantum computers (which work by effectively collaborating with their counterparts in other universes).


Reading through the reader reviews of that title, it seems Deutsch gives pretty short shrift to anyone who doubts the actual reality of parallel universes, which he seems to think is necessary for the concept to actually work.
Metaphysician Undercover September 02, 2017 at 00:55 #101690
Quoting Wayfarer
Reading through the reader reviews of that title, it seems Deutsch gives pretty short shrift to anyone who doubts the actual reality of parallel universes, which he seems to think is necessary for the concept to actually work.


Isn't apo saying that the concept doesn't actually work?
Wayfarer September 02, 2017 at 01:16 #101694
Reply to Metaphysician Undercover He, like myself, can't accept the idea of 'parallel universes', but the point I'm trying to make is that it is an inevitable consequence of Everett's 'relative state formulation', like it or not. So, let's move on.
apokrisis September 02, 2017 at 01:50 #101701
Reply to Wayfarer To be fair to Deutsch, he wrote that book back in the 1990s. Many people got carried away and were taking the most literal metaphysical view of the newly derived thermal decoherence modification to quantum formalism.

Now most have moved to a more nuanced take of talking about many world-line branches. But my criticism is that this simply mumbles the same extravagant metaphysics rather than blurting it out aloud. Many minds is as bad as many worlds.

On the other hand, listen closely enough to MWI proponents, and they now also start to put "branches" in quotes as well as "worlds". It all starts to become a fuzzy ensemble of possibilities that exist outside of time, space and even energy (as preserved conservation symmetries). The MWIers like Wallace start to emphasise the decision making inherent in the very notion of making a measurement. In other words - in accepting metaphysical vagueness and the role that "questioning" plays in dissipating that foundational uncertainty - MWI is back into just the kind of interpretative approach I have advocated.

There is now only the one universe emerging from the one action - the dissipation of uncertainty that comes from this universe being able to ask ever more precise questions of itself.

In ordinary language - classical physics - we would say the Universe is cooling/expanding. It began as a fireball of hot maximal uncertainty. As vague and formless as heck. Then it started to develop the highly structured character we know and love. It sorted itself into various forces and particles. Eventually it will be completely definite and "existent" as it becomes maximally certain - a state of eternal heat death.

Only about three degrees of cooling/expanding to get to that absolute limit of concrete definiteness now. You will then be able to count as many worlds as you like as every one of them will look exactly the same. (Or rather you won't, as individual acts of measurement will no longer be distinguishable as belonging to any particular time or location.)
Wayfarer September 02, 2017 at 01:56 #101702
Quoting apokrisis
Eventually it will be completely definite and "existent" as it becomes maximally certain - a state of eternal heat death.


Unless it collapses back into another singularity, and then expands again. Guess we'll have to wait and see ;-)
apokrisis September 02, 2017 at 02:00 #101703
Quoting Wayfarer
He, like myself, can't accept the idea of 'parallel universes', but the point I'm trying to make is that it is an inevitable consequence of Everett's 'relative state formulation', like it or not. So, let's move on.


I reject parallel worlds and parallel minds because immanence has to be more reasonable than transcendence when it comes to metaphysics.

An immanent explanation could at least be wrong. A transcendent explanation is always "not even wrong" because it posits no actual causal mechanism. It just sticks a warning sign at the edge of the map saying "here be dragons".

And the Everett formulation is just an interpretation - a metaphysical heuristic. It itself becomes subject to various metaphysical interpretations as I just described. You can get literal and concrete. Or you can take a vaguer approach where the worlds and branches are possibilities, not really actualities. Or you can go the whole hog and just accept that the foundation of being is ontically vague and so any counterfactual definiteness is an emergent property.

The real advance of "MWI" is the uniting of the maths of quantum mechanics with the maths of thermodynamical constraints - the decoherence formalism.

This is a genuine step in the development of quantum theory. And it has sparked its own wave of interpretative understanding - even if ardent MWIers claim to own decoherence as their own thing.
apokrisis September 02, 2017 at 02:11 #101705
Quoting Wayfarer
Unless it collapses back into another singularity, and then expands again. Guess we'll have to wait and see ;-)


No we bloody don't. Dark energy is a fact. The Heat Death is gonna happen.

Of course we now have to account for dark energy. And again - in my view - decoherence is the best hope of that. Because quantum-level uncertainty can only be constrained, not eliminated, the fabric of spacetime is going to have a built-in negative pressure. It is going to have a zero-point energy that causes quantum-scale "creep".

Unfortunately we don't know enough particle physics to do an exact calculation of this "creep". We can't sum all the contributions in an accurate way to see if they match the dark energy observations. And the naive calculation - where things either all sum or all cancel - produces the ridiculous answer that the dark energy value should be either zero or Planck-scale "infinite". An error of 130 orders of magnitude and so another of your often-cited "crises of modern physics".

Other calculations going beyond the most naive have got closer to the observed value. But also admittedly, not come nearly close enough yet.

But at least, as a mechanism, it could be bloody wrong. ;)
dclements September 02, 2017 at 02:52 #101712
To the best of my knowledge, it has already been determined that the issue comes from the state of the particle (or whatever else is being tested) being changed by being MEASURED, and therefore we have already ruled out any problem arising from it merely being OBSERVED.

It is a bit too fruit-loopy to think that just our observation of something completely alters it beyond what is conceivable through ordinary physics. I mean, there could be something like "magic" where our mind alters reality, but it is best to rule out everything else before we allow ourselves to think something like "magic" is going on.
Rich September 02, 2017 at 02:55 #101714
Quoting apokrisis
Of course we now have to account for dark energy. And again - in my view - decoherence is the best hope of that. Because quantum level uncertainty can only be constrained, not eliminated, then that means that the fabric of spacetime is going to have a built-in negative pressure. It is going to have a zero-point energy that causes quantum-scale "creep".


I don't know whether I should laugh or cry. I am sure many members are waiting breathlessly for the final verdict on what will happen billions and billions of years from now as science refines its precise calculations. No doubt such calculations will require increased funding. Come to think of it, how about forecasting tomorrow?
apokrisis September 02, 2017 at 03:01 #101719
Quoting Rich
I am sure many members are waiting breathlessly for the final verdict on what will happen billions and billions of years from now as science refines its calculations.


Just measure the cosmic background radiation. It's 2.7 degrees above absolute zero. The average energy density is down to a handful of protons per cubic metre.

Again you reveal the vastness of your ignorance of routine scientific facts. The Heat Death is a done deal even if you might also say the clock has another tick or two to actually reach midnight.
Rich September 02, 2017 at 03:04 #101721
Reply to apokrisis Heck, anyone who can predict as fact what will be happening billions upon billions (maybe trillions) of years from now has to be .... well, just a remarkable fortuneteller. Thank you for putting my mind at rest. Not in my lifetime at least. Any other long-term forecasts?
apokrisis September 02, 2017 at 03:33 #101732
Reply to Rich You are like someone plunging off a skyscraper, now inches from the ground, shouting out: I'm not dead yet, you don't know what you're talking about, my future has not been foretold, there are no grounds to predict my imminent demise.
Rich September 02, 2017 at 03:37 #101738
Reply to apokrisis Nah, just laughing and wondering how many people buy into your gobbledygook? Your demise? Are you working on the precise calculations? Need funding?
apokrisis September 02, 2017 at 03:52 #101749
Reply to Rich Most people probably can follow simple math. If the Universe had a temperature of 10^32 degrees at the Big Bang, and the Heat Death is defined by it being asymptotically close to 0 degrees, then it being currently 2.7 degrees tells us what?

Is it: A) We are pretty much at the end of the journey. Yes siree, 32 orders of magnitude is quite a big drop. We are not even talking nanoseconds to midnight (nano being merely 9 orders of magnitude).

Or: B) Bibble, bibble, bibble. Blub, blub, blub....
Rich September 02, 2017 at 03:57 #101751
Reply to apokrisis No need to justify your extraordinary calculations. You had me when you predicted the demise of the universe as fact. Gutsy move.
Wayfarer September 02, 2017 at 03:59 #101752
Well, if the Big Bang happened once.....
apokrisis September 02, 2017 at 04:55 #101780
Reply to Rich You are very flattering. But it's just standard cosmology. You can read all about it yourself.
Rich September 02, 2017 at 05:07 #101785
Reply to apokrisis I know. And you are here to give us the scientific facts.

You have 4 months to live. You have 2 years to live. The universe has 1 trillion years to live. Always with the proclamations.
apokrisis September 02, 2017 at 05:32 #101791
Quoting Rich
The universe has 1 trillion years to live.


Don't you mean that the Heat Death is eternal? That's quite a surprising conclusion if you think about it.
SophistiCat September 02, 2017 at 20:53 #101946
Reply to Andrew M There have been a number of attempts to derive/justify the Born rule, including the self-locating uncertainty approach that Carroll and Sebens develop (I haven't looked at their paper, but they probably cite earlier works in the same vein). Not everyone is convinced that such justifications are (a) not circular, and (b) do not smuggle in assumptions that are not present in the starting interpretation. But adjudicating this debate is way beyond my pay grade.

I just want to take issue with your characterization of probabilistic theories as "acausal." What you are talking about is causal determinism, and the keyword here is determinism. You can, of course, put your foot down and insist that causality necessarily implies determinism, but, as far as your arguments here are concerned, causality may as well equal determinism, because you are not actually talking about any aspect of causality other than it being deterministic. So for your purposes, causality is a redundant concept, since all that you are talking about is determinism. And I suspect that you only bring it up for rhetorical purposes (everyone wants to preserve causality in our theories, right?)
Andrew M September 04, 2017 at 03:10 #102205
Quoting SophistiCat
There have been a number of attempts to derive/justify the Born rule, including the self-locating uncertainty approach that Carroll and Sebens develop (I haven't looked at their paper, but they probably cite earlier works in the same vein). Not everyone is convinced that such justifications are (a) not circular, and (b) do not smuggle in assumptions that are not present in the starting interpretation. But adjudicating this debate is way beyond my pay grade.


That's fine. At any rate, the justification for the Born rule boils down to the following claims:

1. On the Everett interpretation, measurement leads to initial self-locating uncertainty. An observer can have complete knowledge about the relative states of the system, but not which particular state they have just measured. This raises the question of how to quantify their uncertainty in terms of probabilities.

2. If the state amplitudes are equal, the observer should initially be indifferent about which state they have measured. So the states can simply be counted to calculate the probability that a particular state has been measured.

3. If the state amplitudes are not equal, they can be mathematically factored into states that do have equal amplitudes. And again the states can be counted to calculate the probability. The number of factored states exactly tracks the square of the initial amplitude, so it is equivalent to applying the Born rule.

The main assumption is the indifference rule which seems reasonable to me.
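Claim 3 can be sketched in general for squared amplitudes that are rational numbers. This is my own illustration, with a hypothetical helper `branch_counts`; it is not taken from Carroll and Sebens's paper:

```python
from fractions import Fraction
from math import lcm

def branch_counts(squared_amplitudes):
    """Factor states with rational squared amplitudes into equal-weight
    branches; each state's branch count is proportional to |amplitude|^2."""
    probs = [Fraction(p) for p in squared_amplitudes]
    assert sum(probs) == 1, "squared amplitudes must sum to 1"
    n = lcm(*(p.denominator for p in probs))  # total equal-weight branches
    return [int(p * n) for p in probs]        # branches per original state

# The unequal beam splitter: squared amplitudes 1/3 and 2/3 factor into
# three equal branches, split 1 and 2 between the two original states.
print(branch_counts([Fraction(1, 3), Fraction(2, 3)]))  # [1, 2]
```

Counting branches then reproduces the Born probabilities 1/3 and 2/3 by indifference alone.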

Quoting SophistiCat
I just want to take issue with your characterization of probabilistic theories as "acausal." What you are talking about is causal determinism, and the keyword here is determinism. You can, of course, put your foot down and insist that causality necessarily implies determinism, but, as far as your arguments here are concerned, causality may as well equal determinism, because you are not actually talking about any aspect of causality other than it being deterministic. So for your purposes, causality is a redundant concept, since all that you are talking about is determinism. And I suspect that you only bring it up for rhetorical purposes (everyone wants to preserve causality in our theories, right?)


It would be great if everyone wanted to preserve causality in their theories but that is what the Copenhagen interpretation explicitly rejects. The idea that the universe is inherently probabilistic implies that the probabilities are a brute fact and inexplicable.

Naturally since the Everett interpretation directly maps the quantum formalism onto the world, then a deterministic formalism leads to a causal (or, if you prefer, deterministic) theory. But it's worth noting that the theory describes and predicts behavior, it does not prescribe it.
Andrew M September 04, 2017 at 03:18 #102207
Quoting Wayfarer
Reading through the reader reviews of that title, it seems Deutsch gives pretty short shrift to anyone who doubts the actual reality of parallel universes, which he seems to think is necessary for the concept to actually work.


Here's the actual challenge Deutsch raises in his book:

David Deutsch - “The Fabric of Reality”: Logically, the possibility of complex quantum computations adds nothing to a case that is already unanswerable. But it does add psychological impact. With Shor’s algorithm, the argument has been writ very large. To those who still cling to a single-universe world-view, I issue this challenge: explain how Shor’s algorithm works. I do not merely mean predict that it will work, which is merely a matter of solving a few uncontroversial equations. I mean provide an explanation. When Shor’s algorithm has factorized a number, using 10^500 or so times the computational resources that can be seen to be present, where was the number factorized? There are only about 10^80 atoms in the entire visible universe, an utterly minuscule number compared with 10^500. So if the visible universe were the extent of physical reality, physical reality would not even remotely contain the resources required to factorize such a large number. Who did factorize it, then? How, and where, was the computation performed?

Wayfarer September 04, 2017 at 03:31 #102209
Reply to Andrew M where are numbers? Any numbers? There might be a vast domain of which the physical universe is simply an aspect, but which is not physical. //edit// Where is 'the realm of possibility'? You might say 'it doesn't exist', but then, there are some things which are in the domain of possibility, and some things which are not. So there are 'real possibilities' - but they don't actually exist anywhere. Which, in the context, is significant, I would have thought.//
Andrew M September 04, 2017 at 04:58 #102230
Quoting Wayfarer
Andrew M where are numbers? Any numbers? There might be a vast domain of which the physical universe is simply an aspect, but which is not physical. //edit// Where is 'the realm of possibility'? You might say 'it doesn't exist', but then, there are some things which are in the domain of possibility, and some things which are not. So there are 'real possibilities' - but they don't actually exist anywhere. Which, in the context, is significant, I would have thought.//


Numbers aren't anywhere. Numbers are an abstraction over things (which is more-or-less the Aristotelian view).

Possibilities are also abstractions. In ordinary use, a real possibility is just one that is more likely to eventuate.
Wayfarer September 04, 2017 at 08:13 #102261
Quoting Andrew M
Numbers aren't anywhere.


In that case I clearly don't understand what Deutsch is talking about, although that is probably not surprising.
Andrew M September 04, 2017 at 08:53 #102268
Reply to Wayfarer Factoring large numbers requires physical resources (i.e., a computer). If a successful factorization required vastly more physical resources than were available in the visible universe, then where would those resources have come from?
Wayfarer September 04, 2017 at 09:33 #102291
Reply to Andrew M but that's not something that's been done, right? Is that part of the argument for quantum computers? That if you want to do that, then you will need a quantum computer to do it?
Andrew M September 04, 2017 at 23:49 #102482
Reply to Wayfarer It hasn't been done - it's only a theoretical possibility at present (and, no, classical computers couldn't do this). The practical goal right now is to outperform classical computers, so-called quantum supremacy.
Wayfarer September 04, 2017 at 23:53 #102485
Reply to Andrew M in which case, I fail to see the cogency of that example for Deutsch's argument for there being many worlds.
Metaphysician Undercover September 05, 2017 at 00:09 #102489
Quoting Wayfarer
//edit// Where is 'the realm of possibility'? You might say 'it doesn't exist', but then, there are some things which are in the domain of possibility, and some things which are not. So there are 'real possibilities' - but they don't actually exist anywhere. Which, in the context, is significant, I would have thought.//


The realm of possibility is the future. The difference between a real possibility and an unreal possibility (something said to be possible but actually impossible) is determined by the past. What has already occurred, in the past, determines what is possible in the future. If you take a presentist perspective, neither past nor future "exists", because existence is limited to the present. If you take a dualist perspective you can allow that both the past and the future have actual existence, but there is a substantial difference between the two (substance dualism). Therefore the realm of possibility, remaining always ahead of us in time, in the future, is very real, but since it has not yet received material existence at the present, it is apprehended only by the mind, and not the senses.
Andrew M September 05, 2017 at 00:41 #102493
Quoting Wayfarer
Andrew M in which case, I fail to see the cogency of that example for Deutsch's argument for there being many worlds.


The demonstration would require a quantum computer with about 300 qubits. Either that is an engineering problem that can one day be solved, or there is some unknown law of the universe that prevents that possibility.
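To see why a figure of roughly 300 qubits makes the point: an n-qubit register is described by 2^n complex amplitudes, and 2^300 already exceeds the usual rough estimate of ~10^80 atoms in the visible universe. A quick sketch (the atom count is the standard ballpark estimate, not a precise value):

```python
# A register of n qubits is described by 2**n complex amplitudes.
n = 300
amplitudes = 2 ** n

# Rough standard estimate for the number of atoms in the visible universe.
atoms_in_visible_universe = 10 ** 80

# 2**300 is about 2.0e90, comfortably larger than 1e80.
print(amplitudes > atoms_in_visible_universe)  # True
```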
Wayfarer September 05, 2017 at 01:10 #102499
Reply to Andrew M 'We have a factoring problem which is so big it can't be solved with any known computer. Therefore in order to solve it we will need a computer that utilises many universes. Therefore there are many universes.'
SophistiCat September 05, 2017 at 07:42 #102578
Quoting Andrew M
It would be great if everyone wanted to preserve causality in their theories but that is what the Copenhagen interpretation explicitly rejects. The idea that the universe is inherently probabilistic implies that the probabilities are a brute fact and inexplicable.


Well, again, you are just equivocating between "causal" and "deterministic." What you are really saying is that our theories ought to be deterministic. I disagree. There is no a priori reason why we should prefer determinism. Or indeterminism, for that matter. I don't consider either to be a theoretical virtue, in and of itself. Of course, if one also offered better or more specific predictions, or a more economical description than the other, then it ought to be preferred - but those other advantages obtain independently of the determinism/indeterminism split. Indeed, in the case of the interpretations of quantum mechanics, none of the empirical advantages can be credited to one interpretation and not another, since they all make the same empirical predictions*.

* That may not actually be true - some interpretations seem to make distinct predictions, but they are presently out of reach for empirical investigation.
Rich September 05, 2017 at 14:11 #102647
Quoting SophistiCat
* That may not actually be true - some interpretations seem to make distinct predictions, but they are presently out of reach for empirical investigation.


While not a prediction per se, Bohm's version of the Schrodinger equation implies a quantum potential that can act non-locally, at a distance. This implication inspired Bell to formulate his Theorem, which has been experimentally tested many times over. In this regard, Bohm's causal, non-deterministic model carries some additional weight. Does this sound reasonable?
Andrew M September 06, 2017 at 03:05 #102817
Quoting SophistiCat
What you are really saying is that our theories ought to be deterministic.


Yes, I think our theories should be deterministic. But, most importantly, our theories should be explanatory, which is how I've used "causal" in this thread. I'm unaware of any non-deterministic theory that meets that criterion.

Consider a simple probabilistic theory about dice. This (well-tested) theory says that any given die roll has a 1/6 probability of producing any particular number between 1 and 6. But the theory doesn't explain why dice exhibit that behavior; it just asserts it.

That is precisely the situation with the Copenhagen interpretation and any other interpretations that postulate the Born rule probabilities instead of deriving them. They may make the correct predictions, but they don't actually explain anything.
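A minimal simulation makes the dice point concrete (a hypothetical sketch; note that the 1/6 is put in by hand via random.randint, which is exactly the sense in which the theory asserts the statistics rather than explaining them):

```python
import random
from collections import Counter

random.seed(0)  # fixed seed for reproducibility
rolls = 60_000
counts = Counter(random.randint(1, 6) for _ in range(rolls))

# Each face comes up with frequency close to the postulated 1/6 (about 0.167),
# but nothing here says WHY a die behaves that way; the uniform distribution
# is an assumption of the model, just as in the probabilistic theory.
for face in range(1, 7):
    print(face, round(counts[face] / rolls, 3))
```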
SophistiCat September 06, 2017 at 07:49 #102837
Reply to Andrew M All interpretations of quantum mechanics explain exactly the same observations, so in that sense they are explanatory to exactly the same degree. One could make a case that some of them are more parsimonious than others, but that is never an easy case to make. That said, superficially at least, the Everett interpretation does seem to be more parsimonious than its main rivals. But I realize that things aren't so simple, and, not having sufficient expertise, I withhold further judgment.
Rich September 06, 2017 at 14:49 #102886
Quoting SophistiCat
the Everett interpretation does seem to be more parsimonious than


This implies that creating a new world for every quantum event, or smearing everything across infinite worlds comprising some mega-universe of some sort, is parsimonious. Others may call such a theory unimaginably elaborate.

Mike Adams September 06, 2017 at 17:18 #102907
Great thread this.

In response to previous assertions that the reality of multiple universes is 'craziness': is it any more crazy than any of the other interpretations?! (Cat both dead/alive etc)
Mike Adams September 06, 2017 at 19:50 #102971
Also - apokrisis, could you please expand on what you mean by 'epistemic cut.' Thanks
Rich September 06, 2017 at 20:00 #102976
Quoting Mike Adams
In response to previous assertions that the reality of multiple universe is 'craziness' is it any more crazy than any of the other interpretations?! (Cat both dead/alive etc)


Much, much crazier than Bohm's straightforward causal non-deterministic interpretation.

Bohm's interpretation implies non-locality, which has been experimentally observed at the molecular level. It is also non-deterministic, which drives the determinists crazy. As a result they come up with this interpretation, which requires an infinitely, exponentially ever-growing mega-world; it is still probabilistic in our world but deterministic across the fantasized infinity of worlds. It demonstrates how far materialist-determinists are willing to go to preserve their beliefs. I would say God is much more reasonable.

SophistiCat September 06, 2017 at 20:45 #102996
Reply to Mike Adams Yeah, the weirdness objection is the worst of the lot, and does not deserve any respect. Quantum mechanics is weird. The world is weird. Get used to it.
Rich September 06, 2017 at 21:43 #103007
Reply to SophistiCat So, the idea that every time there is a quantum event, which is happening continuously everywhere in the universe, a new world is created for every possibility of that event, seems reasonable? Now we need to discuss plausibility, taking into consideration that the sole reason for such an interpretation is to maintain some possibility of determinism in this mega-world (not even the universe we live in). It's pretty ridiculous, but on this fine forum it is taken seriously. Why? To maintain nice relationships with determinists. Yes, let's call this simply weird, when in fact it is preposterous, so that the determinists can present an interpretation which has yet to find any kind of definition whatsoever.
Mike Adams September 07, 2017 at 08:18 #103124
MW theory has been taken seriously by many scientists for decades...

(Plus your apparent hatred for determinists is bordering on the hysterical.)
Mike Adams September 07, 2017 at 08:31 #103126
Quoting Rich
So, the idea that every time there is a quantum event, which is happening continuously everywhere in the universe, a new world is created for every possibility of that event, seems reasonable.


The refusal to countenance this as a real possibility just demonstrates the problems human beings have with scale. Due to our arrogance as perceived 'masters of the universe' we relate everything to our own size/perspective and so things going too far either way seem ridiculous. If someone said there were probably 10 alternate universes people would easily believe it, but say there are trillions and trillions and trillions...etc and they can't comprehend it.

But if you consider the sheer number of atoms in a small piece of coal, or the space between the nucleus of an atom and its electrons, or the size of the universe, things outside our tiny scale seem far less ridiculous.
Wayfarer September 07, 2017 at 10:19 #103137
Quoting Mike Adams
In response to previous assertions that the reality of multiple universe is 'craziness' is it any more crazy than any of the other interpretations?!


Considerably more. The whole purpose of the Schrodinger's Cat thought experiment, was to try and depict the strangeness inherent in quantum mechanics with a life-size example. It was verging on satire; it is saying, 'if you take the mathematics of so-called 'superposition' literally, it means that...' - and then uses the infamous example of the live-dead cat to make the point. In some ways, 'Schrodinger's Cat' was an expression of exasperation, as much as anything.

(Physics joke: 'Erwin! What did you do to the cat? It looks half dead!' ~ Mrs Schrodinger.)

But, the Copenhagen 'interpretation' is another thing altogether. It's not a scientific theory at all, it is simply a way of characterising the kinds of things that Bohr, Heisenberg and to some extent Pauli would say could or could not be said on the basis of quantum physics.

Regarding Everett: here's an interesting if little-commented fact - Everett actually had the privilege of meeting with Bohr, several times, in 1959. But Bohr never showed the least sign of accepting the 'relative state formulation' and at this stage, Everett was already out of theoretical physics, on his way to becoming one of the mathematicians behind America's ICBM program.

This is all related in a Scientific American article called 'The Many Worlds of Hugh Everett', which notes the origin of the 'theory' as follows:

Everett’s scientific journey began one night in 1954, he recounted two decades later, “after a slosh or two of sherry.” [Incidentally the story notes that Everett became an alcoholic, a fact which contributed to his early death.] He and his Princeton classmate Charles Misner and a visitor named Aage Petersen ....were thinking up “ridiculous things about the implications of quantum mechanics.” During this session Everett had the basic idea behind the many-worlds theory, and in the weeks that followed he began developing it into a dissertation.


***
Quoting Mike Adams
The refusal to countenance this as a real possibility just demonstrates the problems human beings have with scale.


It's not scale that's the problem, it is the inherent outlandishness of the implications of there really being many parallel universes. The literal implication of this idea is that every possible variation of everything that happens, really does happen. So this very dialogue - the one you and I are participating in - is taking place in an infinite number of identical worlds, and also an infinite number of worlds that are different in only one degree, up to an infinite number of degrees of difference.

Here is the handy diagram that Wikipedia generously includes in the article on Many Worlds to explain this:

[Diagram from the Wikipedia article on Many Worlds: a quantum measurement branching into two outcomes]

although in this case, there are only two outcomes. But, I suppose, depicting an infinite number of outcomes would be problematic in a two-dimensional graphic.
Rich September 07, 2017 at 12:46 #103159
Quoting Mike Adams
But if you consider the sheer numbers of atoms in a small piece of coal, or the space between the nucleus of the atom and the electrons, of the size of the universe etc things outside our tiny scale seem far less ridiculous.


I am totally OK with flights of fantasy in science and philosophy, but if we are to start taking seriously a quantum theory interpretation that calls for a continuous formation of infinity upon infinity of newly made worlds without any evidence or any hope of ever having any evidence, just for the sake of having a determinist theory to hang some hope on, then we should also begin to take seriously the infinity of God, that provides equal determinism and equal hope. Fair is fair.

On the other hand, one can instead choose to explore Bohm's interpretation, which is causal, non-deterministic, and the only one that not only predicts non-locality (already observed) but also explains away all the weirdness in a very straightforward manner (e.g. the delayed choice experiment, non-local spooky action, etc.).

The big problem with Bohm's interpretation is that it allows for choice, something that the materialists-determinists just cannot accept because it is contrary to their faith, and faith is exactly all they have to hold onto - other than the fantasy of infinity upon infinity of new worlds springing out of nowhere continuously. Science indeed has become goal oriented, just like the teachings of the Church.
Mike Adams September 07, 2017 at 12:53 #103161
Reply to Rich I should really point out that I don't necessarily believe in determinism, I am just yet to hear an acceptable scientific explanation of how we can account for genuine agent control in an indeterministic universe.

I would be grateful if you could elaborate on how Bohm's interpretation is 'causal and non-deterministic', because at first sight the coupling of the two appears oxymoronic.
Rich September 07, 2017 at 13:31 #103165
Quoting Mike Adams
I should really point out that I don't necessarily believe in determinism, I am just yet to hear an acceptable scientific explanation of how we can account for genuine agent control in an indetermistic universe.


The agent is precisely what you experience every day in your life. Call it what you will: consciousness, your mind, the élan vital, or that which is choosing. The label matters not. You are the agent that is making choices. That which is peering out through your eyes.

What I wanted to point out is that the only reason the super-fantastical Exponentially-Forever-Growing-Infinity-Worlds interpretation (scientists are being very modest when they refer to it as Many-Worlds) is taken seriously at all is because determinists need it in light of quantum theory and they are desperate. But no matter what, in this world, everything remains probabilistic.

Bohm's quantum mechanics interpretation is very straightforward. It is causal because everything is real; there is no collapse. The quantum potential which guides the electron (the electron can be considered a wave perturbation) is defined by form, not distance, so it acts in all directions and at all distances equally (non-local action). Any change in the quantum potential will immediately affect the electron (this explains the Delayed Choice experiment). The equation itself is equivalent to the Schrodinger equation with different ontological implications. Here is a video which explains how it might all work. It's not precise because the narrator doesn't really understand Bohm, but it is good enough as a starting point.
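For reference, these ideas have a standard textbook form in the de Broglie-Bohm theory (a standard presentation, not necessarily Bohm's own notation). Writing the wave function in polar form gives the guidance equation and the quantum potential:

```latex
% Polar form of the wave function:
\psi = R\, e^{iS/\hbar}
% Guidance equation for the particle position Q(t):
\frac{dQ}{dt} = \frac{\nabla S}{m}
% Quantum potential (the extra term in the quantum Hamilton-Jacobi equation):
U_{Q} = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2} R}{R}
```

The quantum potential depends on the shape of R (through the ratio of its curvature to its value), not on its amplitude or on distance, which is the sense in which it is said to act "by form" and non-locally.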



Mike Adams September 07, 2017 at 15:23 #103175
Reply to Rich This guy is saying the pilot wave theory is deterministic...
Mike Adams September 07, 2017 at 15:28 #103177
Quoting Wayfarer
It's not scale that the problem, it is the inherent outlandishness of the implications of there really being many parallel universes.


It could be argued that it is our human-sized, skewed scale perspective that makes the notion seem outlandish, whereas in reality (given what we know about quantum mechanics) we should really be open to any theories which make sense mathematically.

Incidentally, I'm using the word 'scale' in a very wide sense, not simply to denote size but the parameters of human-centric experience.
Rich September 07, 2017 at 15:31 #103179
Reply to Mike Adams That's what I meant when I said he isn't fully conversant. He is just repeating what he read elsewhere. Bohm himself wrote that it is causal but non-deterministic. It has to be so, since the quantum potential, the initial conditions, is defined as a real probabilistic wave. The probabilistic aspect, Bohm said, could be consciousness.