Classical, non-hidden variable solution to the QM measurement problem
This interpretation eschews the entire Copenhagen framework which has led to the other interpretations. It seeks to be rid of all the spooky weirdness the various experiments have inspired. So no fundamental indeterminism, no pilot waves, no non-locality, no other worlds, and no weird collapse.
The claim is that uncertainty about the exact state of the measuring device is being ignored in the experiments, because we can't model that kind of complexity. This uncertainty gets entangled with the resulting measurement in the experiments.
So a measuring device is made up of many atoms and has an exact quantum state. This state is classical, in that all of its properties are well defined. The problem is that we can't measure the exact state of something made up of so many particles, because that would require an enormous number of measurements. So we simply set that issue aside when doing experiments.
But, because the particles used in the experiment are so small, the act of detecting them has a large influence on the resulting measurement. Thus, our uncertainty about the exact quantum state of the measuring device is transferred to uncertainty about the experiment. We end up with a probability distribution for the measured property of the particle, because every time the experiment is run, the measuring device is in an unknown, and different, quantum state.
So why does the wave function describe the experiment results? Because the electron gun, for example, creates an excitation in the quantum field, and if we knew the exact molecular arrangement of the detection screen, then we could 100% predict where the electron would hit. The excited quantum field in conjunction with the molecular arrangement of the screen deterministically determine a singular result. We just can't precisely measure the molecular arrangement of the screen.
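A toy numerical sketch of the kind of claim being made here - my own illustrative construction, not Binney's actual model: let the outcome be a strictly deterministic function of an unknown detector microstate, and a stable probability distribution emerges purely from our ignorance of that microstate.

```python
import numpy as np

rng = np.random.default_rng(42)

def outcome(device_microstate: float) -> int:
    # Deterministic rule: the result depends only on the (unknown)
    # microstate of the detector. No randomness in the physics here.
    return 0 if np.sin(137.0 * device_microstate) > 0 else 1

# We never know the device's exact microstate, so each run is
# effectively a fresh draw from our ignorance.
microstates = rng.uniform(0, 2 * np.pi, size=100_000)
hits = np.array([outcome(m) for m in microstates])

# Determinism plus ignorance of the device yields a stable frequency
# distribution over outcomes, here roughly 50/50.
p0 = (hits == 0).mean()
```

None of the numbers (the `137.0`, the uniform microstate distribution) come from Binney; the point is only that a deterministic outcome rule plus an unmeasurable device state reproduces statistics that look irreducibly random.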
This is the view (as best I can paraphrase it) put forward by James Binney, physics professor at Oxford.
Do you think this sort of approach is viable? Can it resolve the weirdness of QM by explaining it away in classical terms? Has all the woo of QM been a big mistake?
It also effectively claims a hidden variable: if only we knew this hidden state we can never know about, then we could recognise how an electron was pre-determined to hit the screen.
And no I don't think that approach is valid. If you don't know the quantum state of the measuring device, then how will it help to add another measuring device measuring that one, and hence changing its quantum state? :s
No, because he's basically rejecting point particles in favor of quantum fields. The fields have wave-like properties. When they interact with something like a detector, the exact quantum state of that detector determines the exact measurement. A particle is just a certain kind of excitation of the quantum field.
As such, there is no need for any hidden variables or non-locality.
There's a better way to state the problem. The Schrodinger equation doesn't show any sort of "collapse" to a single, probabilistically selected outcome. No quantum system evolving under it behaves like that. But that's what we get when we decide to measure something.
So why should a measurement, which is done by a physical device that is presumably made up of the same sort of quantum particles, result in a zeroing out of all other possibilities in the wave distribution to one single actuality?
That's the motivation for Many Worlds, btw. It removes the collapse issue, but at a cost of postulating a vast number of branching worlds that we can't interact with. But if you're willing to roll with that, it works.
Or you could do what Binney has done, and question the assumption that the Schrodinger equation must be modelling something real. His position is that it's a useful tool, but one with properties that are fantastical and unreal. And he avoids both ducking the question, as instrumentalists do, and proposing some form of idealism, which Bohr may have preferred. Instead, he favors one, classically real world.
There is an entirely different possibility, à la Kant. The real world is very different from our ability to conceive it, and the rubber meets the road at QM, where we find out the truth of our limitations.
There's also Pilot Wave theory which removes the non-determinism and collapse issue, but at the cost of non-locality. The theory is quite simple in that it supposes that each particle interacts with a guiding field. So, say, the particle always passes through one of the slits in the double slit experiment, but the wave passes through both and hence interferes with the particle's motion.
I am not sure about De Broglie's views, but Bohm is quite clear in his own book that his model, which he shares with De Broglie, is causal but not deterministic. I find this error repeated in many books and websites, which makes me wonder whether the authors have ever read the source material or are all just repeating the same error.
From Science, Order and Creativity
"This shows the interpretation, while being causal is not strictly deterministic. [Bohm's italics]. Indeed in the next chapter it will be shown that the possibility is opened for creativity to operate within a causal framework."
Not "strictly" deterministic. He qualifies the statement, and that's because he has a spiritual axe to grind.
Quoting Rich
De Broglie did qualify it as causal and deterministic, although the scientist could never predict it, because there would be no way to know the particle's position without interfering with it (and with its guiding field) by measuring it.
It sounds a bit like the hidden measurement interpretation (https://en.wikipedia.org/wiki/Hidden-measurements_interpretation)
A bit off topic, but since we're talking QM woo: does anyone know whether the double-slit experiment has been carried out with two double slits in series? If the wave function is more fundamental than a particle state and such an interpretation (hidden measurement) is correct, the measuring could "collapse" the wave function after the first double slit generates an interference pattern, but a second double slit could potentially generate interference again, showing that things have become wave-like in nature once more.
So what is a Quantum Field? Well, for one thing, it is probabilistic in nature. Or, as Bohm preferred, it contains possibilities. From this, he proceeded to delve deeper into the meaning of this by considering the nature of creativity and intuition. Thus, he presents a full ontological model that is real, causal, and not strictly deterministic. He never actually goes so far as to equate his notion of creativity with Bergson's Elan Vital, but he nudges as close to it as he could while still not saying anything that might jeopardize his job. The Copenhagenists were forever on his case, doing everything they could to marginalize his theory. Bell was enough of a renegade to resurrect it.
It can also be pointed out that De Broglie, in an essay about Bergson's philosophy, spoke quite positively of Bergson's thoughts and how they pre-dated but in many ways predicted quantum physics. It is quite a nice essay.
There are bigger costs than non-locality. The theory doesn't work. It can't be made relativistically invariant, requires absolute time, and is restricted to the position basis. Now, there are attempts to get round these (and other) issues, but as things stand quantum field theory doesn't work in this interpretation.
There is also a slew of results that refute hidden-variable theories of any kind, not least the Free Will Theorem.
That's not a postulate of Everettian QM, it's an inevitable consequence.
That it cannot be made relativistically invariant is a positive. Time as defined by Einstein in Relativity is a mess, with all of its sci-fi-inducing paradoxes, and should be jettisoned.
:s
http://advances.sciencemag.org/content/2/2/e1501466
The point is that it's equivalent in the context of observation and description. We can't get past uncertainty to give a description of the (pre)determined future. With respect to our capacity to describe what's going on, the exact but unknowable quantum state we can't measure is of no more use to us than a non-local state.
In terms of descriptive power, the exact state we don't know might as well be in some other galaxy. We don't have a description of it that allows us to predict the future which must necessarily occur.
It's not metaphysical in the slightest, it's the real physical situation. And, the other worlds are required to explain what we see in this world in terms of interactions with them - i.e. it is a testable prediction.
LOL! Only if you take Schrodinger's equation to be modelling a real state of affairs, and disregard all other interpretations, or the possibility that QM will be superseded by a better theory at some point.
I'm not saying that MWI is untrue, I'm just pointing out that it's one interpretation based on taking the wavefunction literally. Of course, I have no idea what's ontologically the case.
Quoting tom
That's not testable unless it makes predictions the other interpretations don't. And we don't have any way of going to, or viewing, those other worlds. It falls out of the math, nothing more.
Anyway, I didn't make this thread to debate MWI, or any other standard interpretation. I wanted to know what people thought about Binney's interpretation.
I'll restate it briefly. The wavefunction is not real. Rather, our uncertainty about the exact quantum state of the measuring device (which is classical in Binney's interpretation) is translated to the particle or particles in these experiments. If we could take into account the exact state of the measuring device, then the uncertainty about the particle's property in question would dissipate, and thus there would be no need for the wavefunction.
I heard about this watching a youtube video of a conference in which Binney and an MWI proponent got to talk for a while and then field questions. The MWI proponent conceded to Binney that MWI would be totally unnecessary if the measuring device is the culprit, but doubted that having more exact knowledge of its quantum state would make the uncertainty disappear.
Yeah, that's pretty close to what Binney was arguing for. I don't recall that he mentioned any history of the development of hidden measurement, which often happens with the other interpretations. Looks like the wiki entry goes a bit farther with it than I recall Binney mentioning, but I've only listened to the talk once.
Basically, that calls into question the whole uncertainty principle discovered by Heisenberg. Einstein wanted desperately to believe something similar - that the uncertainty was due to something we didn't know, either some hidden factor or some inherent fault with the apparatus. From my reading, Bohr met every one of Einstein's challenges along these lines (as detailed in Manjit Kumar's book Quantum). The final nail in the coffin was the Aspect experiments, which falsified the EPR conjecture.
Yeah, but I don't think it falsifies HMI (hidden measurement interpretation, which looks basically like what Binney was promoting). That's because the hidden variables are not in the particle, they are in the measuring device (our lack of knowledge of its exact state, or the fluctuations of the device when detecting the particle). Looks like HMI is not entirely classical in that the quantum state of the device does fluctuate, but maybe that's consistent with Binney stating at one point that particles are just excitations in the quantum field.
Binney does reiterate during his portion of the talk that the measurement device is always left out of the modelling of the experimental results, because it's too complex to model, and that arguments over the interpretation of QM always forget this.
The problem is to be found right here, in these two phrases. One cannot determine "the exact" molecular arrangement of the screen without referring to non-local factors. The screen is a material object, and determining the exact arrangement of the parts of any material object requires the consideration of outside forces, because all objects are constantly interacting with other objects in their environment. So there are always non-local unknowns: the gravity of the earth, sun, and galaxy, the expansion of space, etc.
Quoting Marchesk
We have to look at the hidden variables as unknowns concerning the activities of the universe - the passing of time in the universe. Since these unknowns concern the universe as a whole, unknown things about the way that time passes in the universe, we cannot say that the hidden variables are proper to the particles or to the screen; they are proper to the universe itself, and this is what makes them appear non-local.
The issue with Binney's approach, which has been previously discussed in depth in many books I read, is defining the state of the "measuring device" which must include the device and all that is entangled with the device including the observers. Ultimately, Binney's approach requires knowledge of the state of the universe from some outside perspective. Is this possible?
Given that the Bohm-De Broglie real, causal interpretation has the most easily understood ontology, has been experimentally supported (Bell, Aspect, and subsequent experiments detailing non-local effects), and leaves open the very critical notion of possibilities and creativity, there seems to be little reason to embrace other interpretations at this time. Both the MWI and Binney's interpretation are inaccessible, while Bohm's non-local predictions are continually tested and verified.
This is the thing. We cannot know "the state of the universe" because time is continually passing. So this so-called "state" is a state of change, which is inherently indeterminable because it defies the law of excluded middle. Until we know what time passing is, we do not know what a state of change is.
Yes, I agree. A "state" cannot be defined in a universe that is constantly changing. This is essentially the Heisenberg principle. Binney wants to define a state when none exists. A quantum interpretation must take this into account, which is why Bohm chose to use a holomovement as his ontological model. The waves are real (whatever they may represent) and the "particles" (really wave perturbations) are most likely to occur at the areas of greatest wave intensity. Since it is holographic in nature, all is entangled and in constant flux, and one can never define a single state.
This is the experiment that got me to change my mind about hidden variables some time ago:
https://en.wikipedia.org/wiki/Delayed_choice_quantum_eraser
I’d be impressed to see how people disagree with this experiment’s concluded implications in a manner consistent with the experiment’s methods and data.
The De Broglie-Bohm interpretation addresses the delayed choice by an instantaneous action at a distance by the quantum field. So the photon that has passed the slit is still subject to the quantum field at the slit. Of course, the observer, who also participates in the field, has an effect. With the possibility of free will, we have a causal model of QM which permits creative actions.
Though nowhere near as eruditely as others, I investigated Bohm’s views after first discovering this experiment, in what were then my attempts to hold onto determinism.
Quoting Rich
That is the crux of it, at least for me: does or doesn’t the causal factor which we term free will take place? I couldn’t deny the implications of the delayed-choice quantum eraser experiment - basically, that consciousness is in some way integral to the causal factors of the physical world as we know it. Which then brought me into numerous reveries regarding how determinism and free will could mechanistically co-occur. Though I’ve lost count of the details I read then, I remember Bohm’s interpretations of determinism being somewhat lacking in this regard - though very aesthetically pleasing in numerous other ways. I’d have to reread things to better understand/remember the De Broglie-Bohm interpretations.
What I was intending to get at is that the experiment appears to fully substantiate that consciousness has some top-down causal role in what physically, presently is. And it does this by accounting for all variables that could lead to alternative conclusions. I, at least, wasn't imaginative enough to find any. [just remembered, there's the multiple world scenario, but spiritual unicorns being on occasion seen by some is to me a far more plausible reality than that of the multiple world scenario]
I think that Bohm was necessarily cautious about declaring consciousness and/or free will is necessitated by QM. Primarily, he sought to bring real, orthogonal meaning to the QM equations that can be readily conceptualized. It was enough for him that there was room for consciousness, free will, and creativity in what he called the holomovement of the Implicate Order.
Bergson foresaw all of this, and though I have never read that Bohm was influenced by Bergson, De Broglie certainly did read Bergson and may have indirectly influenced Bohm.
Bergson, via his own philosophical process, does come to the conclusion that there is Free Will, and that it influences the evolution of Time (his capitalization) in a manner that corresponds to QM (De Broglie wrote an essay on this). Bohm dared not go so far, though he clearly implied it was there in his later works. One must remember that the instrumentalists are always ready to pounce on anyone who dares to open the doors to free will and creativity.
Quoting Rich
Yes, we all know how that goes in certain academic circles. There's the making-a-living part that goes hand in hand with reputation.
Agreed. This is why it is necessary sometimes to read between the lines to better understand what the debate is all about. Bohm's solution was ingenious but reputations were at stake.
I was having a discussion here a few months back about an interesting feature of the double-slit experiment, which is that the interference patterns are not rate-dependent. Whether you fire one photon at a time, or many together, you end up with the same pattern (up to a certain point). I posted a couple of threads about this on physics forums. It strikes me as being philosophically significant, although nobody on the physics forums was prepared to acknowledge that. To me it signifies that the probability wave is not a function of time, and from a relativistic point of view, therefore not of space-time.
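That rate-independence is what you'd expect statistically if each detection is an independent draw from the same underlying distribution. A toy sketch (my own, with an idealized fringe profile, not actual experimental data): whether hits are accumulated one at a time or in one big batch, the same pattern builds up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Idealized far-field double-slit intensity: cos^2 fringes under a
# single-slit sinc^2 envelope. The numbers are purely illustrative.
x = np.linspace(-1, 1, 400)
intensity = np.cos(10 * np.pi * x) ** 2 * np.sinc(2 * x) ** 2
p = intensity / intensity.sum()

n = 20_000

# "One photon at a time": 20,000 separate single draws.
slow = np.array([rng.choice(x, p=p) for _ in range(n)])

# "Many at once": a single batch of 20,000 draws.
fast = rng.choice(x, size=n, p=p)

h_slow, _ = np.histogram(slow, bins=40, range=(-1, 1))
h_fast, _ = np.histogram(fast, bins=40, range=(-1, 1))

# Both histograms show the same fringe pattern.
similarity = np.corrcoef(h_slow, h_fast)[0, 1]
```

The firing rate never enters the calculation; only the number of accumulated hits does, which matches the observation that the pattern is not rate-dependent (for photons that don't interact with each other).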
I wasn’t knowledgeable of this. … Them community-lovers. (there’s sarcasm here somewhere).
Quoting Wayfarer
Although I understand what you're referring to, I yet want to acknowledge the following: I can easily get lost in the maths with which modern physics is deeply entwined. [Could barely keep up with the more complex maths of ecology: When they started talking about 16-dimensional models of what was going on on the ground is when I started doodling things in my notebooks … pondering about the premises/axioms these folks used.]
Quoting Wayfarer
If I’m interpreting your statement properly, I agree. Still, in candor, you are addressing far more detailed concepts of physics and its notions of time than I’m currently making sense of, imo.
But the overall reason why I agree:
QM relies upon a time that is Newtonian-like; Relativity deems time non-absolute but, traditionally, deterministic, so that one obtains Block-Time. Both these notions of time can be deemed problematic. The absence of a physical theory of everything combining QM and Relativity attests to this, imo. Nor, imo, do theories fare better when attempting to unify QM with Relativity by declaring time to be non-real … although I’ve read one book where this was done.
The philosophical qualm then becomes the question of what time is, metaphysically. Obviously, this is not an easy issue to resolve. However, as always, once the metaphysical construct of what time is becomes better appraised by us, we'll hold new axiomatic foundations with which to address and remodel the mathematical representations of what's going on. Otherwise, we will continue to gauge reality through inappropriate axiomatic notions of time. Like Newtonian physics, this is useful to some extent, but, due to the rudimentary errors involved, it will not be able to resolve the questions which we currently seek answers to. The implications of QM here come to mind.
Even if we could make enough measurements simultaneously to know the exact state of a system, it would not dispel uncertainty.
It is a common misconception that the 'state' of a system is a specification of the exact value of every observable of the system - location, momentum, spin, energy, etc. But the Heisenberg Uncertainty Principle - which is core QM, not interpretation - tells us that for any pair of dual observables - of which position and momentum are the most commonly cited - a state that has a narrow range of possibilities for one of the observables must have a very wide range of possibilities for the other. This has nothing to do with the practical ability to make measurements and is instead based on what 'state' means in QM. It is a purely theoretical, mathematical result. To reject that result we would have to radically alter, or even jettison, QM, not just choose another interpretation.
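For reference, the theoretical result being described above is the Robertson uncertainty relation (stated here in standard textbook notation; this is my addition, not a quote from the thread or the talk):

```latex
\sigma_A \,\sigma_B \;\ge\; \frac{1}{2}\,\bigl|\langle [\hat{A},\hat{B}] \rangle\bigr|,
\qquad\text{so with } [\hat{x},\hat{p}] = i\hbar:\quad
\sigma_x \,\sigma_p \;\ge\; \frac{\hbar}{2}.
```

Since it follows from the operator formalism alone, no appeal to measurement disturbance or apparatus imperfection is needed, which is exactly the point being made about 'state' in QM.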
However, I wonder whether what your physicist was actually referring to was the notion of Decoherence, which is a fairly intuitive (some might say 'pseudo-classical', but one has to be careful using vague terms like that) way of explaining what happens in a measurement of a quantum system. The reason I think he might be referring to that is the reference to the interaction between the state of the observed system and the state of the measuring apparatus, which is what Decoherence addresses.
I don't know, sounded to me like he was denying that the Uncertainty Principle was fundamental instead of a useful approximation based on epistemic limitations.
Let's consider the implications "an instantaneous action at a distance". Clearly this would bring us outside the confines of space-time, as Wayfarer suggests we should look. So we can separate space from time, to see if this instantaneous action at a distance would represent a function of space, time, or both. Here are some speculations.
Nothing can traverse vast space in no time, so if instantaneous action is a reality we have to shrink space, such that what appears to us as a vast space, becomes a small space, such that something can traverse the small space rapidly. We know that space itself is not fixed, static, because of the phenomenon which is called the expansion of space. So we can allow the possibility that space can change in time.
Next we have the phenomenon of time itself. We live at the present, and we can know with a good degree of certainty that there is no material existence in the future. This is very difficult to grasp, but our capacity to create, and shape the world through free will choices demonstrates that it is impossible that material things can exist prior to the present. This means that the entire material world, must come into existence anew, at each moment of time. You will find this principle well outlined in some religions. We can speculate about a "big bang" of rapid spatial expansion at each moment of passing time, as the material universe comes into existence at each moment in time.
Prior to the present, in the future, must exist all the "information" (for lack of a better word, though we should refer to Neo-Platonic Forms here) which determines exactly how the physical world will be at each passing moment. This world of Forms is a non-spatial world, being prior to the spatial, material existence which we experience at the present. Here I believe we have the possibility of instantaneous action at a distance.
http://dbohm.com/david-bohm-holoflux-holomovement.html
Can we at all conceptualize what such an undivided whole might feel like? I propose that a dream may provide some clues.
I'm astounded by the number of comments on this video. Who would have thought that Pilot Wave is as popular as Kim Kardashian.
I expect he just expressed himself poorly - not an unusual occurrence for scientists trying to communicate to a non-scientific audience. The Uncertainty Relation is derived directly from the four postulates of quantum mechanics, with no additional assumptions*. It doesn't get more fundamental than that.
By the way, of the list you gave of weird things about QM - 'indeterminism, ... pilot waves, ... non-locality, ... other worlds, ... weird collapse' only one - non-locality - is potentially implied by bare QM - ie by the postulates. The others are implied only by interpretations such as Copenhagen, Bohm or Many Worlds.
As for non-locality, whether that is implied by bare QM depends on how we define locality. If we restrict it to observables - ie interpret it to say that one observation can only affect another observation in its future light cone, then bare QM does not contradict locality - even with Bell, Aspect etc. It's only if we define locality to mean that unobservables - in particular, quantum states - also cannot be affected outside of the future light cone, that bare QM, in conjunction with Bell and Aspect, implies non-locality.
* See for example Chapter 9 of Shankar's 'Principles of Quantum Mechanics'.
Just to cover the most important point first, so it can be ignored immediately: Copenhagen Interpretation does not even begin to explain reality, so should be ruled out as a scientific theory.
The Copenhagen Interpretation claims that at some time after a measurement has taken place, a certain rule must be applied which assigns to all possible outcomes a probability, and that one of these outcomes will happen. This is a can of worms, which turns out to be a normative theory, not a scientific one!
The Copenhagen Interpretation also suffers from a class of inconsistencies called the "Measurement Problem".
Yet people prefer a non-explanatory, normative, and inconsistent theory to a scientific one!
Anyway, certain, not yet feasible, tests under which Copenhagen and Everett make different predictions have been put forward. And, when the first quantum computer becomes operational, Copenhagen will be over.
I don't think that was the case. He repeated himself a lot, and was rather adamant. He doesn't accept certain postulates as being anything more than useful modelling tools. Also, the other speaker conceded that MWI would be unnecessary if the measuring apparatus is the culprit behind the wave-function probability distribution.
Binney did state several times that the measuring device has a precise quantum state that is mostly classical; we just lack the means to measure it accurately.
If you want, you can listen to Binney's portion of the talk. It's a bit long.
https://youtu.be/NKPI_wurNlo?t=37m21s
I know that you gave a reference here, but is it possible to state these four postulates in English? I took a look at the reference, and it seems to be written entirely in secret code (mathematics and symbols). Myself, I am very skeptical of imaginary numbers. I think they introduce an unknown element into mathematical equations, because it is unknown how imaginary numbers truly relate to real numbers, so that using a mixture of the two will produce uncertainty. There has not yet been produced a set of numbers which incorporates real and imaginary numbers into one order, demonstrating how they are differentiated from each other. For example, "zero" provides the means by which positive and negative integers are related to each other within one set.
I like the wording in section 9.4, "Applications of the Uncertainty Principle". You will find this: "Now the hand waving begins. We argue that...[two functions of the same order of magnitude are not strictly equal]... Once again we argue that...[the same inequality between two representations of the same magnitude] and get ... "
Here is the footnote:
"We are basically arguing that the mean of the functions (of X, Y, and Z) and the functions of the mean (..."
Incidentally, the only part of the chapter which seems to be well spelled out in plain English is the part concerning the time-energy uncertainty. I've found this uncertainty relationship explained elsewhere, and it seems to be excluded from QM uncertainty by the way the Hamiltonian operator is produced. As stated by Shankar, "time t is not a dynamical variable but a parameter". I believe I read elsewhere that this was a choice made by Von Neumann. The time-energy uncertainty has to do with the indefiniteness of light frequencies, especially over very short periods of time. Resolution appears to involve allowing for violation of the law of conservation of energy.
https://www.marxists.org/reference/subject/philosophy/works/dk/bohr.htm
It once again occurs to me that the absolute, fundamental issue here, as it is with Relativity, is the notion that instrumentation will provide all the necessary knowledge that we need in order to understand the nature of the Universe. The limitations are clear and Bohr spells them out in this paper. Quite literally, and simply, you cannot find out everything you want to know about something that is in continuous flux by attempting to freeze it in an instant with an instrument. And if you allow it to change in its continuous flow then whatever you may know has vanished and is no longer true.
Hence, if one wishes to understand the nature of the universe, it is necessary to give up the notion that instruments will ever provide a full description. QM and Relativity are attempting to provide solutions to instrument measurement problems and at their limits they cannot. Beyond that, other means must be used.
Heraclitus and Bergson were correct, as was Bohm, who realized that in order to pry deeper one must use intuition (the mind). If this is unsatisfactory, then one must learn to live within the limits of instrumentation. It appears that Binney has not. As I indicated, it is impossible to measure the state of an undivided whole if one is part of that whole.
Hidden variable theories are ruled out. I have absolutely no idea why people still cling to them. Well, actually I do - they want quantum mechanics to be about reality, but are desperate to avoid the implications.
I think I found the video you were referring to. Have only had time to watch a few minutes, so far I'm deeply alarmed.
Now it seems as if he is defending Copenhagen.
So, you've got "direct" evidence for the Higgs boson or gravitational waves? How about direct evidence for particles turning into waves when no one is looking? Maybe you have direct evidence for any fundamental particle, and how it behaves? Maybe you have direct evidence for the force of gravity?
But of course you must have direct evidence of wavefunction collapse, surely?
Some prefer to have a good explanation, which is possible, to direct evidence, which is impossible. This really is philosophy of science 101.
I'd ask him to explain the Elitzur-Vaidman bomb tester - a device that can identify whether a photon-detecting trigger attached to a bomb is operational, by NOT interacting with it.
But that does not seem to be what Professor Binney is saying. He is claiming that the uncertainty principle is epistemic due to the large number of states available to the measuring apparatus. i.e. the uncertainty principle is entirely due to our ignorance of interactions between apparatus and the system being measured.
Most assuredly, as do you.
I'm not going to criticise Prof Binney though because I haven't watched his video, just as I don't read designs for perpetual motion machines or proofs that one can trisect an angle. I don't need to because I know it either doesn't say what people think it does, or it is wrong.
Brilliant pick-up MU! I love it. I'd never noticed it before, as I only skimmed the rest of the chapter once I'd worked through the derivation of the uncertainty relation (item 9.2.14 in the Second Edition). It perfectly exemplifies what I'm saying. Section 9.2, in which the uncertainty relation is derived, is two pages of pure maths. As the chapter goes on, he starts to discuss interpretations and consequences of the relation that rely on more assumptions and approximations than are justified by the bare postulates. That's where that quote you found comes in.
As for the postulates, here's a rough attempt to give them in prose:
1. To any possible state of a system (collection of particles) there corresponds a unique set of information about it, called a 'quantum state', which is uniquely represented by a mathematical object called a 'ket' which is part of a collection of such objects, called a 'Hilbert Space'. [Later on, this is generalised so that kets are replaced by operators, in order to allow for non-pure states, but we won't worry about that here]
2. To every aspect of the system that can be measured as a number - called an 'observable' - there corresponds a unique mathematical object called a 'Hermitian operator'.
3. If a system is in state s, to which corresponds ket S, and a measurement is made of observable m, which corresponds to Hermitian operator M then, immediately after the measurement is made, the particle will be in a state s' whose associated ket has the mathematical property of 'being an eigenket of the Hermitian operator M', and the value observed from the measurement will be a number that is 'the eigenvalue of that eigenket'. Further, as assessed prior to the measurement, the probability of the state after the measurement having ket S' is proportional to the squared magnitude of the 'inner product' (another maths term) of S with S'.
4. The ket associated with a system evolves over time according to a known differential equation, called Schrodinger's Equation.
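Since that's all rather abstract, here's a toy numerical sketch of postulates 2 and 3 in Python (my own illustration - the choice of observable and state is just an example, not anything from the text above): a measurement of the Pauli-X observable on the state |0> yields each eigenvalue with probability given by the squared magnitude of the relevant inner product.

```python
import math

# Toy illustration (an assumption of this post, not the textbook's example):
# measure the Pauli-X observable on the state |0>.
# Pauli-X has eigenvalues +1 and -1 with eigenkets (1,1)/sqrt(2) and (1,-1)/sqrt(2).

def inner(a, b):
    """Inner product <a|b> for 2-component complex kets."""
    return sum(x.conjugate() * y for x, y in zip(a, b))

s = [1 + 0j, 0 + 0j]                       # system ket S = |0>
r = 1 / math.sqrt(2)
eigenkets = {+1: [r + 0j, r + 0j],         # eigenket for eigenvalue +1
             -1: [r + 0j, -r + 0j]}        # eigenket for eigenvalue -1

# Postulate 3: P(outcome) is the squared magnitude of the inner product
probs = {val: abs(inner(ket, s)) ** 2 for val, ket in eigenkets.items()}
print(probs)
```

Running it shows each outcome occurring with probability one half: the Born rule of postulate 3 in action for this particular state.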
Sure, if there was no such thing as the measurement problem, there would be no need to solve it.
Problem being that when a measurement takes place, Schrodinger's equation fails to predict the outcome, unless of course MWI is endorsed.
But the interpretations stem from the measurement problem, which is not accounted for by the mathematics. That's one thing.
The second thing is to recall history: when Newton proposed the law of gravity, his critics wanted to know how an invisible force acted at a distance on objects. This troubled Newton as well, but he didn't have a good answer at the time.
Now imagine Newton and allies telling everyone to shut up and calculate, the math was all that mattered. And maybe they did back then. But we know now that Newton's formulation of gravity was incomplete. And how did Einstein come up with a better formulation?
It certainly wasn't from math, it was from asking deep questions about gravity and related phenomena, and then doing (or finding) the required math to make it work for GR.
Quoting andrewk
That's entirely dismissive and not a good counter argument. You need to be able to show how Binney and advocates of Hidden Measurement are wrong about the measuring device introducing the uncertainty.
I'd go further: it's not a counter-argument at all, because no argument has been presented to counter.
The measuring device is the source of uncertainty in these experiments. You don't agree, fine. You don't want to watch the video or research his position, fine. You don't wish to counter the argument, fine.
But calling it not an argument? That's bollocks. In fact, I would say your response is irrational.
I don't know that his or the HMI interpretation is right. It could be entirely wrong. I just wanted to hear legitimate feedback. My suspicion is that taking into account the measuring device won't make the uncertainty of the particle disappear. Too many experiments suggesting otherwise. But it's worth considering, just in case our understanding of QM resides on not taking something into account.
The problem is it doesn't work. Take out the measuring device and one is talking about a different interaction in the world. It is no longer a state we are measuring with a device. A measurement without a measuring device is nothing more than an incoherent fantasy.
Practicing a measurement is inseparable from the measuring device. It makes no sense to speak as if our measurement (or description) is spoiling our knowledge. There is no measurement or description without it.
Binney is therefore stuck (or rather simply irrelevant in the first instance). The hidden effect of the measuring device cannot be used to predict with certainty. Even if we knew it, all it would do is describe the interactions of a measuring device as they occurred. All those interactions not involving the measuring device, or those which behaved otherwise to what we expected, would not be covered--uncertainty remains.
Is the cat in the box alive or dead? It won't be defined with certainty until it is measured (effect of the measuring device inclusive).
No, it's about accounting for the measuring device, not removing it.
The experiments are primary, not the math. Math is used to model and predict experimental results. Schrodinger's equation exists because of the double slit experiment and others like it.
So a natural question to ask is whether the math fully takes everything relevant into account. In this interpretation, the unknown quantum state of the measuring device is a potential source of something important not being taken into account.
I agree that the math is just a tool, and that lots of written and spoken words, as well as experiments, preceded and followed it. However, for science the math is what counts. For philosophers, everything else is most relevant.
Insofar as the "measuring device" is concerned, and I'm quite surprised that Binney does not recognize it, exactly what are the boundaries of the "measurement device", and how do you ever establish its state if it is constantly changing?
This topic was well discussed and Bohr addresses it in the paper I referenced above.
To put a sharp point on the problem, light (or photon) limits certainty. So, the next question for a philosopher is what exactly is light? - and I am referring to something more than the scientific definition.
Science isn't math though. It's an empirical investigation of the various phenomena in the world. As such, the world has the final say, not math. Experiments and observation are what ultimately drive the math.
That is a big problem. Perhaps as big as not being able to detect other worlds or pilot waves.
So, the laws of physics operate on the "unique set of information" and not on the actual physical system?
Quoting andrewk
But what does the operator operate on?
Quoting andrewk
None of this is a necessary axiom for doing quantum mechanics, though. Why not drop it?
Quoting andrewk
Except when a measurement is made according to 3.
Here's a question concerning this postulate, perhaps you can find an answer for me. Refer to the time-energy uncertainty which I mentioned at the end of my other post, and is described at the end of Shankar's ch. 9. If this uncertainty is excluded from the ket which represents the quantum state (as I believe it is, if I understand correctly), how is the ket said to be the "unique" representation? And how is the set of information which is said to be the quantum state, "unique"? I ask this because some at tpf claim that this unique set of information, and unique representation constitutes a complete description of the state.
But since this time-energy uncertainty is excluded, and time is made to be a parameter rather than a dynamical variable, as Shankar says, then it follows that there is some uncertainty with respect to the quantity of energy within the system. Accordingly, I would conclude that the ket which represents the quantum state, and even the conceived "quantum state" itself, is not a complete representation of the system, and probably not even an accurate representation of the system.
That is not historically accurate, and you really need to stop pretending quantum mechanics is a "model", it's not, it's a theory i.e. a statement about what exists in reality, how it behaves and why.
The Schrödinger equation dates from ~1925. The first double-slit experiment with particles (ignoring photons) was not performed until 1965! Entanglement wasn't observed until ~1984, "macroscopic" superpositions ~1990s, and decoherence was discovered in the 1970s, but I don't think it has been observed. Then of course there is the yet-to-be-realised quantum computer.
All of these phenomena, and many more besides, are deductions from the theory!
Alright, point taken, but the question is whether the Schrödinger equation is describing the real state of the particle before it's measured, or whether it just has predictive power as a useful tool, and the reality is something else. After all, what the hell is a probability wave supposed to be?
In the context of Binney and HMI, the reality would be that our epistemic uncertainty about the complex state of the measuring device has a large influence on the particle it's detecting.
If MWI is the case, then probability wave is a description of other worlds. Or it could be pilot waves guiding the particle. But then again, perhaps reality is a jumble of possibilities when we're not looking? Question is why does measurement make it classical? Why is our lived experience mostly classical?
The sentence is way too vague to be considered a claim. 'Uncertainty' could mean any of several very different things, each of which involves a completely different discussion. The statement reminds me of some of the debating topics we used to have, when there was a (mercifully temporary) fashion to set deliberately vague topics in order to make the debates less predictable. A favourite was 'The end is nigh'.
Just to pick up one of the possible meanings, if 'uncertainty' refers to the probabilistic nature of the value obtained from the measurement, as assessed prior to the measurement, and based only on information about the observed system and not the measurement apparatus, then that agrees with the Decoherence theory, which is widely accepted. If that's what was meant then the prof is not saying anything controversial, or new, at all.
If the ket is a complete description then the function that maps physical states to kets is one-to-one ('injective' is the technical term). If it is not complete then the function is many-to-one, like for instance the functions f(x)=x^2 and g(x)=sin x.
I didn't completely grasp all of your question, but I answered it as best I could. Let me know if I left anything out.
Quoting tom
Quite right. I forgot to add that bit.
Quoting tom
You're right that there's no need for it in the context of a discussion about the 'measurement problem' (which I'm guessing this thread is somewhat related to, but I'm still very unsure of that), as Decoherence gives us all we need (I think). But in applied QM it is very useful as it removes the need to think about the measuring apparatus.
I don't know what that means, though, unless one is an anti-realist, which I'm not.
Quoting Wayfarer
But how do you go from probability to actuality? What is the mechanism? Is this just brute?
At the beginning of the talk I linked to, Alan Bar introduced the measurement problem for the audience, then Simon Saunders argued for MWI, followed by James Binney discussing HMI, I guess, although he didn't give his interpretation a name. The Youtube title is: "The 1st Ockham Debate - The Problem of Quantum Measurement - 13th May 2013".
Thank you, that clarifies it nicely. Given that it's about the 'measurement problem', the references to uncertainty will have nothing to do with the Heisenberg Uncertainty Relation and instead will refer to the lack of knowledge prior to measurement about which of the eigenvalues of the ket of the observed system will be the result of the measurement.
Discussion of that issue involves interpretation, not just core QM, as is indicated by the letter 'I' at the end of the two abbreviations 'MWI' and 'HMI'. So it would appear that the people involved are debating interpretations and not challenging the postulates of QM, or deductions therefrom like the Heisenberg Uncertainty Relation, which would have been a worry.
No, that's not what he was arguing for. Binney stated several times that the probabilistic nature of the value obtained was due to our epistemic uncertainty about the exact quantum state of the measuring device, and not anything fundamental about the state of the particle prior to being measured. A little reading up on HMI reveals that this particular interpretation understands probability to be entirely epistemic (our ignorance or inability to measure everything accurately) and not ontological or fundamental.
My understanding is that decoherence has to do with normal macroscale objects, such as detectors, interacting with isolated quantum systems, which are fundamentally probabilistic, or at least the math describes those systems as being so, causing them to lose their coherence, leaking the quantum information out into the wider environment.
But it doesn't do away with superposition. In the cat thought experiment, although it explains why we don't see both a live and dead cat when opening the box, it doesn't explain what happens to us and the rest of the universe. That still requires an interpretation, and I believe MWI is compatible with decoherence.
Yes and no. I'm pretty sure Binney challenged taking the postulates of QM literally (realistically) when interpreting the results. He said they were very useful tools, but the Schrodinger Equation, for example, has unreal properties (such as leading to a superposition of states). He also mentioned the Heisenberg Uncertainty Relation, and I'm pretty sure his interpretation is at odds with taking that realistically, since he thinks probability is epistemic, and not fundamental. Thus, a measuring device has an exact quantum state (the state that all its particles and molecules are in), and not a wavefunction.
Well that's the whole measurement problem in a nutshell. All the big arguments are about this very point. Realists want to insist that there is a real particle, something 'mind-independent'; that is just what is being called into question. One of Bohr's quotes is 'that there is no particle prior to the act of measurement'; which is why Einstein asked the rhetorical question 'does the moon still exist when nobody is looking at it?' It is why there are all the arguments in the first place. Many Worlds simply outsources the problem to 'other worlds', but it seems a desperate remedy to me.
Let's say Bohr was right. Why the interference pattern, then? Why not some other probability distribution? It's highly suggestive that something is interfering. After all, that's what observable waves do.
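For what it's worth, the interference can be shown with a toy two-path calculation (my own sketch in Python; the numbers are arbitrary, not from any actual experiment): adding the two slit amplitudes before squaring produces the cross term that makes the fringes, while adding the squared magnitudes - as one would for classical alternatives - gives a flat distribution.

```python
import cmath, math

# Toy two-path amplitude sum. Amplitudes from two slits acquire a relative
# phase that depends on screen position x; adding amplitudes before squaring
# gives fringes. Units and values are arbitrary illustrations.
wavelength, slit_sep, screen_dist = 1.0, 5.0, 100.0

def intensity(x, interfere=True):
    phase = 2 * math.pi * slit_sep * x / (wavelength * screen_dist)
    a1 = cmath.exp(0j)                  # amplitude via slit 1 (reference)
    a2 = cmath.exp(1j * phase)          # amplitude via slit 2
    if interfere:
        return abs(a1 + a2) ** 2        # both paths open: cross term survives
    return abs(a1) ** 2 + abs(a2) ** 2  # which-path known: probabilities add

fringes = [intensity(x) for x in range(0, 21)]
flat = [intensity(x, interfere=False) for x in range(0, 21)]
```

With interference the intensity oscillates between 0 and 4 across the screen; without it, it is a uniform 2 everywhere. The oscillating cross term is what makes the distribution look "wave-like" rather than like any other probability distribution.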
And science has a track record of positing what are initially unobservables, and then coming up with instruments to make those observations. At one time, atoms were just theoretical posits. Anti-realists could have (and maybe did) argue that they were useful fictions for making sense of experiments at the time. But now we can observe them, so obviously they are more than useful fictions.
Its gravity certainly does. The unobserved particles have properties that are important to atomic structure and fields of force. It's similar to noting that the floor keeps holding you up even when you don't notice it. Somehow the stuff of everyday life is held together despite our not observing all the particles making it up.
The "useful fiction" argument doesn't make sense in the way it's often termed. For physics, the important question is its descriptive power. What makes it "real" is that it accounts for the world, not a particular empirical form-- I mean, where is the state of "energy"? Yet we don't go around saying energy somehow isn't real.
Let's imagine for a moment that atoms weren't a particular state of the world (which is sort of true of the Bohr model). Would it mean that atomic theory wasn't how the world worked? Not at all. If our objects still behaved in that way, atomic theory would still be expressed by the world; it would be a description of how the world really worked, despite the absence of particular atoms which someone could pick up and hold with atom tweezers.
As I understand it, which may be not very well, the probability wave really is a distribution of probabilities - nothing more than that. So it's not actually 'a wave' at all, it simply behaves like a wave - it is 'wave-like' but there really isn't a wave as such, because it doesn't transmit energy or move in a medium, like light waves or water waves. That is why, I think, it is 'rate independent' - the 'wave pattern' really is embedded in the fabric of reality itself, it is of a different order to the physical. That is why the 'nature of the wave function' is the metaphysical question par excellence.
Quoting Marchesk
But the meaning has been changed in the meantime. 'Atom' used to mean 'indivisible particle'. But if you look up the definition of 'atom' now, it is 'the smallest particle of a chemical element that can exist'. But as soon as the atom was shown to be mainly empty space, then I think it ceased to be an atom in the classical sense, i.e. a truly 'indivisible particle'. The idea of 'atoms and the void' could no longer hold. So the atom is no longer the ultimate explanans that it was considered to be by materialism. That is the sense in which physics has undermined materialism.
That's fine. I am sympathetic to everything you report him as saying there, and it's a widely held interpretation. All I was concerned about was whether he was rejecting either the postulates of QM, or results derived from them alone, such as the Heisenberg Uncertainty Relation. It is now clear that he was not. Questions of whether certain things are epistemological or ontological are matters of pure interpretation, since the postulates make no distinction between the two.
In your later post you said Binney said people shouldn't take the QM postulates literally or realistically. I can agree with that too, because it also is about the interpretation, not the calculation. He's not saying we shouldn't believe the predictions they make, which are purely about observations. I do not subscribe to the ontological perspective sometimes known as 'Realism' - but which I think of by the (IMHO) more accurate title 'Materialism'. I lean towards Bohr rather than Einstein.
(Y)
This is where I get confused about the Copenhagen interpretation. Is it anti-realist, or is it saying that reality is this non-classical stuff of possibilities that behave like a wave? That seems to be two different interpretations.
The first one leaves questions unanswered. It's the sort of thing Landru of the old forum would have been happy to endorse. Our experiences have a structure. We don't know why, but realism just presents a regress, etc. In terms of the double slit experiment, we don't know why it results in an interference pattern when there isn't a detector on one of the slits. That's just what happens, and physicists developed the math to describe/predict it, because science is merely concerned with prediction (on Landru's account of it).
While the second one, that the world is actually made of probability waves until a measurement (or decoherence) takes place, is puzzling, weird, and almost mystical. The second one is making an ontological claim.
I didn't explain what I heard well. It was only after several pages of replies that I figured out how to express it clearly.
You're not alone.
I would simply make the point that 'the Copenhagen Interpretation' is not a scientific hypothesis. It is only a description of the kinds of things that Bohr, Heisenberg and Pauli used to say in debates and discussions about interpretation; the term itself wasn't even coined until the 1950's.
Quoting Marchesk
During the early days of quantum physics, there was quite a bit of mysticism about. (See Quantum Mysticism - Gone but not Forgotten. And have a read of The Mental Universe.)
The wavefunction is NOT a probability wave! It's not even a probability amplitude wave! According to Copenhagen, it does not exist. According to Binney it seems to not exist either.
According to the only known REALIST interpretation that agrees with QM, the wavefunction is an element of reality, but exactly what? The mathematical properties of the wavefunction correspond to features of reality, and the only way to make sense of this is to accept that the wavefunction represents a branching (and occasionally recombining) world-density function.
It turns out that under realist QM - i.e. the sort where the only dynamics is UNITARY evolution of the wavefunction, then probability is not part of the theory, it is not required. That is not to say that probability is not an extremely useful MODEL in most circumstances.
Quoting Marchesk
For Binney, quantum mechanics is not a physical theory. If you ask me, everyone else in the discussion section of the video you posted was embarrassed into silence. It was a car-crash. At least he does not believe in objective probability - i.e. his version of QM is a stochastic theory of human ignorance.
Quoting Marchesk
For systems of more than one particle, QM takes place explicitly in Hilbert space - not in space-time. This should at least indicate that the idea of "probability waves" flying around is wrong. In fact, in the Heisenberg picture, the wavefunction is stationary - it does not change - and all the dynamics is contained within the observables! Why does no one talk about observables flying around?
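A quick numerical check of that point about the Heisenberg picture (my own sketch, using an arbitrary two-level Hamiltonian and observable, not anything from the thread): evolving the state with the observable fixed, or evolving the observable with the state fixed, gives the same expectation value.

```python
import cmath

# Compare the Schrodinger and Heisenberg pictures on a toy two-level system:
# Hamiltonian = Pauli-Z (eigenvalues +1, -1), observable = Pauli-X,
# initial ket = (|0> + |1>)/sqrt(2). All values are arbitrary illustrations.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(A):
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def expect(psi, A):
    """Expectation value <psi|A|psi>."""
    Apsi = [sum(A[i][j] * psi[j] for j in range(2)) for i in range(2)]
    return sum(psi[i].conjugate() * Apsi[i] for i in range(2))

t = 0.7
U = [[cmath.exp(-1j * t), 0], [0, cmath.exp(1j * t)]]  # exp(-iHt) for H = Z
X = [[0, 1], [1, 0]]                                   # observable = Pauli-X
psi = [1 / 2 ** 0.5, 1 / 2 ** 0.5]                     # initial ket

# Schrodinger picture: the state evolves, the observable is fixed
psi_t = [sum(U[i][j] * psi[j] for j in range(2)) for i in range(2)]
schr = expect(psi_t, X)

# Heisenberg picture: the observable evolves, the state is fixed
X_t = matmul(dagger(U), matmul(X, U))
heis = expect(psi, X_t)
```

Both numbers come out identical, which is the point: the "motion" can be put entirely in the observables, with the ket stationary.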
Quoting Marchesk
Decoherence.
The Copenhagen interpretation is anti-realist; it is a purely epistemic theory. The "Standard" interpretation, taught at most (American) universities, calls itself Copenhagen, but it's not. It is based on the famous book by von Neumann, "The Mathematical Foundations of Quantum Mechanics". That interpretation definitely has a realist feel to it. In British universities, the treatment tends to be closer to Dirac, which again feels realist.
Quantum mechanics is quite hard, and is made more so by obfuscators like Binney. You aren't going to be asked for an essay on ontology or epistemology in your final exam, but you are going to need to shut up and calculate.
Quoting Marchesk
Yes we do!
Feynman said that nobody understands quantum mechanics and, assuming that wasn't taken out of context, I always understood him to be saying that nobody knows why the double slit and other experiments give the results they do. How many nuances to the various interpretations are there, btw?
Quoting tom
What is Hilbert space, and what makes it any more real than probability waves? And I don't mean what is the math, I mean what does the math represent?
Actually, that video was pretty amazing! Maybe there really is something to pilot waves. I didn't know there was a classical system that produced similar results for the double slit experiment. And you can see it happening! Definitely helps visualize de Broglie's interpretation.
I guess the bouncing silicone oil drops creating the standing waves are a classical pilot wave system.
I think I understand what you say here: the ket describes the state of the system in such a way that two distinct states may have the very same ket. Therefore the ket cannot be a complete description of the state. To give a very general example, an apple and an orange may both be represented by the same mathematical symbol (1), but this does not mean that these two things are the same; it means that the mathematical way of describing them, as each being one, is an incomplete description.
Quoting andrewk
The other issue I was trying to bring to your attention is the nature of the time-energy uncertainty relation. Some may say that this uncertainty relation is just a form of expression of the Heisenberg uncertainty, but it is impossible that these are the same uncertainty because time and energy are not canonically conjugate variables.
Time cannot be brought into the ket in the same way as the other variables, so it becomes a parameter. I believe that this is because time, t, is not an observable, and any relation between t and an observable is the relation of a function. I understand that Von Neumann wanted to make time an operator, most likely to maintain consistency with relativity. Apparently he tried having a t for each particle of the system, and also tried a designated t particle, to no avail. Consequently, field mathematics was utilized instead, to account for this difficulty with t. But field theory produces what I believe to be absurd conclusions, such as symmetries and anti-matter.
So the question is what is the relationship between these two distinct uncertainties, the time-energy uncertainty and the Heisenberg uncertainty. Where exactly do these uncertainties lie, concealed within the mathematics, and what happens when they are brought to bear upon each other? The Heisenberg uncertainty is well documented and I assume the best expression of it is found in the Schrodinger equation. I assume that the time-energy uncertainty must be concealed within field theory. There's a Soviet paper, by Mandelshtam and Tamm (Journal of Physics, vol. 9 no. 4, 1945), entitled "The uncertainty relation between energy and time in non relativistic quantum mechanics" which is quite descriptive. Also, there's a paper I haven't yet read, by D. A. Arbatsky (2006), entitled "The certainty principle". If you have the time, see if you can evaluate the mathematics of this "certainty principle". Intuitively, I feel that there is a mistake in Arbatsky's claim that the Heisenberg uncertainty is more fundamental than the time-energy uncertainty, and this might result in the falsification of Arbatsky's claim that the certainty principle is more fundamental than the uncertainty principle. But this may depend on one's approach (one's prior assumptions).
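For reference, here is the Mandelshtam-Tamm relation as it is usually stated nowadays (my paraphrase, not a quote from their paper): for any observable A,

```latex
\Delta E \,\Delta t_A \;\ge\; \frac{\hbar}{2},
\qquad
\Delta t_A \;\equiv\; \frac{\Delta A}{\left|\, d\langle A\rangle/dt \,\right|}
```

Here \(\Delta t_A\) is the time it takes the expectation value of A to change by one standard deviation, so time enters only as a parameter of the evolution, consistent with Shankar's remark that t is not a dynamical variable.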
I believe that it was Bohm who, in one of his writings, suggested that there really isn't a particle in the De Broglie-Bohm Interpretation, but rather that what we are witnessing is a wave perturbation. This would make the theory's realist properties quite straightforward to understand from a conceptual point of view. The impulse behind this wave perturbation is something to ponder, which is why Bohm suggested that his model leaves open the possibility for creative impulses in his Implicate Order. The video was quite interesting.
Really like what he is doing on YouTube.
Quoting Marchesk
Would it surprise you to learn that classical mechanics can also be formulated in terms of wavefunctions on Hilbert space?
No one thinks there are probability waves flying around in classical physics. What exists are rocks, chairs, planets ... and they aren't in Hilbert space either.
Quite right, they are not the same uncertainty and, as far as I know, Heisenberg had nothing to do with the time-energy relation. The explanation of the relation in Shankar is just a hand wave, not a mathematical derivation. When I looked it up in my hard copy I found some scathing comments I had written about it at the time I read it, which is probably why I dismissed it from my mind and didn't remember it.
I have not studied the time-energy relation and so do not know whether it can be deduced from the bare postulates. My pencilled comments on the text indicate a suspicion that other, non-core, assumptions are being used. But because the Shankar presentation is so lacking in detail, one cannot be sure of that.
Quoting Metaphysician Undercover
According to Wikipedia, those are the people who invented that relation, and published it in that paper. One would have to read the paper to find out what assumptions it uses, and I have not read it.
I suspect the time-energy uncertainty relation is not very important anyway since (1) it only appears in a short appendix to the Shankar chapter on uncertainty relations and (2) while the wiki article on Heisenberg highlights his uncertainty principle (for complementary observables) as the discovery for which he is best known, the energy-time relation is not directly mentioned in the articles on its discoverers, Mandelshtam and Tamm.
As I understand it, the time-energy uncertainty is closely related to the local/non-local dichotomy. Von Neumann could not bring time into the QM equations as he desired, as an operator, a conjugate variable of the Hamiltonian operator for energy. Others, like Pauli, saw right away that this was an impossibility, since time is not observable, and they were willing to accept the consequences. So time became a parameter; it is therefore outside the system. This leaves an uncertainty relation between the quantum system and its environment which determines time. That allows for relations between the internal and the external of the system which are not constrained by the laws of physics.
That is not a question for philosophy, but for science - unbeknownst to most philosophers, the divination of reality has been passed on to science (for around 400 years now).
What a philosopher would ask is a question that science will never address, but desperately needs an answer to (so it will not be so easily commandeered by mindless megalomaniacs), a question that we all need an adequate answer to, the Greatest of the Great Questions of Life: that of "Why Bother?"
Without an adequate answer to that Greatest of the Great Questions of Life (for you must admit, you must answer that question before you even begin to address the others), all will crumble in uncertainty. (I have the answer, by the way) (and no - it is not a smart-ass answer. so don't go there).
To save myself from having to respond later, the answer is "Because consciousness is a good thing" (think of the alternative). This answer, by the way, also reveals the Ultimate Value of Life - Higher Consciousness (which humans have, but do not adequately use yet), which gives us the Ultimate Goal of Life (securing the Ultimate Value, naturally). In our case, it would be 'securing higher consciousness in a harsh and deadly universe". Now that we have an Ultimate Goal, we have the Ultimate Arbiter in determining good from evil (their being goal-driven), and with that ability, we can build worthwhile individual lives (with a clue) and relevant civilizations (finally).
Why must this question be answered first? If the philosophical nature is simply "the desire to know", then why can't we direct our inquisition toward anything we want? Philosophy begins in wonder, and we can wonder about anything without having any notion as to why we are wondering about this. What makes you believe that this particular question, "Why Bother", must be answered before we ask all those other questions?