Randomness
I keep running into two different concepts of randomness, so I wanted to see what you all think.
Is random the unpredictable?
Or
Is random a situation with various possible outcomes?
Perhaps 'unpredictability' contains a bit of anthropomorphism of experience, while randomness suggests a state of affairs?
Doesn't unpredictability suggest an unsuccessful attempt to predict, while randomness seems to carry no such implication?
If so, then what does this mean when we select something at random? That we don't know what we are gonna get?
But what if we select something at random out of, say, 10 possible choices? Then we know what we are gonna get; we are gonna get one of the 10 possible choices, but it was still a random selection. Is that saying we have simply removed the decision from our hands, and allowed variables we can't see to make the selection?
Nature doesn't produce dice. Only humans do. However they still illustrate the essential principle of how to understand randomness or spontaneity in nature.
So you are making the standard Laplacian complaint that, in principle, complete knowledge of nature is possible, and so all future events can be calculated from determinate microphysical laws.
Well firstly, we now know that Newtonianism in fact fails at the limits. Quantum mechanics says existence is irreducibly indeterministic - and that ontic claim can even be phrased epistemically, in terms of this being due to the fact we can't ask two different (non-commuting) questions of reality simultaneously. Like: where are you exactly/what is your momentum exactly?
And complexity theory shows that the very idea of calculation is also self-limiting in this fashion. Because calculation is a digital way of describing an analog world, there is always round-off error in any attempt to model real world events.
No computer could ever specify the initial conditions of a calculation to an infinite number of decimal places. And if error compounds exponentially while the calculation itself proceeds only polynomially, then error must swamp any claim to accuracy within a few steps when describing a non-linear or chaotic event (one with fewer constraints than the kind of regular dynamics that Newtonian mechanics was designed to describe).
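The compounding-error point can be illustrated with a short sketch (the logistic map and the 1e-12 perturbation are my own illustrative choices, not anything from the discussion):

```python
# A minimal sketch of how round-off-scale error compounds in a chaotic
# system: iterate the logistic map from two starting points that differ
# only at the 12th decimal place.
def logistic(x, r=4.0):
    # r = 4.0 puts the map in its fully chaotic regime
    return r * x * (1.0 - x)

a, b = 0.3, 0.3 + 1e-12   # indistinguishable to any finite measurement
max_sep = 0.0
for step in range(60):
    a, b = logistic(a), logistic(b)
    max_sep = max(max_sep, abs(a - b))

# Within a few dozen steps the tiny discrepancy has grown to the size
# of the state itself, swamping any claim to accuracy.
print(max_sep)
```

Roughly speaking, the separation doubles per step, so an error in the 12th decimal place saturates the whole state in around 40 iterations.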
So we know that this idea of a mechanically deterministic universe is itself an idealisation. It is not the "natural state" of nature. Newtonian physics describes the world after it has reached the limit of a process of symmetry breaking and thus spent its many degrees of freedom. It is the world in as determinate a state as it can get - yet not actually determinate, as quantum physics and complexity theory reveal.
Anyway, back to dice and how they illustrate this.
We make dice as perfect and symmetrical as we need them to be. Which in turn means we are matchingly indifferent to imperfections that are beyond what might affect our purposes in having a die.
What we want is a die that a thrower can throw in a fashion which leaves them with no way of telling what number will roll. So it must spin easily (bevelled edges) and yet fall flat on one face (break the symmetry of spinning) without favouring any one outcome. So if you are really concerned about dice being fair, you buy machined dice. You pay extra for the engineering and certification.
You insist on certainty that the die will break symmetry in a way that is entirely spontaneous to you.
But if you wanted to insist on that level of spontaneity in terms of nature itself, then you would have to get down to harnessing some kind of quantum noise or quantum emission process. Even nature doesn't know when an atom will decay - just that, in aggregate, decays follow a completely exact and predictable Poisson distribution. (The propensity to decay remains constant in time - which tells us something deep about the constraints that form nature, that is, our particular Universe.)
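The "propensity constant in time" claim can be sketched numerically (the decay probability, atom count, and tick count are all invented for illustration):

```python
# A toy sketch of a constant decay propensity: each surviving atom has
# the same chance of decaying on every tick, no matter how long it has
# already survived.
import random

random.seed(1)
p = 0.01                      # assumed constant per-tick propensity
survivors = 100_000
rates = []
for tick in range(5):
    decayed = sum(1 for _ in range(survivors) if random.random() < p)
    rates.append(decayed / survivors)
    survivors -= decayed

# Each observed rate hovers around p: the population "forgets" its age.
print(rates)
```

The design choice here is that the per-atom chance never changes; nothing "builds up" towards a decay, which is exactly the contrast with a classical pressure-building system drawn later in the thread.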
Quoting Jeremiah
When it comes to dice, we could in theory measure these further variables. But until gamblers start troubling casinos with such high tech approaches to beating the house odds, no one has reason to care.
So the human situation shows directly that randomness is about how much we have practical reasons to care about constraining the physics of events. We don't let gamblers drop dice. They must roll them properly.
The difficult mental leap - the one I've argued for - is to see that this principle is true of nature also. And quantum physics is the best argument. Nature can only ask questions of itself (hey little particle, what's your exact location/momentum?) to a limited degree of precision. And yet this doesn't really matter on the general scale of things.
Quantum fluctuations only disrupt nature on the tiniest or hottest possible scale of being. The Universe itself is now so cold and large that it is pretty much entirely classical in practice. There is infinitesimal chance of it doing something "quantum" like winking right out of existence, or fluctuating into some other bizarre arrangement.
So indeterminism is basic to existence. And yet existence has become a place where everything is more or less as good as determined.
The question then becomes, why do humans still find randomness useful? Why do we invent ways of introducing chance back into the world of dull mechanical routine?
Obviously it is because we enjoy creating zones of freedom in which we can pit our wits. Games of chance are a way to practice our skills at strategy and prediction against "unpredictable nature". And so the kind of randomness we are really modelling there is the unpredictability, or non-computability, of complexity.
We can try to calculate the future. But such calculation is ultimately impossible. Which is where the pleasure and pain of being lucky/unlucky comes in.
Quoting Jeremiah
Yes, you are describing epistemic uncertainty - something we have got mathematically and mechanically good at "manufacturing".
And then the deeper issue you want to address is ontic uncertainty - the randomness of nature itself.
And as I say, we can either rely on our own actions to produce our desired level of uncertainty (as in insisting gamblers roll dice properly, and don't bring moon-gravity measuring devices with them into the casino). Or we could try to harness uncertainty by tapping into nature's own level of physical indifference. We could get down to quantum level processes. Or step up to uncomputable non-linear or chaotic processes.
Of course, people will still insist that at the bounding extremes of nature - the micro-physical and the macro-complex - Newtonian determinism must still reign.
But that is simply old-hat physics. We know that at the limit, things are actually different. The physics of the classical middle ground - the computationally simplest possible physics - no longer applies.
Humans are part of nature so if they do, then nature does.
"So you are making the standard Laplacian complaint that, in principle, complete knowledge of nature is possible, and so all future events can be calculated from determinate microphysical laws."
No, I am not. I never made any such claim.
But nature is an emergent mix of constraints and freedoms. So humans are free to do stuff that nothing else in nature can manage.
Quoting Jeremiah
OK, well you can make it clearer exactly what your ontic commitments are when you say stuff like.....
Quoting Jeremiah
"OK, well you can make it clearer exactly what your ontic commitments are when you say stuff like..."
Sure it is right at the start: "For all we know" And also, "our inability to see. . . "
But it was interesting watching you debate with yourself.
If you actually thought you were suggesting the kind of constraint on the physics of tumbling dice that is patently irrelevant, you should have made that clear. The fact you mentioned it can only be taken to imply you felt it was a likely, if not a definite, ontic possibility.
Sorry to be so logical about this. To be taken seriously is obviously not what you really want here.
"Sorry to be so logical about this."
Is that what you are calling it?
Something like this is probably the most useful way to grasp the concept of randomness: as 'equiprobability', or the equality of probable outcomes. Conversely, 'non-randomness', or order, would be the culling of probable outcomes so that some are more likely than others. This 'culling' can be thought of in terms of putting 'restraints' on the range of possible outcomes. Gregory Bateson uses the lovely old image of the monkeys on a typewriter to get the point across:
"If we find a monkey striking a typewriter apparently at random but in fact writing meaningful prose, we shall look for restraints, either inside the monkey or inside the typewriter. Perhaps the monkey could not strike inappropriate letters; perhaps the type bars could not move if improperly struck; perhaps incorrect letters could not survive on the paper. Somewhere there must have been a circuit which could identify error and eliminate it." (Steps To an Ecology of Mind, "Cybernetic Explanation").
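Bateson's "restraints" picture can be sketched in code (the key set and the target phrase are my own toy choices): randomness as equiprobable strikes, order as the culling of outcomes.

```python
# A sketch of Bateson's monkey: order is not a different kind of
# striking, but a circuit that culls equiprobable strikes.
import random

random.seed(0)
KEYS = "abcdefghijklmnopqrstuvwxyz "

def unrestrained_monkey(n):
    # No restraints: every key equiprobable, prose only by accident.
    return "".join(random.choice(KEYS) for _ in range(n))

def restrained_monkey(target):
    # A "circuit which could identify error and eliminate it":
    # strikes are still random, but incorrect letters do not survive.
    out = []
    for ch in target:
        strike = random.choice(KEYS)
        while strike != ch:        # cull every outcome but one
            strike = random.choice(KEYS)
        out.append(strike)
    return "".join(out)

print(restrained_monkey("meaningful prose"))
```

Note that the restrained monkey's individual strikes remain equiprobable; only the survival of outcomes is constrained, which is precisely Bateson's point.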
If, however, one thinks of randomness in terms of equiprobability, the properly interesting question is what the 'ontological status' of 'probability' is. Are there probabilities in nature, or is probability an epistemic concept that has to do with the motivations of an inquirer? I lean towards the latter answer, but there you go.
That's surprising. I'm sure you said once you believed that spontaneity was a proper part of nature. Or is equipotential in fact physically impossible for some reason?
(And "restraints"? LOL. What is this weird jargon you've picked up?)
Not sounding very thought out.
You didn't answer the question, just introduced the further thing of abstract possibility.
And in mentioning Bergson, are you really wanting to treat chance as a matter of panpsychic will rather than pansemiotic indifference or equipotential?
Again that is confusing as I didn't think you were in to woo.
What is it that you are really trying to say?
But this is a malformed question: the point is that randomness (qua equiprobability) is indexed to the motivations and expectations of an inquirer (not 'mind', btw, a word with much too much metaphysical baggage), and that it's a category error to speak of nature as being either random or not random. Nature is a-random, if you like. Hence the malformed question. Perhaps the reason you are so perplexed is that your whole framework of thought is tethered to the notion of possibility, but that's not my problem.
As an aside, the very idea of the 'will' is also one of the worst posed notions in all of philosophy, so despite your attempt - as per your usual manner - to pin things on me that I don't hold, well... nope, no will here.
A random event is an event that isn't determined by antecedents, because there's some degree of acausality involved in antecedents leading to the event in question. An upshot of this is incorrigible unpredictability, at least beyond probabilistic predictability.
A is spontaneous if it isn't caused by some B.
A is random if it was caused by some B but was not a predetermined effect (i.e. B could have caused some C).
A is unpredictable if we can't use past or present information to predict it.
Using these definitions, spontaneity and randomness is a matter of ontology and unpredictability is a matter of epistemology. The first two would imply the third but the third doesn't prima facie imply either of the first two (I say not prima facie because it could be that a thing is unpredictable iff it is either spontaneous or random).
If they both look the same, what justifies your claimed categorical difference? And if they don't in fact look the same, how does their observable difference manifest?
Look, that rock just fell off the face of that cliff. Was it random in the sense that I don't know the triggering cause, the straw that broke the camel's back? Or was it spontaneous - in perhaps a Bergsonian sense in which it made up its own sweet mind?
Or are spontaneity and randomness two ways of talking about the same thing - the equiprobability that is the fluctuations which contexts of "restraint" don't manage to suppress?
So there are a variety of things you might be trying to say. And you could clarify by starting with my question of what would I see as different if an event were spontaneous rather than random?
After that we could move on to this new weirdness of a-random. Perhaps you mean that to which the principle of non-contradiction fails to apply? Ie: vagueness.
I think you just don't know.
Do you think this is true, that A comes out of nothing? Can you point out any such B?
I don't believe in pure spontaneity, any uncaused event. It seems to me that what we describe as spontaneity is really a rearrangement of what already exists.
I didn't say that anything is spontaneous.
So unpredictability/predictability is normative? A point of view perhaps?
It's about the what can be known/derived.
That QM indeterminacy is incomprehensible to me is obvious, but I believe it goes beyond that: it is entirely incoherent.
Perhaps, but then the same can be asked when the causation isn't random. What causes A to (always) cause B rather than C?
Shouldn't your sentence read:
It's about the limits of what can be known/derived at this point in time.
Can't point of view make a difference? "From my point of view" it appears predictable, but not from "your point of view"...
Yep, as I don't care if you are mistaken. Feel free to be as wrong as you like.
The more I have been thinking about this, the more I have to agree with your statement here:
"Conversely, 'non-randomness', or order, would be the culling of probable outcomes so that some are more likely than others."
I think randomness is about control or lack of control, and unpredictability is consequential. And, from a mathematical point of view: probability is the proportion with which each possible outcome occurs over the repeated exercise of a random event. So while probability has a relation to randomness, I am not so sure it is itself randomness. So I guess overall I agree with your assessment.
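The "probability as proportion over repeated exercise" idea can be checked with a quick simulation (the 60,000 rolls are an arbitrary illustrative sample size):

```python
# Probability read off as the long-run proportion of each outcome
# of a repeated random event (a fair six-sided die).
import random

random.seed(42)
rolls = [random.randint(1, 6) for _ in range(60_000)]
freq = {face: rolls.count(face) / len(rolls) for face in range(1, 7)}

# Every face's proportion settles near 1/6: the proportion, not any
# single roll, is what the probability describes.
print(freq)
```
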
One thing to note, however, is that insofar as probability is premised on 'all things being equal' (ceteris paribus) across the repeated exercise of a certain event (as you said), probability is a concept far better suited to the laboratory than to nature: by design, it can only operate in the context of a stable, unchanging, and already individuated state of affairs, explicitly ruling out any emergence of the new. It can only ever yield generalities rather than singularities. It's a wonderful scientific tool, but a poor philosophical one.
I don't see this as responsive to the ontological problem. In a determined world, A always causes B because of something. In an undetermined world, A sometimes causes B and sometimes causes C because of nothing. The arising of B or C in the undetermined world is an incoherent spontaneous event, but such things don't happen in the determined world. Thus the two are ontologically different.
To the extent I do not know what that ''something" is that causes A to always cause B, I have only an epistemological problem. There is nothing incoherent about a world where it is admitted there are certain things we simply do not know.
Then why can't I say that in a random world A can cause either B or C because of something?
I believe that that could happen, sure. As to whether it does in any particular cases, I'm more or less agnostic on that issue.
And you'd also have to say that the cause of it being B and not C was nothing, which means that something was caused by nothing, meaning that there is something ontologically different between a determined world and an indetermined world, which again responds to your prior post asking what the difference was between our inability to explain how A caused B in a determined world and how B randomly came about in an indetermined world.
I'd also reiterate that something coming from nothing is the definition of spontaneous occurrence, something you said didn't occur in a random world.
My objection is that the QM description of truly random events is incoherent. It's not that my little mind can't fathom it. It's that it defies foundational principles of understanding. To understand the world, one must ask why things are as they are, which implicitly requires offering the underlying cause of the event. If there is no such cause, then there is per se no explanation for why something happened, meaning reason (and thus understanding) is being defied.
You don't seem to be consistent. You said that A always causes B because of some unknown "something" but when I use this unknown "something" as the reason for A sometimes causing B and sometimes causing C you then said that this "something" is actually nothing and that it's just a case of spontaneity.
If we take the following to be a representation of a predetermined and a random world respectively:
1. A → B
2. A → B or C
Then why does 2 require more explanation than 1? In both cases we either have to explain or accept as axiomatic the '→'.
So you might ask of 2 "what makes it sometimes B and sometimes C?" but then I'll just ask of 1 "what makes it always B?"
Quoting StreetlightX
So perhaps you can explain how spontaneity and novelty are the same or different in your book? How is one to understand you when you keep shifting your jargon?
I mean it is clear that novelty has an element of the surprising or unpredicted, and yet also "a good fit" when it is "a creative act". So novelty would be contrasted with chance or accident in terms of its relation to finality. Novelty is symmetry breaking that retroductively serves a purpose, while accident is symmetry breaking that serves no particular purpose - it is meaningless novelty.
So yes, as we consider causality in a full sense - the four Aristotelean causes - then the variety of terms we employ start to come into focus in terms of their ontic commitments. And if we continued to a Peircean semiotic analysis of nature, we could eventually cash out the crucial hinge that is the epistemic cut - the role that sign plays in crisply deciding those ontic boundaries.
We can roll a die and physically it must land on just one of its six faces when its spinning stops. But we still have to read off the resulting number correctly. The die doesn't "say" anything until its physical state has crossed over into the observer's epistemic universe in this fashion.
So novelty is connected with complexity as it demands the question of "who finds this predictively surprising yet retroductively fitting?". It implies an answer seeking mind at the centre of it all.
But signals must be extracted. And noise is that which is suppressed. Noise doesn't exist in nature as a purely physical fact. It is the name we give for everything about which we (now) no longer need to care - like the five other sides of the rolled die.
So what we keep finding is that it is all organised according to the logic of dichotomies - the separations achieved by symmetry breakings. If we speak of things like chance and necessity, random and determined, signal and noise, these are always the constraining limits of possibility rather than actual states of being. Existence is always happening between the extremes. So rightfully we can only speak of that which is more or less determined, more or less random, more or less spontaneous, etc. Even more or less ontic, or epistemic.
However with all this jargon-jumping by you, one will never know whether you have a well thought out position in this regard.
And random~determined - as the metaphysical dichotomy speaking most directly to action and causality - is of course pretty much right at the heart of metaphysical inquiry. It is not the place to be muddying the waters.
To be fair to QM, it is deterministic at the wavefunction level of description. Indeed, extremely so (as it extends this determinism all the way back to the beginning of the Universe, and all the way to its end, according to some interpretations).
So QM describes the world as rigidly bounded by a set of statistics-producing constraints. It just isn't the "regular" statistics of a classically-conceived system.
As I mentioned, the "sign" of pure quantum randomness or spontaneity in particle decay is that it exactly conforms to a Poisson distribution. The chance of a particle decaying is unchangingly constant in time.
And hence also the radical indeterminism, the depth of surprise, when a decay occurs "for no reason".
A constant propensity for a decay is a state of symmetry, or maximum indifference. One moment is as good as another for the decay to happen. There is no mounting tension as there is in a classical system - pressure building until the bubble must surely burst sooner than later. So a decay isn't caused even by a general thing, let alone a particular thing. It really does "just happen" ... in a way we end up describing in desperation as due to an internally frozen propensity.
So we know particle decay has this radical nature because a collection of identical particles will tend towards an exact Poisson distribution - whose waiting times are memoryless. There is no ageing: however long an individual particle has already survived, its expected remaining wait is unchanged. It could happen in a split second, or at the end of time, with the same constant propensity. As exceptionality or novelty, it is literally unbounded.
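The "one moment is as good as another" claim can be put numerically (the rate and the 2-unit cutoff are arbitrary illustrative values): exponential waiting times are memoryless.

```python
# Memorylessness of exponential waiting times: conditioning on having
# already waited a long time changes nothing about the remaining wait.
import random

random.seed(7)
lam = 1.0
waits = [random.expovariate(lam) for _ in range(200_000)]

mean_all = sum(waits) / len(waits)
# Condition on having already waited 2 units without a decay:
remaining = [t - 2.0 for t in waits if t > 2.0]
mean_remaining = sum(remaining) / len(remaining)

# Both means come out near 1/lam: the long wait bought nothing.
print(mean_all, mean_remaining)
```
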
On the other hand, we were just talking about the ideal case. And the real world is much messier. So observation or measurement, for instance, can disturb the statistics. Decay can be prevented - further constrained - by the quantum Zeno effect. The watched kettle cannot boil.
So the pure case that produces the Poisson distribution may be an ideal description that nature - its symmetry already broken - never achieves. Yet then also we have to say that nature comes unmeasurably close as far as we human observers are concerned.
Certainly, when we employ atomic decay as our most accurate clock to measure the world, we are relying on the ideal being achieved so as to in fact be able to tell the time. :)
Anyway, what QM really does is take the contrasting notions of determinism and chance to their physically measurable extremes. And it then quantifies the degree of entanglement or non-separability that irreducibly remains - the Planckian uncertainty.
Classical dynamics can't make sense of this because it just doesn't recognise the notion of "degrees of disentanglement". It takes the all or nothing approach that things are either completely free or completely controlled, completely one or completely divided.
This is useful as it has great simplicity. And a particular statistics results - that based on the assumption of completely independent variables.
But quantum physics recognises that issues of separation and connection are always irreducibly relative - each is the yardstick of the other, as described in the reciprocal logic of a dichotomy. And so quantum statistics has to allow for variables that can be entangled.
Mathematically it is not incoherent. Well, at least not until you want to recover the classical view and disentangle your variables by "collapsing the wavefunction". At which point, the famous issue of the observer arises. It becomes "a choice" about how the epistemic cut to separate the variables cleanly is to be introduced. The maths is incomplete so far and can't do it for you.
So quantum mechanics takes a step deeper into the essential mystery of nature. It differs from the classical view in putting us firmly inside our metaphysical dichotomies. Randomness and determinism are not absolute but relative states. The new question that comes into focus is: relative to what?
Relative to a human mind is a bad answer (for a realist). Relative to each other - as in a dichotomistic relation - is logically fine but also incomplete as it does not yet explain the real world which is full of different degrees of randomness and determination. (All actual systems are a mix of constraints and freedoms.)
So that is why eventually you need a triadic, hierarchical and semiotic metaphysical scheme. You need to add in the effects of spatiotemporal scale. A local~global separation produces a "fixed" asymmetry in the universal state of affairs. Action is now anchored according to a past which has happened and so determines the constraints, while the future is now the space of the remaining possible - the degrees of freedom still available to be spent or dissipated on chance and novelty.
And this is the way physical theory is indeed going with its thermal models of time -
http://discovermagazine.com/2015/june/18-tomorrow-never-was
"Novelty - which I mistakenly understood you to be asking after...."
So it is your illiteracy which is my problem here. Apparently seeing "spontaneity" written, you were replying while thinking about something else.
But of course I don't believe you did in fact misread me. You are now just weaseling with terms because there was the danger you might have to be seen agreeing with me.
That's where this started. I'm sure I was surprised by how strongly you spoke out about the irreducibility of spontaneity in a PF post last year. I remember because I agreed strongly too - yes, a novelty!
So at first you confirmed that memory, and then very quickly you decided to backtrack. Now you are intent on rewriting history when your own words still remain to show what was said.
I understand the two questions:
1. In an undetermined world, what makes it sometimes B and sometimes C?
2. In a predetermined world, what makes it always B?
Epistemological answers:
Answer to #1: I don't know (it is incoherent to ask how a spontaneous event occurred).
Answer to #2: I don't know (that goes beyond the limit of my knowledge)
Ontological answers:
Answer to #1: Nothing (spontaneous events have no causes).
Answer to #2: Something (all events have causes, even if I don't happen to know what it is).
It is for this reason that your retort (question #2) does not establish that both the determined world and undetermined world are on equal footing in terms of coherence. The fact that both test the limits of our knowledge is irrelevant (epistemological answers #1 and #2). What is relevant is that determinism asserts the ontological impossibility of spontaneity (ontological answer #2 in contrast to #1).
Your answer to #2 isn't an answer at all. Even if all events have causes, what makes it the case that A always causes B? At best you're saying that there's some intermediate C between A and B that is the immediate effect of A and the immediate cause of B. But then what makes it the case that A always causes C and that C always causes B?
I don't see why A → B or C entails spontaneity but A → B doesn't. In the predetermined world, B happens because of A. In the random world, either B happens because of A or C happens because of A. There's no spontaneity. Whatever happens is caused by something prior.
"One of the advantages in thinking of randomness in terms of equipotential is that is allows us to bypass many of the tricky debates about causality in a rather clear and unambiguous manner."
In that sense it actually becomes an attempt to balance causality (or confounding variables) across equally possible outcomes. As in taking a random sample: in order to make the groups as similar as can be, randomization gives a high probability of a fair distribution of the confounding variables.
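The balancing effect of randomization can be sketched (the subjects, the age range, and the group sizes are all invented for illustration):

```python
# Random assignment tends to balance a confounding variable (here,
# age) across treatment groups without anyone measuring or matching it.
import random

random.seed(3)
subjects = [{"id": i, "age": random.randint(18, 80)} for i in range(2000)]

random.shuffle(subjects)                     # random assignment
group_a, group_b = subjects[:1000], subjects[1000:]

def mean_age(group):
    return sum(s["age"] for s in group) / len(group)

# With random assignment the confounder's distribution comes out
# nearly identical across the two groups, with high probability.
print(mean_age(group_a), mean_age(group_b))
```

The point of the sketch is that no deliberate matching on age happens anywhere; the equiprobable shuffle alone does the balancing.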
" is concept far better suited to the laboratory than to nature: by design, it can only operate in the context of a stable, "
Right, I doubt there are situations, which are not orchestrated by humans, that have equal possible outcomes for all the variables being considered. However, I am reluctant to separate it along the lines of natural and unnatural, as I consider humans part of nature.
"It's a wonderful scientific tool, but a poor philosophical one."
That part I have to disagree on, as in a sense I feel science is an application of philosophy.
I wouldn't use the term unnatural though, or rather, I wouldn't set them in opposition to each other; I would say instead that the laboratory setting - in which the category of 'possibility' is an apposite concept - is a kind of 'subset' of nature, embedded in, but not coincident with, the wider world (not A vs. B, but A ⊂ B). It's to the degree that philosophy deals with precisely this 'wider' subject matter that I call it a poor - or maybe rather limited - philosophical tool.
" It's to the degree that philosophy deals with precisely this 'wider' subject matter that I call it a poor - or maybe rather limited - philosophical tool."
Sorry, I just can't get fully on board with that one. Consider this famous quote:
“Give me but a firm spot on which to stand, and I shall move the earth.”
Archimedes, The Works of Archimedes
Science provides the firm spot on which philosophy can stand. I don't think we are too far off in our ways of thinking, I just don't argue with the notion it is a "poor" philosophical tool.
If you run the physical causal law in reverse, does B or C → A make any sense?
For any specific set of causes, if there can be an infinite set of events that follow (as there's no reason to limit things to just 2 possible choices), how do you conclude causation and not spontaneity? My point being that an inherent condition of causation is determinism, and any indeterminate system is necessarily non-causative and therefore spontaneous.
If you say that A must yield B or C but the option yielded is yielded without a specific cause, then I'd say you're referencing spontaneity on that level. That is, why did it "choose" B and not C? Was it because of D, E, or F, or for no reason at all? On the other hand, if you say that A must yield B in every case, I don't see an element of spontaneity.
What causes A to always yield B in a predeterministic world? Is it because of D, E, or F, or for no reason at all?
The issue I see is that when you get to the level of explaining causation itself it doesn't really make sense to then explain it in terms of causation. Trying to explain causation (whether random or predetermined) with reference to some even more fundamental causation is mistaken. It is just the case that in a predetermined world A always causes B and in a random world A can cause either B or C. You need something other than causation to explain causation (else circularity ensues).
That's not quite true. Under super-determinism (which just means determinism+entanglement), you will observe reality exactly as quantum mechanics predicts i.e. the result of measurement is one member of the spectrum of the observable, but the value you obtain is completely determined.
Another view is that measurement results in a superposition of the entire spectrum of the observable and through decoherence, you find yourself entangled with one of them, or rather you become entangled with each of them, but are unaware of your counterparts.
There are other interpretations of QM, but these are not compatible with the Free Will Theorem, which is precisely the result that A leads to B or C and no information exists in the universe to predict which. The argument is that particles are genuinely free.
I reject super-determinism, but that still leaves us with a deterministic (with unpredictable outcomes) theory and a "spontaneity" theory, which are empirically indistinguishable for the foreseeable future.*
One interesting point I find, is that BOTH the deterministic and the spontaneous theory deny causation.
*Because deterministic theories are time-reversible, certain exotic experiments have been proposed that could distinguish a time-reversible theory from a spontaneity theory.