Picking beliefs
I have a question: is it right to pick beliefs regardless of accuracy if they make you a better, more functional person? For example, believing in free will rather than determinism, because the latter belief makes it difficult to act as a self-motivated individual. Or, if you are religious, believing in God as opposed to nihilism, because creating one's own meaning is difficult considering the random nature of reality. Should one be a utilitarian with one's beliefs? I'm an atheist, but some things I cannot allow myself to believe, determinism being one of those things, even if all of accepted human knowledge comes to conclude that it is in fact correct. Am I wrong? Why?
There is also the issue of holding on to beliefs that part of you thinks are actually wrong. For instance, if you were in a powerful political party and enjoyed a position of privilege but, at the same time, realised that you don't accept some fundamental policy position they have. Or ditto with being part of a religion - you might be part of one by inheritance or commitment while still having nagging doubts as to whether you really believe it. And these can be very difficult situations to face.
I think the emphasis ought to be on believing what you really hold to be true. But as you're hinting at, for atheists this can also lead to some tensions - as you've said, regarding determinism and freedom of will. I think one thing that has to be faced up to is the understanding that there may be questions of this kind, which can't really be adjudicated by reference to some objective authority. In other words, there are strong arguments for and against. Perhaps one approach is learning to live with a sense of the 'suspension of judgement' in respect of such questions - to resist the urge to believe. I think, perhaps, that is nearer the original meaning of 'skepticism' than what that term is nowadays often taken to mean.
I think you have to have some sort of mental framework with which you view the world, otherwise every action is uncertain in its effects and progressing towards a goal is impossible. You have to believe the ground is solid and not going to cave in at any moment in order to walk steadily on it. You have to believe that lifting weights causes you to be stronger after recovery in order to have the motivation to continue lifting weights. You have to believe bank tellers are not aliens sent from outer space to steal your penis in order to have a normal conversation with them. You have to believe you will be alive tomorrow, in order to plan for your future. So goal directed behavior is impossible without belief of some sort. Or maybe maintaining sanity is what belief is necessary for. I don't know, I'm going to bed, I'll figure it out tomorrow. So now the question is, is belief necessary for action?
As mentioned above, there’s utilitarian ethics; then there’s virtue ethics, various schools of existentialism, various schools of philosophical ethics. So one way to go about exploring the question would be to explore those subject areas, although modern academic philosophy tends to be a bit removed from that kind of real-life emphasis. [Actually, one popular current philosopher whom that brings to mind is Jules Evans, whose Philosophy for Life is a pretty well-regarded ‘practical philosophy’ text.]
But overall, in answer to your question - yes, I do think you have to believe something, but what that might be, is a complicated matter in today’s world.
If it's of help: in my younger days I held the belief that we are causally predetermined to innately live as though the illusion of free will were not illusory. In brief, this is due to our causally determined ignorance of every single cause, which in turn causes us to need to act as though we make our own choices, and to hold responsibility for them. For instance, one could become unmotivated due to the thought of “it's all predetermined”, or motivated due to whatever reason; regardless, it's all predetermined and perfectly fixed metaphysically. But because we cannot hold a cognizance that knows everything, we have no way of knowing what our predetermined future will be. Because our future is always determined by the actions we take and the thoughts that motivate these actions, our causally predetermined conditions predetermined us to always act and choose as though we do have free will. Hope that makes sense … even though nowadays I’m a metaphysical compatibilist and no longer subscribe to causal determinism. (I now disagree with this argument, but maybe you can find a way to make it work.)
There’s a lot assumed about atheism, including that it necessitates causal determinism (often coupled with the belief that good old science could no longer be valid were causal determinism not to hold). I challenge you to come up with a conclusive argument for why this is true, and I bet you won’t be able to come up with one. Atheism can exist just fine devoid of a substratum of causal determinism. If a concrete example is needed, some types of Buddhists are atheists … despite beliefs in an afterlife and in karma. No god/s, and the afterlife for them is not god-given but just par for the course (and never for the better, since their point is to no longer be reborn and die ad infinitum)—this just as much as the most profane aspects of life are par for the course. OK, many other types of Buddhism, such as the Tibetan brand, are theistic. Still, I would think other established examples can also be found.
My main problem with believing X and not-X to be true at the same time is that it’s, technically, what George Orwell termed “doublethink”. And, personally, I find it to be a disorder of mind which generally makes our social existence worse. Like obesity: it’s perfectly OK for some, but not societally as the norm. Holding some degree of open mind about whether X or not-X is true is something altogether different, though.
My take is: steadfastly justify that your believed truths are in fact true up until you find out you’re wrong—and this time might never come, especially where your beliefs are in fact true. Or else maintain an “I’m not fully sure” attitude till further information develops for you. But, at the end of the day, don’t ever start worshiping your beliefs as though they were absolute truths. I believe this latter part is what most, if not all, of the non-Cartesian skeptics ultimately want to get at.
This contradiction you rightly find between causal determinism and responsibility should, till resolved to your satisfaction, fall—to my mind—into the “I’m not yet fully sure” group (now that the contradictory reasoning is present within consciousness). Else it would smell too much of doublethink with belief worship to me.
It was especially with reference to moral behavior and reasoning that he seemed to admit such beliefs. So that might be one possible path for you, since you are thinking that we must have free will in order to act in a moral capacity.
I think so, for more or less the same reason as that given by Wayfarer in the first paragraph of his first reply. The notion of picking beliefs strikes me as oxymoronic and disingenuous, like picking where you were born, your age, or who you'll bump into today.
Yes. You have some control over what you read, look at, think about, and so on, but you have no control over what you are or are not convinced by. I can look up arguments in favour of the flat Earth theory, but I can't force myself to believe such rubbish. And if I could see that you were standing right in front of me, then I'd most likely be convinced of that fact, and could not choose to believe otherwise. I could pretend to believe otherwise, but that's not the same thing. You might try to convince me that you're a hologram or a hallucination or that an evil demon is playing a trick on me, but I very much doubt that I'd be convinced of that.
Quoting AlmostOutlier
No, that doesn't follow. You believe what happens to convince you, and what happens to convince you isn't necessarily what convinces others in society.
A quest to find Truth determines what I believe, regardless if it makes me a better person or not. Many a time Truth causes one to be a better, more functional person.
I think this is exactly what we do all the time, except without the "equally" bit. It's highly likely the ground you walk on will support your weight, so you walk on it as if it will, as if you know it will. But you know no such thing, and the occasional sinkhole makes this point forcefully.
The analogy I always reach for is wagering: you may figure the Cubs have a 62% chance of winning tonight, but you have to bet as if they will or won't. Confidence may fall along a gradient, but action is binary: you can kinda think the Cubs will win tonight, but you can't kinda bet.
Quoting Sapientia
I'm very sympathetic to this view. It does leave me with a bit of a puzzle though. If we do not choose our beliefs, should we say they are caused? In some scenarios that seems okay; perception seems like this.
But what about reasoning? Suppose A being the case makes it that B is. Then we would customarily say that believing A is a reason for believing B, that if you think A likely, for instance, then you should think B likely.
What do we mean by "should" there? That suggests you could choose to reason this way or not. Does that leave us saying you cannot choose simpliciter to believe B, but you can choose to conclude B by reasoning? Would you then be choosing to cause yourself to believe that B?
No, I don't think that that suggests choice at all. It's not like presenting an option, and endorsing one of the options. It's setting out what's reasonable and what's not, and saying, "If you were reasonable, then this is what you'd believe". The "should" is like an expectation, akin to saying something like, "If there's gravity, then a thrown rock should fall to the ground".
I'm not sure your example is the same though. Yours I hear as a subjunctive, acknowledging gravity as an hypothesis.
I took my "should" as normative: if you reason the way we do around here, this is what you'll conclude. Not an expectation in the predictive sense, but in the "it's mandatory" sense. (Every one of these words tries to be ambiguous in the same way...)
That interpretation is fine with me. But under that interpretation, the presumed choice still seems to appear out of thin air without explanation. How do you get from, "If you reason the way that we do around here, then this is what you'll conclude", to, "Therefore, choice!"? If my reasoning doesn't match theirs, then that's that. What's the choice supposed to be? The only choice that I see would be the choice to conform disingenuously or stand by my reasoning. That's no choice at all for someone like me. I'd have to just think it over some more and see where that leads me.
Just going off the idea that "should" here is normative, that you should conclude such-and-such in the same sense that you should respect your elders or that you should eat your vegetables. Norms are usually something you get to choose to conform to or not, and there are consequences either way.
Quoting Sapientia
This is an excellent point. Most of the time, we no more choose to reason as we do than we choose to speak our native language as we do. We've learned, we've been trained, we've developed habits, and so on.
But I've been in the position, on this very forum, of arguing that if you say A then you should say B, and it sure feels like I'm saying that this is the right way to reason, that I'm invoking a real norm.
The conundrum stands...
How does this make sense? You can't choose to believe in free will if you don't have it.
Suppose I now said that in the next paragraph I'm going to present an argument against your position and in favor of mine. You could choose not to read it, and avoid exposing your inner conviction engine to something that might, without your control, move you over to my side. There are people who do exactly this.
"Act as if ye had faith, and faith will be given to you" is the old Catholic saying. Even if you cannot choose to hold a belief, if you can manipulate the circumstances of belief formation, then there's something choice-like going on there. People do talk themselves into beliefs, and arguably reasoning is exactly such a process.
We can of course just deny that there's choice in any of these behaviors, but that's a whole 'nother deal.
There are three ways we can believe something to be true: fact, reason, or faith.
If one picks to believe something in spite of the fact that it is not factually true - I do not see how anything objectively good can come from that.
If one picks to believe something in spite of overwhelming argument that it is not reasonably true - I do not see how anything objectively good can come from that.
However, if not contrary to objective fact, or overwhelming reason, I think one is free to pick (within some bounds) anything to believe if one finds utility in it.
What trickery is this?!
Okay, but there's a natural follow-up to what I wrote, which is to tell a story something like this: there is progress in rationality by not filtering what your conviction engine is exposed to, and further progress in deliberately seeking out disconfirmation of your working theory. I like that story. And it gives you a ready way of ranking norms: if you must ignore evidence you don't like, I count you as less rational.
Your point that there is action taken that only indirectly affects conviction stands, because we can try to follow a course of action without a chosen outcome.
That's all quite idealized, of course, as recent issues in social science research show. So then we're back to questions about how the course of action is chosen.
And if you had that illusion, you just would and there'd be nothing you could do about it.
My own view is that the concept of free will is incoherent and I also believe that the world is incoherent without free will. Neither a world with free will nor a world without free will makes any sense.
Perhaps beliefs as a class have kinds, and some of these kinds are objective or empirical and others are subjective, and of the subjective, some rational and some emotional. Then what is (assuming normative viewpoints) empirically believable, such as determinism, might not be subjectively believable. Something along the lines of: what I believe about the empirical properties of H2O has little to do with the feeling of being wet.
I think free will has value in society because it seems to explain how we experience and explain our behavior, and it seems to form the basis for our accepting responsibility for our actions. So I don't know if the universe is determined, but since all deductions based on empirical evidence are probabilistic, they are not certain.
I suspect we come to our beliefs not really as a matter of choice (generally) but by what we have learned, how we have utilized our reasoning, and/or how we feel about what our beliefs concern - making it more of a historical or habitual process.
What comes to my mind is how accuracy is entangled with beliefs that make us better and more functional. Why do we care about accuracy in the first place? Would we care about being scientific and rational if we didn't associate these things with physical and moral positive results?
Others have (accurately I think) mentioned that we don't get to pick many of our beliefs. In this context I think it's pretty clear that you intend those cases where we have the sense of choice. To me this sense of choice is 'free will' enough, though I believe there are other arguments against determinism. Assuming that determinism is a sufficiently meaningful/specified position to be true or false, I've never been bothered by the idea that all is predetermined. In fact, I think it has its attractions --as long as we are indeed mortal as I think we are.
The moral element would be a shallow reason to stop caring. I'm the kind of person who is interested in the results irrespective of whether they're seen as good or bad - whether it's a cure for cancer or a meteorite that's going to wipe us out at some point in the future.
Quoting fart
Can you give an example of one or more of those cases where you think that "we" have the sense of choice? I for one do not have any genuine sense of choice with respect to what I believe, and I'm certainly not the only one like this.
It's one thing to consider how you feel about determinism, but it's another to think about whether it's plausible. The former shouldn't factor into the latter, as that would be fallacious (under the category of an appeal to emotion). And it's the latter, not the former, which matters, philosophically.
I very much relate to this. For me intellectual heroes often have an 'amoral' edge. They are willing to think beyond the morality of their community ('evil' thinking) and/or immerse themselves in studies with no immediate application (aesthetic or curiosity-driven thinking).
Quoting Sapientia
All I intend is the state of mind in which we are really not sure what is going on. Imagine a person who hosts a party and then can't find something valuable afterward. Did someone steal it? Or is this just coincidence? Now they have the choice of whether to ask embarrassing questions. I can imagine a person spending a few hours of angst on this decision. More examples: a person trying to figure out whether they should or should not go to grad school, or ask a female friend about becoming romantic, or ask a boss for a raise. Illusion or not, the burden of decision seems pretty real to me.
I do see, of course, that many of our beliefs are beyond our control.
A state of mind in which one is really not sure what is going on is different from there being a sense of choice with regard to what to believe, and a sense of choice with regard to what to believe isn't necessarily implied.
Yes, they would have the choice of whether or not to ask embarrassing questions, but that's a choice regarding what to do, not a choice regarding what to believe. If I was convinced that someone at that party stole my favourite tiara, then I might confront them about it, although I would probably just slip some poison into their drink. But the point is that my choice of action, if there really is one, would be based on a belief not of my choosing, but determined by what convinced me.
And likewise with your other examples. The burden of decision seems real to me, too. But the notion of deciding [i]what to believe[/i] strikes me as strongly counterintuitive. So much so, that I don't just think that many of our beliefs are beyond our control, I doubt whether it's even possible for any belief to be otherwise, at least directly.
What happens is that as we encounter things, we guess at what the world outside experience has to be like for experience to have been that way, and although we're usually right, if we're careful, it's always possible we might be wrong. So there is a gradual transformation of the unknown (about which one can guess) into the known (about which one knows or doesn't know) over time, gradual revelation of what is, but it's hard won.