It's a deeper question than it appears to be, and I don't think there is one single factor.
A highly inconvenient truth likely lurks around the corner which, if we were honest with ourselves, we would have to acknowledge. (1) But we don't acknowledge it because our reputation would suffer. (2) We regularly install false premises to protect ourselves from 'full disclosure' -- perish the thought! (3) We either can't escape our own self-delusions, or it is extraordinarily difficult for us to do so. We can't be our own exterior observers. (4)
We may also be engaged in deceiving other people. Effective deception requires the appearance of conviction, and in projecting conviction we may, as the saying goes, come to believe our own bullshit. (5) Successful con artists know they are deceiving others and manage their act. Most of us aren't that good at it. We believe it ourselves.
Other people do not always wish us well and say unkind things about us--some of which may be true, or may be false. True or false, we defend ourselves by denying what they say. (Believing all the negative things one hears about one's self might be quite self-destructive.) Rejecting negative feedback becomes a protective habit. (6)
In a word, 'paranoia'. Literally, a mind beside itself. In order to 'succeed', a lie requires a liar who knows the truth, and a patsy who is deceived; so a divided mind is prerequisite.
Reply to unenlightened That we can "observe ourselves exteriorly" isn't a lie, but it is a self-deception. We may be quite happy with the way we see ourselves interiorly (or not) but we can not get into the minds of others for confirmation.
Robert Burns:"O wad some Power the giftie gie us
To see oursels as ithers see us!"
It hasn't happened yet. We are doomed to deceive ourselves, at least to some extent.
We could interpret that in such a way that the lie brings us good fortune, wealth, and esteem. That would mean that the lie was indeed successful.
But was it a lie?
Let's say you say, "I am the greatest of all athletes." But your doubt says to you, "Oh, come now, you are a little bit lazy now and again; you aren't the greatest! The man down the street is greater than you!"
But, because we have told ourselves this "lie" over a period of days and weeks and months, one day we wake up and we ARE the greatest of athletes.
So now we have to say that the lie has vanished like vapor. The words now stand true.
Reply to Bayaz Well, by "successful" I only mean that we lie to ourselves, and we also believe it even though it is false. So the goal of lying is successful -- what comes from that is set aside.
And also -- to lie usually requires some kind of intent to deceive. So I mean this, rather than just mere confusion or delusion or something like that.
In a word, 'paranoia'. Literally, a mind beside itself. In order to 'succeed', a lie requires a liar who knows the truth, and a patsy who is deceived; so a divided mind is prerequisite.
Definitely. I'm curious about this, first at a conceptual level and also as a phenomenon. I think that if we could demonstrate somehow that we were successful at doing this, it would show something about the mind that's important.
Is it enough to say that having two mutually exclusive beliefs at once is enough to count as a divided mind?
Is it enough to say that having two mutually exclusive beliefs at once is enough to count as a divided mind?
Well yes and no. :wink:
There's always the question, 'who is saying it?' So I am saying that the mind is more or less always divided (except for Jesus and Buddha), so it is the divided mind that is saying the mind is divided, but saying it as if it were undivided.
That we can "observe ourselves exteriorly" isn't a lie, but it is a self-deception.
So this, by its own thesis is a self-deception too, because what is it but an exterior view? As soon as one talks about interior and exterior, or deceiver and deceived, or as soon as one talks about the divided mind in any other way, one has recourse to the speaker, the observer, the analyst, call it what you will - there is always a three way split.
That's one aspect of my yes and no, but the other aspect is that the division(s) themselves are fabrications.
So even before there are mutually exclusive beliefs, to say 'I have a belief' is already to have divided the mind into the believer and the belief. So having contradictory beliefs is more like having non-matching socks than something that creates a division.
VagabondSpectre, July 15, 2018 at 20:09
I think we deceive ourselves all the time.
Lately I've been conceiving "consciousness" as a kind of conglomerate or patch-work that is made up of many subsystems (mainly abstract predictive models learned and encoded in neural networks). By combining the right predictive models in response to stimulus (another layer of predictive modeling which learning neuronal networks can encode and optimize), accurate belief and "awareness" emerges as a result (not self-awareness, though that may be yet another layer of learned predictive modeling, but awareness in and of itself).
The end product that the conscious mind labels belief (the understanding it is aware of) is initially the result of bottom-up selection in and between neurons and neural networks which optimize them into predictive units (between which further bottom-up selection and optimization into compound predictive units occurs). The overall conscious experience is then a cascade of predictions, where we are more "aware" of those predictions/beliefs which are formed from selection between greater numbers and levels of predictive sub-networks, and we are less "aware" of those optimizations and selections which occur in and between lower-level and fewer quantities of abstract predictive models.
Relating this to self deception:
Natural selection between predictive models involves trial and error; if an apt and accurate predictive model is neglected in favor of a less apt and accurate one, then self-deception has conceivably occurred.
Self-deception doesn't exactly carry the same connotations as "lying to one's self" though. To lie to one's self implies intentional self-deception. But what is intent?
Intent, I reckon, is one of those executive components of mind and brain of which we are by definition more "aware". It is when we have a full formed notion of something we desire and we employ the sub-networks of predictive models (of which we're less aware) to actually arrive at the thing we desire.
If, for instance, we desire to be somehow virtuous (intelligent, moral, successful, likeable, etc.) then we may ask ourselves whether or not it is already the case that we have such virtue. If the desire is strong enough (and the feeling that failure entails too harsh) then perhaps we bias ourselves in the course of consciously discriminating between groups of predictive models/understandings and arbitrarily ignore models which do not reinforce our higher-level preconceptions. In other words, when we assume that something is true we may fundamentally alter our predictive models to conform to that assumption. We may invent excuses that amount to predictive models which do not conform to reality, or we may ignore and negate predictive models which DO conform to reality.
In summary, we lie to ourselves when our high level consciousness (the thing with the most "awareness"), which ideally is the more reliable product of complex selection (more accurate), operates fast and loosely on the sub-components of mind of which we're less aware. It is an error that is reinforced via top-down selection in the hodgepodge that is the human mind and brain.
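The biased selection between predictive models described above can be sketched in a few lines. This is a toy illustration of my own, not anything from the post: `prediction_error`, `select_model`, and the `bias` hook are invented names, and the "desire" is simply a numeric discount applied to the error of the flattering model.

```python
def prediction_error(model, inputs, targets):
    """Mean squared error of a toy linear model y = model * x."""
    return sum((model * x - t) ** 2 for x, t in zip(inputs, targets)) / len(inputs)

def select_model(models, inputs, targets, bias=None):
    """Pick the model with the lowest (possibly bias-adjusted) error.

    `bias` maps a model to an adjustment on its error -- a stand-in for the
    desire-driven distortion of evidence described in the post above.
    """
    def score(m):
        err = prediction_error(m, inputs, targets)
        return err + (bias(m) if bias else 0.0)
    return min(models, key=score)

# Observations actually generated by the "true" model w = 2.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]

# Unbiased selection picks the accurate model.
honest = select_model([2.0, 5.0], xs, ys)

# A large enough desire-discount makes the inaccurate model win anyway.
flattered = select_model([2.0, 5.0], xs, ys,
                         bias=lambda m: -100.0 if m == 5.0 else 0.0)

print(honest)     # 2.0 -- evidence wins
print(flattered)  # 5.0 -- desire outweighs the evidence
```

The point of the sketch is only that "arbitrarily ignoring models which do not reinforce our preconceptions" is equivalent, formally, to adding a bias term to the comparison by which models are selected.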
Marcus de Brun, July 15, 2018 at 20:22
I think of what Judge Judy says about teenagers...
"Don't try to tell me anything about teenagers, I have two of them. Let me tell you something about teenagers, when they open their mouths in the morning... the lies form!"
I'm listening, Bitter. I suspect that our criteria for what counts as a lie differ. Does sincerity matter to you, or does truth? That makes all the difference in the world.
What is it about Jesus and the Buddha that makes them have undivided minds? Do they simply believe, rather than say they have a belief? How does that work, in the sage-like mind? (ideally speaking -- the facts are gone to history)
Reply to Moliere
Most of the time we are darkly ignorant of our real intentions. All we mostly want is pleasure.
None of us really wants the other guy to win. Not if he isn't on our side!
But we will string together a narrative to convince ourselves that we are the good guys and those are the bad guys.
Isn't that a lie?
Reply to creativesoul Do you think that we can deceive ourselves, as opposed to lying?
Let's say that we are not one. If we are divided then it would seem that we could lie to our self -- from one self to another self. Not in some pathological or diagnostic sense, but rather this is something that the mind can and does often do -- it is "normal". Would it be possible, at that point, to lie to yourself?
I am interested in the possibility that this is impossible -- that "lying to yourself" is a turn of phrase. But I'm interested in what would be required, at a conceptual level, for it to mean more than just a turn of phrase -- whether or not we do so in fact. Mostly because it would provide a means for determining whether or not we can or do lie to ourselves.
If, for instance, we desire to be somehow virtuous (intelligent, moral, successful, likeable, etc.) then we may ask ourselves whether or not it is already the case that we have such virtue. If the desire is strong enough (and the feeling that failure entails too harsh) then perhaps we bias ourselves in the course of consciously discriminating between groups of predictive models/understandings and arbitrarily ignore models which do not reinforce our higher-level preconceptions. In other words, when we assume that something is true we may fundamentally alter our predictive models to conform to that assumption. We may invent excuses that amount to predictive models which do not conform to reality, or we may ignore and negate predictive models which DO conform to reality.
I think desire plays a role, for sure. But it has to be a certain kind of desire. To use the virtue example above, if we really wanted to be virtuous then that desire would be more powerful than momentary shame at seeing who we are right now -- and we could begin working on ourselves, performing a kind of technological operation on our soul to begin changing to a certain degree.
But what is the structure of desire that makes one lie to oneself, as opposed to really desiring to be such and such?
Do you feel like an amalgamation of computations? I don't really. If it is true it's all "under the hood", so to speak.
A lie would be really hard to model just using computational models, I think -- even moreso to lie to oneself. Or maybe not, maybe it's much the same thing -- just a mind divided.
But how would you computationally model a lie to another neural network?
Most of the time we are darkly ignorant of our real intentions. All we mostly want is pleasure.
None of us really wants the other guy to win. Not if he isn't on our side!
But we will string together a narrative to convince ourselves that we are the good guys and those are the bad guys.
Isn't that a lie?
I guess that depends on whether or not we really are the good guys or the bad guys. :D Though that sort of evaluation isn't exactly amenable to basic fact checking, since goodness and badness are not facts but judgments of value that we make or hold. So naturally we'd think we are the good guys, since these are relative to what we already hold to be good. But it is a bit circular.
You bring up real intentions, though. So there are real intentions and there are unreal ones (false ones?). And there is a kind of veil between what we believe our intentions to be and what the real ones are.
Though if that's the case then it seems we can still know that our intentions aren't what we'd like to believe they are. We know we mostly want pleasure. But somehow we believe we are good (or I am good?) -- but the knowledge falls by the wayside, like an abstract proposition.
What is this division between belief and actual intent? How do we know we mostly want pleasure, yet still believe we are good, and intend to do good? Or is this a sort of unveiling of a way of lying?
We may also be engaged in deceiving other people. Effective deception requires the appearance of conviction, and in projecting conviction we may, as the saying goes, come to believe our own bullshit. (5) Successful con artists know they are deceiving others and manage their act. Most of us aren't that good at it. We believe it ourselves.
We want the lie to be so successful that we begin to believe it ourselves? :D Sounds like a good premise for a play.
I'm noticing that your examples seem to be of delusions of one sort or another. There is something inconvenient so we ignore it and come up with alternate beliefs to shield our awareness -- give it something else to fixate on -- and in a way are thus deluded. But is that lying, exactly?
Other people do not always wish us well and say unkind things about us--some of which may be true, or may be false. True or false, we defend ourselves by denying what they say. (Believing all the negative things one hears about one's self might be quite self-destructive.) Rejecting negative feedback becomes a protective habit. (6)
I'd say this is just a way of coming to a false belief about ourselves through habit. What's going on is that we hear something negative from a source we don't trust, so we just sort of tune it out, on the grounds that we've had negative things said about us many times before, and they weren't so much true as expressions of how the other person felt.
Do you think that we can deceive ourselves, as opposed to lying?
Let's say that we are not one. If we are divided then it would seem that we could lie to our self -- from one self to another self. Not in some pathological or diagnostic sense, but rather this is something that the mind can and does often do -- it is "normal". Would it be possible, at that point, to lie to yourself?
I am interested in the possibility that this is impossible -- that "lying to yourself" is a turn of phrase. But I'm interested in what would be required, at a conceptual level, for it to mean more than just a turn of phrase -- whether or not we do so in fact. Mostly because it would provide a means for determining whether or not we can or do lie to ourselves.
Since lying is deliberately misrepresenting one's own thought and belief, and it is always done in situations when the speaker believes that they ought not allow others to know what they think and believe, it seems to me that one cannot lie to oneself.
Hold false belief? Sure. But it is humanly impossible to knowingly do that. Seems to me that lying to oneself only makes sense in light of an ill-conceived notion of lying. That is, when one holds that lies are always false.
Reply to Moliere "I think desire plays a role, for sure. But it has to be a certain kind of desire. "
We often choose to believe things despite an absence of rational support. Is that only a lie if for virtuous purposes? Is it never a lie?
What is a lie? I tend to consider it the deliberate telling of a known falsehood. Did Trump lie when he proclaimed the inauguration crowd biggest of all time, or did he actually believe it? Did Obama lie when he said you could keep your doctor under the ACA?
Since lying is deliberately misrepresenting one's own thought and belief, and it is always done in situations when the speaker believes that they ought not allow others to know what they think and believe, it seems to me that one cannot lie to oneself
What if we are of two thoughts?
I believe something good about myself. I know that it is false. These are in conflict with one another. So let's say we become aware of different beliefs at different times. I tell myself the good thing and I want to believe it, so I do. There's the part of me who lies, and the part of me who listens. And I stop being aware of the part of me who lies right after telling myself the lie. I know that I have to deceive to achieve the desired belief.
If we are of one mind then I don't think we could lie to ourselves. I agree with that -- that's why I thought @unenlightened made a good point in saying we'd have to have a divided mind in order for us to lie successfully, and not just be delusional or some such.
That is, when one holds that lies are always false.
At least in a general sense I'd say that's what lying is -- to tell someone a falsehood while knowing it is true in order to deceive them. So I'd say that in the case of telling someone about my own thoughts then I'd be lying if I told them something I do not really think -- that this is a particular case of lying, but that lying doesn't have to be about my own thoughts. It could also be about whether I have the money for the bill.
We often choose to believe things despite an absence of rational support. Is that only a lie if for virtuous purposes? Is it never a lie?
What is a lie? I tend to consider it the deliberate telling of a known falsehood.
I think we're in agreement here. We tell someone a falsehood we know to be false. Maybe there's a motivational component to this, but that seems to be the bare minimum of what a lie is.
I don't think I'd say that believing such and such without rational justification counts as a lie. It may be irrational, but without justification we do not know, and if we do not know then we couldn't be telling ourselves a known falsehood.
Part of the difficulty in determining a lie is in being able to tell if someone really knew something or if they were just mistaken, delusional, or something along those lines. Usually we mean that the person lying both knows the truth and tells the opposite. With two people this is easy enough to understand -- one person knows, the other does not, and the person who knows believes that the falsehood is better to say than the truth (for whatever reason -- could be white lies, or nefarious. Could be to preserve feelings, or manipulative to get what one wants)
But with one person it seems strange to say. But it is a common turn of phrase to claim someone is lying to themselves. Hence the line of questioning -- perhaps it is just a turn of phrase, but what would it take for someone to lie to themself, to where it was more than just a turn of phrase?
VagabondSpectre, July 16, 2018 at 16:34
Do you feel like an amalgamation of computations? I don't really. If it is true it's all "under the hood", so to speak.
Sure we don't feel like an amalgam of streaming information exchanges among and between learning neural networks, but there's too much evidence to ignore that it is so.
But how would you computationally model a lie to another neural network?
General self-deception I would describe as existing in the fact that an erroneous or inapplicable sub-model is used in the formulation of a given belief. There's no difference between this description and simply being incorrect or mistaken about something, and the feedback we get from such mistakes is how we develop and optimize existing and new models; it's how we learn.
"Lying to one's self", if such a thing exists, must be more than just self-deception. As a guess at what it could be (or something like it) from a learning network perspective, I would say that it occurs when a consciously held belief (the higher level result of complex network interactions) happens to be erroneous, and causes related lower level models/networks to move toward an erroneous or de-optimized state.
When another person lies to us and we believe them, we may alter our fundamental understanding and cognitive models of the thing we're being deceived about. When we ourselves formulate erroneous beliefs and cling to them with conviction, our fundamental understandings which underpin them must be bent or negated to fit properly, to avoid dissonance.
The trouble with modeling such phenomena computationally is that we're unable to follow the rhymes and reasons of learning neural networks as they learn; we can create a learning machine that can become excellent at a specific task through experience, but the way it discovers and encodes patterns creates messy extended algorithms that are utterly illegible (too long, too raw, too abstract; imagine a human mind expressed in algebra).
Being potentially recursive in various ways, these extended algorithms can alter themselves (for better or for worse, though the thing that makes them "learning" is that they tend to alter themselves for the better). So, when an error in one part of the algorithm is carried into its product, which then recursively alters sub-components of the algorithm toward greater error (and sustained error), we might say the algorithm has successfully lied to itself.
Put in the most uncomplicated terms I can think of, "lying to one's self" is like the opposite of learning truth; it's when we learn untruth not because of externally deceptive stimulus, but because of our own faulty minds. Bias of all kinds therefore fits the bill of lying to ourselves along with succumbing to any self-generated fallacious appeal!
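The top-down reinforcement of error described in the post might be sketched, very loosely, as training a lower-level model against the outputs implied by a false higher-level conviction instead of against reality. Everything here (the `train` helper, the one-weight linear model, the specific numbers) is my own invented illustration, not a claim about how brains or the post's author's models actually work.

```python
def train(w, xs, target_fn, lr=0.1, steps=200):
    """Per-sample gradient descent on a 1-D linear model y = w * x,
    pulling w toward whatever target_fn supplies as 'truth'."""
    for _ in range(steps):
        for x in xs:
            err = w * x - target_fn(x)
            w -= lr * err * x  # gradient of 0.5 * err**2 with respect to w
    return w

xs = [0.5, 1.0, 1.5]
reality = lambda x: 2.0 * x        # the world really behaves like w = 2
false_belief = lambda x: 5.0 * x   # the conviction clung to at the top level

w_honest = train(1.0, xs, reality)         # converges near 2.0
w_deceived = train(1.0, xs, false_belief)  # converges near 5.0

# The deceived subsystem is now stably, confidently wrong about reality.
real_error = sum((w_deceived * x - reality(x)) ** 2 for x in xs)
print(round(w_honest, 2), round(w_deceived, 2), round(real_error, 2))
```

The deceived learner is not noisy or confused; it converges cleanly, just to the wrong place -- which is one way of cashing out "learning untruth because of our own faulty minds" rather than because of deceptive external stimulus.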
Sure we don't feel like an amalgam of streaming information exchanges among and between learning neural networks, but there's too much evidence to ignore that it is so.
What evidence persuades you that you are a neural network?
****
I sort of feel like the computational approach has to abandon "belief" -- there is no belief formation, there are algorithms which optimize. There is nothing that a belief is about, there are models of math problems through logical switches. And the stream of electrons moves in accord with physical facts.
Similarly, a few levels up, we have algorithms optimizing and modifying themselves in light of some goal set for them. But do the algorithms lie to one another? Do they avoid dissonance? Or are they simply following instructions and giving us a good model for understanding (some of our) learning? It seems the latter to me.
In order for a lie to be successful, and not just count as a lie, it seems to me we have to rely upon some guesses as to how the person we are lying to will take the information. We have to imagine what it would be like to be them. So we have to have some sort of beliefs (model? Possibly if we make an art of lying) about the other person's mind, how they react to different sorts of information, presentation, and their general mood. That way we can craft something that sounds believable to the person we're talking to, even though we know it to be false.
Lying, as simple as it seems and as young as we learn how to do it, is actually a really complicated behavior.
VagabondSpectre, July 16, 2018 at 17:38
What evidence persuades you that you are a neural network?
Brain damage in various places can have corresponding effects on conscious experience and mental faculties. The taking of drugs for instance interacts with individual neurons and neuro-receptors in the brain and can cause drastic effects on what we think and feel. Artificial neural networks which have proven capable of a certain aspect of learning were inspired by biological neural networks, which in and of itself is enough to convince me that I am in large part a neural network (or at least the part of me that learns, which happens to be the best part :wink: ).
I sort of feel like the computational approach has to abandon "belief" -- there is no belief formation, there are algorithms which optimize. There is nothing that a belief is about, there are models of math problems through logical switches. And the stream of electrons move in accord with physical facts.
Granted, I cannot solve the hard problem of consciousness and explain how the outcome of an algorithm can seem like a "belief". As it sits physically in a network, a belief is abstract, and like raw data in a file it can only usefully be expressed when executed within the larger set of executive functions that cause "beliefs" to spill into our thoughts and out of our mouths. That this network can learn and alter itself on the most fundamental level is why it differs from an ordinary algorithm. When exposed to a world of varied and complex stimulus, there's no way of precisely predicting how such a network or algorithm will respond; it learns and evolves chaotically.
Similarly, a few levels up, we have algorithms optimizing and modifying themselves in light of some goal set for them. But do the algorithms lie to one another? Do they avoid dissonance? Or are they simply following instructions and giving us a good model for understanding (some of our) learning? It seems the latter to me.
To an algorithm, dissonance comes in the form of a weighted error value. The bigger the error, the greater the dissonance (and the more drastic the self-correction). End results such as intent and feeling are of the ineffable whole rather than a specific part, or so it seems; my brain doesn't feel, it reacts mechanically, but its abstract product - the mind with awareness - seems to.
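The idea of dissonance as a weighted error value, where bigger errors produce more drastic self-correction, is essentially the delta rule from standard learning theory. A minimal sketch, with the function name, the scalar "belief", and all numbers being my own illustrative assumptions rather than anything from the post:

```python
def correct(belief, observation, weight=0.5):
    """Nudge a scalar 'belief' toward an observation.

    The size of the correction is proportional to the size of the error,
    so greater 'dissonance' yields a more drastic self-correction."""
    error = observation - belief
    return belief + weight * error, abs(error)

small_shift, small_dissonance = correct(10.0, 11.0)  # mild mismatch
large_shift, large_dissonance = correct(10.0, 30.0)  # drastic mismatch

print(small_shift, small_dissonance)  # 10.5 1.0
print(large_shift, large_dissonance)  # 20.0 20.0
```

Nothing in this mechanical update "feels" dissonance, which is the point of the post's closing remark: the error-weighted reaction belongs to the parts, while the felt quality seems to belong only to the abstract whole.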
What is it about Jesus and the Buddha that makes them have undivided minds?
Bearing in mind that you're asking me, unenlightened (surely a foolish move?), I think it is a matter of identification.
So, for example, there are facts about where I was born and what kind of passport I have, and then there is the identity of 'Englishman'. Or there are facts about what I have read and studied and thought over, and then there is the identity of 'philosopher'.
Identity is somehow more than the facts; it is a commitment to the facts; an investment in the significance of the facts. And this creates a separation, of a central self in the mind - I am an English philosopher. Something to protect against, well everything, including whatever else might be the facts of what I am.
Since lying is deliberately misrepresenting one's own thought and belief, and it is always done in situations when the speaker believes that they ought not allow others to know what they think and believe, it seems to me that one cannot lie to oneself
— creativesoul
What if we are of two thoughts?
I believe something good about myself. I know that it is false. These are in conflict with one another.
Knowing that 'X' is false makes it impossible to believe 'X'. I believe 'X' about myself. I cannot do both, know that 'X' is false(about myself) and believe that 'X' is true(about myself).
As soon as we become aware that 'X' is false, we cannot possibly believe otherwise. That holds good in cases where 'X' is true, but we believe 'X' is false. If we believe 'X', then we believe 'X' is true; is the case; corresponds to fact/reality; is the way things are; etc. We cannot do both, believe 'X' and know that 'X' is not true; is not the case; does not correspond to fact/reality; is not the way things are; etc.
If we are of one mind then I don't think we could lie to ourselves. I agree with that -- that's why I thought unenlightened made a good point in saying we'd have to have a divided mind in order for us to lie successfully, and not just be delusional or some such.
Well, strictly speaking 'one' who has two minds is two... not one. We cannot be of two minds, strictly speaking... aside from having some sort of multiple personality disorder. These are common in cases of tremendous childhood trauma. It's a coping mechanism. Since the facts are too much for the one individual to bear, the one 'creates' an alternative persona as a means to 'split up' the burdens...
I see nothing wrong with saying that people of one mind can hold contradictory beliefs. I would wager that everyone does, at least during some period of their life. Some become aware of this and choose. Others become aware and suspend judgment. Others become aware and struggle to grasp what's going on, and thus chalk it up to being normal, or some other ad hoc explanation. Others never become aware.
There is some tremendous difficulty involved in becoming aware of one's own false belief, assuming one wants to correct the situation.
It is also quite common to be uncertain about something or other. These last situations I've mentioned are often spoken of in terms of "being of two minds", and that makes perfect sense in everyday parlance.
...when one holds that lies are always false.
— creativesoul
At least in a general sense I'd say that's what lying is -- to tell someone a falsehood while knowing it is true in order to deceive them. So I'd say that in the case of telling someone about my own thoughts then I'd be lying if I told them something I do not really think -- that this is a particular case of lying, but that lying doesn't have to be about my own thoughts. It could also be about whether I have the money for the bill.
I think you mean to say that lying is -- to tell someone a falsehood while knowing it is false.
If you believe you have the money for the bill, and you state otherwise, then you've deliberately misrepresented your own thought and belief. If you do not believe that you have the money for the bill, and you state otherwise, then you've deliberately misrepresented your own thought and belief. Both are cases of lying.
Those lies could be true. Here's how...
You could be wrong about how much money you have. Thus, if you believed you had enough, and stated that you did not, you would be lying. Now, if by chance, you had forgotten how much money you'd spent over the past weekend, you would have less than you believed. So, the belief that you had enough would be false, and yet the statement(the lie) that you did not would be true.
Lying has less to do with truth, and more to do with thought and belief. That is, lies themselves consist of statements that can be either true or false, but the lie is always told by someone deliberately misrepresenting what they think and/or believe.
Lying, as simple as it seems and as young as we learn how to do it, is actually a really complicated behavior.
It's not that complicated. One always knows when they have just said something that they do not believe. An honest speaker will immediately correct themselves in an authentic, accidental situation of misspeaking. The dishonest speaker will not, and will claim that they misspoke if and when another calls them on it at a later date...
What must be the case in order to successfully lie to yourself?
I think most people's favorite method is convincing themselves, persuading themselves that they know something which they do not. (Second place is probably convincing themselves that they do not know something which they damn well do.) I'd count that as lying.
What must be the case in order to successfully lie to yourself?
— Moliere
I think most people's favorite method is convincing themselves, persuading themselves that they know something which they do not. (Second place is probably convincing themselves that they do not know something which they damn well do.) I'd count that as lying.
When someone believes that they know something that they do not, they hold false belief about themselves. Holding false belief is neither necessary nor sufficient for being a lie. Being convinced that one does not know something when they do, is - once again - being mistaken about oneself. Again, not dishonest or insincere, but rather just plain 'ole being mistaken... holding false belief.
Reply to creativesoul
You missed the process part. Sometimes you cherry-pick the evidence, and you know you're cherry-picking, and you know you shouldn't, but you do it anyway. A sort of cognitive akrasia. With others, it's easier: you just say something you know to be false. With yourself, it usually takes a more sustained effort.
It's a foreign notion. Could you elaborate, so I can know more about this notion of how one can deliberately misrepresent their own thought and belief to themselves?
Sometimes you cherry-pick the evidence, and you know you're cherry-picking, and you know you shouldn't, but you do it anyway...
I do not see how this is anything other than one who knows that they are doing something that they should not. Eventually... what? They deliberately trick themselves into thinking it's ok?
Allow me to present for your consideration, the notion of affirmations.
One is encouraged in some psychological quarters to seek to change the way one thinks. "Say to yourself, 'every day in every way, I'm getting better and better.'". By repetition, the theory goes, one becomes convinced of something one did not believe.
A sportsman will psyche himself up in this sort of way - 'I am the greatest', and it works, at least to an extent. Perhaps philosophers can do the same - try saying to yourself, "I am such a deep thinker, I can even appreciate unenlightened's posts." It might take a lot of repetitions, and it won't actually make either of us smarter, but don't tell yourself this, tell yourself that it really works, because it really works.
Reply to unenlightened Hmm, but there is evidence that affirmations can help produce a better self-image, and greater self-confidence, so long as they end up replacing negative thought patterns, as opposed to merely supervening on top of them.
Metaphysician Undercover, July 18, 2018 at 11:03
Since lying is deliberately misrepresenting one's own thought and belief, and it is always done in situations when the speaker believes that they ought not allow others to know what they think and believe, it seems to me that one cannot lie to oneself.
Forgetting is very real. When a person represents to oneself a memory, which is not really a memory, but something imagined, because the real thing has been forgotten, then that person is misrepresenting one's own thought and belief. This is actually very common, that a person represents something imaginary to oneself as a memory. And the person doing the "remembering" very quickly overlooks, and forgets the division between the aspects of the memory which are real, and which are imagined.
That is why two people can both say "I remember the event this way", when the two ways are contradictory. The two people will both argue sincerely that it must be my way because I remember it that way, when it is impossible that both ways are correct because they are contradictory. Are you married?
Hmm, but there is evidence that affirmations can help produce a better self-image, and greater self-confidence, so long as they end up replacing negative thought patterns, as opposed to merely supervening on top of them.
There is evidence that lying on your CV can get you a better job, as long as you don't get found out.
But my point is that there is a self-image, positive or negative, and the image acts. This is demonstrated by the fact that when my self-image changes, my actions change. Affirmations work! And they work in exactly the same way as compliments or insults coming from others do. They build an image and the image acts.
But to have an image that acts is to have a divided mind; it is to be running a simulation of oneself and letting that run one's life. One performs one's identity.
But to have an image that acts is to have a divided mind; it is to be running a simulation of oneself and letting that run one's life. One performs one's identity.
How does one stop acting from the image?
Pattern-chaser, July 18, 2018 at 17:03
By being yourself. You have a self-image, we all do, but it is not you, and it cannot and does not compel you to act. You take note of things that happen, and things that are said, and sometimes your way of behaving (acting) changes as a result. But it is you who decide to change, and it is you, not your self-image, that acts.
So, for example, there are facts about where I was born and what kind of passport I have, and then there is the identity of 'Englishman'. Or there are facts about what I have read and studied and thought over, and then there is the identity of 'philosopher'.
Identity is somehow more than the facts; it is a commitment to the facts; an investment in the significance of the facts. And this creates a separation, of a central self in the mind - I am an English philosopher. Something to protect against, well everything, including whatever else might be the facts of what I am.
So a whole mind would be one without an identity, without a commitment to certain facts. It would accept all the facts about itself as relevant to itself, or would be committed to no facts about itself at all. A person with a whole mind would not have an identity to protect or project.
You know how it's impossible to walk any great distance** if your stride with one leg differs slightly from your stride with the other? Now tell yourself at each step that it's only a little different, and that can't make much difference. It's like that: you relax your cognitive standard just a bit, and indeed it does not make the inferential step you're taking invalid, but if you keep compounding this little compromise you end up in the wrong place. I'd call this a kind of lying to yourself and it's incredibly pervasive.
** in a straight line
Metaphysician Undercover, July 19, 2018 at 01:45
I didn't say that. I see, as usual, you didn't read my post, responding just to an out of context word.
I said that filling in the blanks with imagination, where memory leaves things out, and representing this to oneself as memory, is misrepresenting one's own thought. In recalling distant memories it is difficult to distinguish aspects of "true memory" from imagination because "the memory" changes over time. If one represents this to oneself as "true memory" when there are aspects of imagination which have been mixed in over time, this is misrepresentation.
You know how it's impossible to walk any great distance if your stride with one leg differs slightly from your stride with the other? Now tell yourself at each step that it's only a little different, and that can't make much difference. It's like that: you relax your cognitive standard just a bit, and indeed it does not make the inferential step you're taking invalid, but if you keep compounding this little compromise you end up in the wrong place. I'd call this a kind of lying to yourself and it's incredibly pervasive.
Are you talking about cases where someone changes some standard they hold?
Metaphysician Undercover, July 19, 2018 at 02:13
Reply to creativesoul
It's not forgetting which is deliberate misrepresentation of one's own thought and belief, it is remembering which can be such. This is the case when aspects of the event which has been remembered, have been forgotten and replaced by the imagination. These things which have been produced by the imagination are deliberately misrepresented as memories.
Reply to creativesoul
I remember hearing years ago that it's common for emergency rooms to have a spike in admissions just before dawn. The explanation was people lying awake all night telling themselves "It's nothing" and eventually accepting that something was terribly wrong.
Are you really not familiar with any of these phenomena?
It's not so much that I haven't heard of such reports, it's that I'm questioning the reporting itself. I do not see how any of it qualifies as deliberately misrepresenting one's own thought and belief to oneself.
Reply to creativesoul
Yes, I understand that as far as you're concerned the phrase "lying to yourself" is just a contradiction. But it's a phrase we all use, so what are the options?
It's just an idiom and the word "lying" is not meant literally.
People do literally deceive themselves, even though you, and maybe none of us, don't quite understand how that's possible.
Your criterion for lying is too narrow and leaves out this case and perhaps others, like "lying by omission".
Stage magic and storytelling both include techniques that rely on our capacity for self-deception. Sometimes the magician, instead of trying to hide how a trick is done, can get the audience members themselves to dismiss the solution, and this is much more effective. A movie can present a character that's a little "off" but not make a big deal about it, and the viewers will mostly decide not to worry about him, until the third reel when it turns out he's the killer.
You could say these are cases of deception, but really it's just giving us the opportunity to deceive ourselves and most of us are generally quite prepared to do so.
Self-deception - which I presume is the focus of this thread - is perhaps best not modelled on the binary relation of A deceiving B (even where A and B are the same person). After all, I could deceive myself without engaging in self-deception - an example: suppose I am in the army on a shooting range, and I am charged with camouflaging targets. I do the job so well that even I cannot tell the targets from the bushes. I've deceived myself, but it's not a case of self-deception. Someone earlier in this thread mentioned the idea that self-deception (lying to oneself) is more akin to giving yourself bad reasons for not pushing yourself to the end of a chain of reasoning that will definitively reach a conclusion you do not like. That seems right to me and doesn't involve too much metaphysical nonsense about split selves etc.
Yes, I understand that as far as you're concerned the phrase "lying to yourself" is just a contradiction. But it's a phrase we all use, so what are the options?
It's just an idiom and the word "lying" is not meant literally.
People do literally deceive themselves, even though you, and maybe none of us, don't quite understand how that's possible.
Your criterion for lying is too narrow and leaves out this case and perhaps others, like "lying by omission".
These options offer little more than unnecessary and unhelpful restriction to the considerations here.
My notion of what counts as a lie need not exhaust all other sensible notions/uses of "lying" in order for it to be able to correctly and irrefutably set out that which we all agree is - most certainly - a lie (an insincere speech act).
It is humanly impossible to knowingly believe a falsehood. Deception - in and of itself - comes in many forms. One of which is lying to another. One cannot deceive oneself. That's pure unadulterated nonsense. Being tricked requires not knowing you're being tricked. Tricking another requires knowing you're tricking. One cannot both know they are tricking themself and not know that they're being tricked.
People say that it is possible to deceive ourselves. So what? Saying that that happens doesn't make it so. People use the term "truth" as a synonym for one's worldview(people conflate truth and belief). That has no bearing upon how we assess a much more disciplined and sensible use.
Calling a criterion for lying 'too narrow' implies that there are some lies that that criterion cannot account for. That's quite the specious claim. Looks strong until it is given some serious thought.
All the different conceptions sharing the same name fail in some way or other to be able to account for the others. That is precisely how we arrive at different ones. That is not a flaw in my argument about lies. Rather, it is a necessary feature of all such arguments.
...it's a phrase we all use, so what are the options?
Well, who's asking? If it is not the image asking, then you have already stopped. So it must be the image asking how not to act, and then it is obvious that there is absolutely nothing that the image can do that is not the acting of the image. I think if one (it ought to be two, really) could completely grasp that the image can do nothing to help in this situation, that one is completely helpless, then one simply does stop. One gives up.
Stage magic and storytelling both include techniques that rely on our capacity for self-deception. Sometimes the magician, instead of trying to hide how a trick is done, can get the audience members themselves to dismiss the solution, and this is much more effective. A movie can present a character that's a little "off" but not make a big deal about it, and the viewers will mostly decide not to worry about him, until the third reel when it turns out he's the killer.
You could say these are cases of deception, but really it's just giving us the opportunity to deceive ourselves and most of us are generally quite prepared to do so.
Being intentionally led by another to believe something is not self-deception.
They're nearly two sides of the same coin, the same virtue and the same fault: failing to aim for truth in what you say, failing to aim for truth in what you think.
It is humanly impossible to knowingly believe a falsehood.
I daresay you haven't had much practice. When I was your age, I always did it for half-an-hour a day. Why, sometimes I've believed as many as six falsehoods before breakfast.
[quote=George Orwell]He gazed up at the enormous face. Forty years it had taken him to learn what kind of smile was hidden beneath the dark moustache. O cruel, needless misunderstanding! O stubborn, self-willed exile from the loving breast! Two gin-scented tears trickled down the sides of his nose. But it was all right, everything was all right, the struggle was finished. He had won the victory over himself. He loved Big Brother. [/quote]
Chomsky said somewhere that his life's work was organized around two complementary problems that he called "Plato's Problem" and "Orwell's Problem". Plato's problem is: How do we know so much, given so little evidence? While Orwell's Problem is: How do we know so little, given so much evidence?
Knowing that 'X' is false makes it impossible to believe 'X'. I believe 'X' about myself. I cannot do both, know that 'X' is false(about myself) and believe that 'X' is true(about myself).
As soon as we become aware that 'X' is false, we cannot possibly believe otherwise. That holds good in cases where 'X' is true, but we believe 'X' is false. If we believe 'X', then we believe 'X' is true; is the case; corresponds to fact/reality; is the way things are; etc. We cannot do both, believe 'X' and know that 'X' is not true; is not the case; does not correspond to fact/reality; is not the way things are; etc.
I'd say that, given a dimension of time, this could be overcome. So even if right now I know "~X", I could then choose to believe "X" and forget or ignore "~X".
With a dimension of time we also have changes of awareness. So at different moments we can come to be aware of different things.
Well, strictly speaking 'one' who has two minds is two... not one. We cannot be of two minds, strictly speaking... aside from having some sort of multiple personality disorder. These are common in cases of tremendous childhood trauma. It's a coping mechanism. Since the facts are too much for the one individual to bear, the one 'creates' an alternative persona as a means to 'split up' the burdens...
I see nothing wrong with saying that people of one mind can hold contradictory beliefs. I would wager that everyone does, at least during some period of their life. Some become aware of this and choose. Others become aware and suspend judgment. Others become aware and struggle to grasp what's going on, and thus chalk it up to being normal, or some other ad hoc explanation. Others never become aware.
There is some tremendous difficulty involved in becoming aware of one's own false belief, assuming one wants to correct the situation.
It is also quite common to be uncertain about something or other. These last situations I've mentioned are often spoken of in terms of "being of two minds", and that makes perfect sense in everyday parlance.
I think we're basically in agreement on lying. At least I'm most interested in this more robust theory of lying, as opposed to delusion, just because it's the more difficult case -- and you seem to agree that delusion is possible, just not lying.
So your main point of disagreement is really that being of two minds is not normal -- it would have to be a pathology of some kind at play in order for someone to lie to themselves.
I think you mean to say that lying is -- to tell someone a falsehood while knowing it is false.
I had in mind saying "I do not have the money" when "I have the money" is true -- but yeah, I was flipping the signs in my head. The former would be a falsehood, the latter a truth, and you'd be saying the falsehood and not the truth.
Lying has less to do with truth, and more to do with thought and belief. That is, lies themselves consist of statements that can be either true or false, but the lie is always told by someone deliberately misrepresenting what they think and/or believe.
I think this is a minor disagreement between us. I see what you mean, but I'd say that you'd have to know something to be true and then say its opposite, whereas you'd say that it comes down to belief -- so you believe "X" is true, but you say "~X".
Good enough for me. I think the split-mind disagreement is the stronger of the two. Yeah?
Self-deception - which I presume is the focus of this thread -
Yup! :D
is perhaps best not modelled on the binary relation of A deceiving B (even where A and B are the same person). After all, I could deceive myself without engaging in self-deception - an example: suppose I am in the army on a shooting range, and I am charged with camouflaging targets. I do the job so well that even I cannot tell the targets from the bushes. I've deceived myself, but it's not a case of self-deception. Someone earlier in this thread mentioned the idea that self-deception (lying to oneself) is more akin to giving yourself bad reasons for not pushing yourself to the end of a chain of reasoning that will definitively reach a conclusion you do not like.
So it's something, in your view, that happens along a chain of reasoning. So you might have the notion that this is going somewhere bad, and then come up with some reasons that you don't scrutinize too deeply to make it go somewhere good.
That seems right to me and doesn't involve too much metaphysical nonsense about split selves etc.
What's nonsensical about a split self? Is it any more nonsensical than a singular self?
Metaphysician Undercover, July 20, 2018 at 01:27
It is humanly impossible to knowingly believe a falsehood. Deception - in and of itself - comes in many forms. One of which is lying to another. One cannot deceive oneself. That's pure unadulterated nonsense
What you are refusing to take account of, is the fact that people change as time passes, and their minds change as well. Deception is an act in which the act of the deceiver is prior in time to the falsity being believed as true by the deceived, the result of the deception. One is the cause, the other the effect. So the deceiver hands a falsehood and the receiver takes it and is deceived.
There is no logical reason to conclude that one cannot deceive oneself. What the person at an earlier time knew as a falsity is represented to oneself at that time as a truth. The same person at a later time, having forgotten the act of deception, believes the falsity as a truth. Never in this whole process does the person "knowingly believe a falsehood" as you insist is necessary for self-deception. The person at one time tells oneself that a falsity is the truth, not actually believing it is the truth. The same person at a later time believes it to be the truth without remembering that at one time it was not believed to be the truth.
So your main point of disagreement is really that being of two minds is not normal -- it would have to be a pathology of some kind at play in order for someone to lie to themselves.
Not really. I disagree with the framework itself. I acknowledge that many, if not most, folk talk in ways that lead to self-contradiction and/or incoherence. I acknowledge that these are meaningful ways to talk about stuff. My point is that we can be wrong about some stuff, particularly anything and everything that exists in its entirety prior to our becoming aware of it. Our mental ongoings are precisely such things. Thought and belief are mental ongoings. We can get such things wrong. If we work from an ill-conceived notion of thought and belief, our notion of lying will suffer the consequences along with all else we say about ourselves.
Deliberately misrepresenting one's own thought and belief is always a lie.
Much talk about lying involves talk about "telling the truth" as well. This has all sorts of problems too. If one must only say what's true in order to be "telling the truth", then the only way that one could do such a thing is if their entire belief system is infallible. Truth cannot be false. That's the problem. We all have false belief. "Telling the truth" doesn't require omniscience. It requires honesty in speech. It requires saying what one believes to be true. "Tell the truth, the whole truth, and nothing but the truth" is rubbish on its face. It is either an impossible criterion to meet, or it conflates truth and belief. It's a prima facie example of language based upon gross misunderstanding of how thought, belief, meaning, and truth work together long before we become aware of our own mental ongoings...
p1 Being tricked requires not knowing you're being tricked.
p2 Tricking another requires knowing you're tricking.
c1 Tricking oneself requires knowing that one is tricking oneself, and not knowing that one is being tricked.
p4 One cannot do both, know s/he is tricking him/herself, and not know that s/he is being tricked.
C2 One cannot trick oneself
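The argument above can be put in formal terms. Here is a minimal sketch in Lean (my own formalization for illustration; the proposition names `SelfTrick` and `knows` are assumptions, not from the thread). The contradiction goes through only because both premises index the same agent's knowledge at the same moment; the time-lag objection raised later in the thread amounts to splitting `knows` into a knows-then and a knows-now, after which no contradiction follows.

```lean
-- `SelfTrick`: the agent is tricking themself at a given moment.
-- `knows`: the agent knows, at that moment, that the trick is underway.
variable (SelfTrick knows : Prop)

theorem no_self_trick
    (p2 : SelfTrick → knows)    -- the tricker must know they are tricking
    (p1 : SelfTrick → ¬knows) : -- the tricked must not know they are being tricked
    ¬SelfTrick :=               -- c2: one cannot trick oneself
  fun h => (p1 h) (p2 h)        -- SelfTrick would yield knows ∧ ¬knows
```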
I remember hearing years ago that it's common for emergency rooms to have a spike in admissions just before dawn. The explanation was people lying awake all night telling themselves "It's nothing" and eventually accepting that something was terribly wrong.
What's wrong with saying that they believed nothing was wrong, but after all night passing without change in their condition, they began to believe that something was wrong? That is, they changed their belief, as compared/contrasted to misrepresenting it to themselves.
Lying has less to do with truth, and more to do with thought and belief. That is, lies themselves consist of statements that can be either true or false, but the lie is always told by someone deliberately misrepresenting what they think and/or believe.
— creativesoul
I think this is a minor disagreement between us. I see what you mean, but I'd say that you'd have to know something to be true and then say its opposite...
Well, I once held that view as well...
Joe watches a car accident happen. A brown car ran a red light and crashed into a blue one. Joe knows the driver of the brown car, and he wants to help her avoid fault charges, so he lies to Brian about what he saw. Brian believes Joe, also knows both drivers, and does not like the driver of the brown car. So, even though he did not witness the accident, he claimed he did, and when asked whose fault the accident was, Brian states something that he does not believe to be true, but which is, when he says "the brown car"...
If being a lie requires knowing something to be true, and saying its opposite, then Joe lied, but Brian only lied when he said that he too saw it. Both deliberately misrepresented their own thought and belief.
On my view, they both lied. Joe once, and Brian twice. Joe's lie was false. Brian's first one was false, but his second was true...
What's wrong with saying that they believed nothing was wrong, but after all night passing without change in their condition, they began to believe that something was wrong? That is, they changed their belief, as compared/contrasted to misrepresenting it to themselves.
Because that happens too, and it's a different phenomenon. What you're missing is that self-deception is usually strongly motivated and irrational.
Here's a salient example: relationships. Self-deception often involves manipulation of evidence, but when it comes to figuring out what other people think and feel, a lot of that evidence is subtle and ephemeral. We're good at picking up on these tiny tells, almost unnoticeable variations in inflection, expression, eye movement and focus, tone of voice -- all of that stuff we process without usually being consciously aware of it. We just know.
My point is this: it's particularly easy to get away with fooling yourself in this context because your "judgment" was arrived at automatically based on "evidence" you probably couldn't articulate. And that makes it all too easy to dismiss. You don't want to believe something's bothering your spouse? No problem: there's not much you could really point to as evidence anyway. (It was just a feeling you had.) But anyone who's ever done this knows they were fooling themselves.
Or, from the other side, want to believe that cute girl in your homeroom, or at work, or making your coffee, is into you? You can probably find something to count as "evidence". For most of us, enough contrary evidence arrives and quickly enough that a restraining order is unnecessary.
VagabondSpectre, July 20, 2018 at 05:04
It occurs to me that we can very easily and in the true sense of the phrase 'lie to ourselves' if we're content with deceiving our future-selves rather than our present-selves; the necessary ingredients are forgetfulness or suggestibility (as you can probably imagine, someone with Alzheimer's could quite effectively lie to their future selves).
A weak case can be built upon the consumption of fictional entertainment, where immersion causes a suspension of disbelief. While we're never fully deceived by fiction (and in any case it is the fictional work doing the deceiving), the intent to become immersed in the first place (which requires some, albeit weak or pseudo-level, change of belief) can constitute a form of intentional self-deception. "Escapism" highlights the difference between this kind of intentional self-deception and the mere consideration of hypotheticals.
A strong case for the intentional deception of our future selves exists in the case of intentional mood altering practices (though they do not necessarily come with specific belief changes, mood changes can easily cause changes in belief). It may not be "lying" to instigate a general mood change but it's definitely intentional self-manipulation to purposefully alter one's mood by pro/prescribing substances, physical activities, or other means. Mood changes would most directly affect emotionally contingent opinions, but they can and do also impact actual beliefs. For instance, someone who lacks confidence might try to consciously project confident body language (i.e: smile more) in hopes that it will impact on how confident they actually feel, and in turn change their beliefs which are in part dependent upon their confidence (i.e: what they can accomplish, the moral nature of the average person, etc...).
Drinking in order to forget or become distracted from an unpleasant reality seems to at least fit the description of "lying by omission". If an omission can be a lie, then at least while inebriated we are possibly being lied to by our former selves...
Because that happens too, and it's a different phenomenon.
Is it though? Seems to be a different explanation of the same phenomenon; one with fewer entities, no self-contradiction, equal explanatory power, and just as much plausibility. Why opt otherwise?
...Here's a salient example: relationships. Self-deception often involves manipulation of evidence, but when it comes to figuring out what other people think and feel, a lot of that evidence is subtle and ephemeral. We're good at picking up on these tiny tells, almost unnoticeable variations in inflection, expression, eye movement and focus, tone of voice -- all of that stuff we process without usually being consciously aware of it. We just know.(emphasis mine)
Are we though? On my view, there are multiple underlying reasons for what is otherwise the same outward behaviour(s). Misattribution of meaning regarding these subtle behaviours that we purportedly 'just know' runs rampant. That is particularly the case between people from vastly differing cultural, familial, and otherwise socially influenced norms.
My point is this: it's particularly easy to get away with fooling yourself in this context because your "judgment" was arrived at automatically based on "evidence" you probably couldn't articulate. And that makes it all too easy to dismiss. You don't want to believe something's bothering your spouse? No problem: there's not much you could really point to as evidence anyway. (It was just a feeling you had.) But anyone who's ever done this knows they were fooling themselves.
Anyone who's ever thought that something was bothering their spouse based upon subtle behaviours, but didn't want to get into it, and so avoided addressing their own 'feeling' and their own belief that something may be wrong, wasn't fooling themselves into believing otherwise. They were rationalizing their own decision not to talk about it. In this case, the person is deliberately not representing their own thought and belief, because they are intentionally avoiding talking about it.
Or, from the other side, want to believe that cute girl in your homeroom, or at work, or making your coffee, is into you? You can probably find something to count as "evidence". For most of us, enough contrary evidence arrives and quickly enough that a restraining order is unnecessary.
Here again, how is this lying to oneself? One either believes that another is into them or not. The belief may very well be unfounded, but it is a belief nonetheless. One is not lying to themselves; rather, they just believe things without good enough reason.
A strong case for the intentional deception of our future selves exists in the case of intentional mood altering practices (though they do not necessarily come with specific belief changes, mood changes can easily cause changes in belief). It may not be "lying" to instigate a general mood change but it's definitely intentional self-manipulation to purposefully alter one's mood by pro/prescribing substances, physical activities, or other means. Mood changes would most directly affect emotionally contingent opinions, but they can and do also impact actual beliefs. For instance, someone who lacks confidence might try to consciously project confident body language (i.e: smile more) in hopes that it will impact on how confident they actually feel, and in turn change their beliefs which are in part dependent upon their confidence (i.e: what they can accomplish, the moral nature of the average person, etc...).
I'm at a loss here...
One can most certainly change the way that they look at the world by virtue of changing the way they talk about it and/or themselves. One can do this deliberately. One can deliberately change the way that they behave as a means to change the way they feel. This can, in turn, change one's belief.
If it takes talking about one person as though they were a plurality of different selves in order to make sense of lying to oneself, it seems to me that it makes better sense to abandon the notion altogether and learn to talk about the same situations in better ways...
VagabondSpectre, July 20, 2018 at 05:34
Because it's intentional belief altering via coercive/irrational means. Is deception the intent to conceal or manipulate or is it the successful concealment/manipulation of objective truth? I'll satisfy both:
Let's say I'm at a singles bar looking for a date, and I know that statistically my chances of being successful are low... Consuming alcohol can make me go from believing it is true that I will likely fail to either forgetting or believing the opposite, even while it remains true that I will likely fail despite the statistical benefits alcohol may confer.
I think if one (it ought to be two, really) could completely grasp that the image can do nothing to help in this situation, that one is completely helpless, then one simply does stop. One gives up.
It is hard to answer, because in saying anything, I am going to be making an image. But as near as I can get, compare :
1. I am an English philosopher. (identity, image)
2. I am writing a post. (activity, fact)
If I make an identity of 'poster', then I am a poster even when I am not posting, or a philosopher when I am not philosophising, or English when I live in France.
Metaphysician Undercover, July 20, 2018 at 10:51
p1 Being tricked requires not knowing you're being tricked.
You're mixing present and past tense "being tricked". "Having been tricked" requires not knowing that you've been tricked. And this is fulfilled when you forget that you've tricked yourself.
p4 One cannot do both, know s/he is tricking him/herself, and not know that s/he is being tricked.
Why not?
The reason I say that lying being based on knowledge or belief is a minor disagreement is because I'm willing to go along with your theory of lying. I'm not so interested in justification, meaning, truth, or belief as much as I am in a theory of mind. So sure, it's a disagreement, but I'm fine with setting the stage as you say -- that lying is the intentional misrepresentation of one's own belief. That fits well enough for me.
It seems to me that if we are of a split mind we could still accomplish this -- adding a dimension of time and some notion of awareness would resolve any sort of conflict. And if this could be demonstrated to be non-pathological, it would even be a possible normal event ("possible" just because that seems a more empirical question that I do not have an answer to).
If it takes talking about one person as though they were a plurality of different selves in order to make sense of lying to oneself, it seems to me that it makes better sense to abandon the notion altogether and learn to talk about the same situations in better ways.
I don't see what makes a singular self better than a divided self. For that matter I'm not sure what would make a divided self better than a singular self, at this point.
One wouldn't need to be a literal two selves within a single mind. I think merely having a divided mind -- of whatever kind -- is enough to count as lying as you define lying. Some part of the mind can deliberately misrepresent a belief to another part of the mind, and our awareness can shift from the one to the other through time.
But what would make either notion a better notion?
If it takes talking about one person as though they were a plurality of different selves in order to make sense of lying to oneself, it seems to me that it makes better sense to abandon the notion altogether and learn to talk about the same situations in better ways...
That seems along the right lines to me. The "splitting of selves" approach (I think it goes by the term "psychological partitioning" in the literature) only makes sense if one tries to force self-deception into the model of one person being deceitful to another. In those cases the key point is that the deceitful person both believes/knows something to be the case and intends that the other should believe the opposite is the case. Self-deception does not seem like that to me; it is more like having a suspicion that something you wish to be true may not be true, but rather than pursuing the chain of reasoning that will decide the issue for you, you give yourself (perhaps bad) reasons for not pursuing that chain of reasoning.
Let's say I'm at a singles bar looking for a date, and I know that statistically my chances of being successful are low
This is a point that should have been made earlier. Beliefs are almost always best thought of as partial, as confidences. You believe you're unlikely to be successful and that you have a chance of being successful. Alcohol either suppresses the former completely, or just futzes with the numbers, so that your chances look better with every drink. You're not going from believing P to believing ~P or something, because you believe both, partially, from the start. Typical self-deception is deliberately mis-calibrating your confidences.
Reply to jkg20 I think what you describe is plausible. I don't think I'd call it lying in the strict sense.
But what makes this plausible description something which actually annuls the act of lying to oneself?
By "splitting of selves" I just mean it generically -- like, I can see multiple ways you or others might parse what that means. Partitioning, or having a tripartite division of mind such as Plato's or Freud's, or as I've been saying just having an awareness which can move from different parts of the mind, or as un has been saying between the image, the self, and the speaker of the sentence making the division. I'm sure there are other ways it could be parsed.
In some sense there is a factual aspect that would need to be investigated, and it could even be case-by-case. But investigating the facts of a mind is something of a tricky business, and deserving of some philosophical scrutiny to understand how a fact might be significant one way or another. And in a sense I think it's worth noting that it may not be just the facts -- as un points out, there could also be commitments of one kind or another in making an identity, which are over and above the facts.
And then even more generally speaking -- what would make this singular self picture a better picture than a split self picture? It must be more than the facts because we could probably reconcile facts either way.
Something that @VagabondSpectre's approach does make me think of, explicitly at least, is that there could also be a difference between a self and a mind. So the self has a seemingly singular quality to it -- we always feel like we're the same person and can entertain, at least in the clear and distinct way philosophers tend to like, only one thought at a time. But the mind can be much wider than the self, and it may not just be the self that lies but the mind.
what would make this singular self picture a better picture than a split self picture?
For some purposes we ignore what's going on under the hood. You, the single individual person, are responsible for what you say, and for the consequences of your decisions. Looking under the hood provides a more nuanced description, but it's really changing the subject.
Reply to Srap Tasmaner Is it? If it is necessary to have a split mind in order for it to be possible for one to lie to oneself, then it seems pretty relevant to me. At least it's a logical next step, under the assumption that this is the only way to parse that phrase into something which is actually lying to oneself, as opposed to it just being a turn of phrase that, in a strict sense, means something else.
Absolutely. In fact, since posting it occurs to me that the concept of "lying" belongs to one level -- the person level, where we hold individuals responsible for their words -- while "self deception" belongs to another level, where we try to understand how we and others think.
I think that's probably right, but we've become so sophisticated that now we hold people responsible for fooling themselves. Which is not completely crazy -- as I said above, I think there are related norms in play here. Both lying and self-deception are violations; they're just not exactly the same violation of exactly the same norm.
p4 One cannot do both, know s/he is tricking him/herself, and not know that s/he is being tricked.
— creativesoul
Why not?
My apologies.
I'm at a momentary loss here. I presumed that was obvious. I am rather prone to mistakenly assuming that my interlocutor has read me, and thus is already amidst the same stream of thought that I've been following for quite some time now.
This ought to help make it more so...
One cannot be tricked into believing something if they know both how they're being tricked, and that they're being tricked.
One who is performing the trickery knows both how and that they're doing it. That is because it's being done purely for the sake of doing so. Intentionally tricking oneself is impossible.
One cannot know how and that one is tricking him/herself and not know how and that one is tricking oneself.
...I'm willing to go along with your theory of lying. I'm not so interested in justification, meaning, truth, or belief as much as I am in a theory of mind. So sure, it's a disagreement, but I'm fine with setting the stage as you say -- that lying is the intentional misrepresentation of one's own belief. That fits well enough for me.
All mind consists entirely of thought and belief on my view. All thought and belief is meaningful and itself presupposes truth... lies notwithstanding. Without introducing meaning, truth, and belief into the mix whatever theory of mind discussed will be utterly incomplete, wouldn't you agree?
One can most certainly change the way that they look at the world by virtue of changing the way they talk about it and/or themselves. One can do this deliberately. One can deliberately change the way that they behave as a means to change the way they feel. This can, in turn, change one's belief.
Because it's intentional belief-altering via coercive/irrational means. Is deception the intent to conceal or manipulate, or is it the successful concealment/manipulation of objective truth? I'll satisfy both:
Let's say I'm at a singles bar looking for a date, and I know that statistically my chances of being successful are low... Consuming alcohol can make me go from believing it is true that I will likely fail to either forgetting or believing the opposite, even while it remains true that I will likely fail despite the statistical benefits alcohol may confer.
Hey, I'm going to deceive you by telling you what I'm going to do, how I'm going to do it, and when I begin doing so.
You wouldn't be deceived, and neither would I.
The example above could also be explained as follows...
People become temporarily inebriated. Poor judgment is an effect of inebriation. If I know that getting drunk increases my confidence level, and if some women like guys with confidence more so than guys without, and I entertain these thoughts in close proximity to one another, I could then deliberately do something with a clear purpose and very well may increase my chances of success.
One cannot intentionally misrepresent their own thought and belief to themself. That is, one cannot lie to oneself.
If a notion permits a meaningful account of "self-deception", it does so by virtue of either a lack of intention, or a plurality of self/mind. The former means is acceptable in light of the criterion above. The latter is not, for those are negations of one another. Intentionally misrepresenting one's thought and belief requires a plurality, as adequately argued heretofore. One is not a plurality.
That seems along the right lines to me. The "splitting of selves" approach (I think it goes by the term "psychological partitioning" in the literature) only makes sense if one tries to force self-deception into the model of one person being deceitful to another. In those cases the key point is that the deceitful person both believes/knows something to be the case and intends that the other should believe the opposite is the case.
I think we're in agreement here.
We know what it is to deceive another. We know what it takes. We know some things that must be both present and not in order for it to happen. That's what makes it deception. This has been argued for heretofore. As before... it takes a plurality. One is not.
One cannot deliberately misrepresent their own thought and belief to oneself. Self-deception requires that. One cannot deceive oneself.
Self-deception does not seem like that to me; it is more like having a suspicion that something you wish to be true may not be true, but rather than pursuing the chain of reasoning that will decide the issue for you, you give yourself (perhaps bad) reasons for not pursuing that chain of reasoning.
We do indeed talk ourselves into and out of things. No one makes a mistake on purpose. Using bad reasoning is a mistake.
One cannot deliberately misrepresent their own thought and belief to oneself. Deceiving another requires doing so. Deceiving oneself is impossible if we hold to the criterion for lying that I've been working from...
A heap is not a plurality of grains? A mind is not a plurality of thoughts? A brain is not a plurality of neurones? A body is not a plurality of cells?
Reply to creativesoul I think our disagreement, then, is that for me self-deception exists as I described, but does not involve lying to oneself if we consider lying to oneself only on the model of one person lying to another. I agree that lying to oneself conceived on that model is just not coherent, it involves a contradiction. However, that needn't make self-deception impossible.
I think our disagreement, then, is that for me self-deception exists as I described, but does not involve lying to oneself if we consider lying to oneself only on the model of one person lying to another. I agree that lying to oneself conceived on that model is just not coherent, it involves a contradiction. However, that needn't make self-deception impossible
A heap is not a plurality of grains? A mind is not a plurality of thoughts? A brain is not a plurality of neurones? A body is not a plurality of cells?
One heap is a plurality of grains. One mind is a plurality of thoughts. One brain is a plurality of neurones. One body is a plurality of cells.
One mind is not a plurality of minds.
One heap is not a plurality of heaps. A plurality of grains is not one grain. One mind is not a plurality of minds. A plurality of thoughts is not one thought. One brain is not a plurality of brains. A plurality of neurones is not one neurone. One body is not a plurality of bodies. A plurality of cells is not one cell.
The process you described is nothing more than using inadequate reasoning. Using inadequate reasoning is a mistake. Mistakes are accidental. That which is accidental cannot be intentional.
So the example is neither intentional nor a case of lying to oneself.
But it might be a plurality of awarenesses, a plurality of intentions, or one element of several of a person. You seem to be ruling out a division on the ground of calling it 'one'. As if someone called 'Honesty' cannot be dishonest.
Laing wrote a book about the divided self; you may not agree with his psychology, but you really cannot rule it out a priori.
As I understand it, a self-deceiver is confronted with a choice to pursue a difficult line of reasoning which he/she suspects (but does not know) might lead to reassessing a cherished belief, but instead of following that line of reasoning finds comforting, probably superficial, reasons for ignoring that line of reasoning and just continuing to maintain the cherished belief. The truth or falsity of the cherished belief might not necessarily matter, incidentally; maybe the belief that the self-deceiver cherishes is in fact true (constructing an example might be interesting - I'll have a think about it) but the self-deceiver is (arguably) at fault from the rational perspective for not having engaged in the ignored reasoning process. The thing about self-deception that is important is that at some level it is rationally blameworthy, there is something the self-deceiver should do but does not do, and this conception I am offering at least allows for the self-deceiver to be blamed in that way.
One mind is not a plurality of minds.
— creativesoul
But it might be a plurality of awarenesses, a plurality of intentions, or one element of several of a person. You seem to be ruling out a division on the ground of calling it 'one'. As if someone called 'Honesty' cannot be dishonest.
Non sequitur.
As if someone called 'Honesty' cannot be a plurality called 'Honesty'. Surely there are plenty of sensible ways to divide up one mind...
Ahem...
One mind is a plurality of thoughts... belief... emotions.
Can these be in conflict with one another? Sure.
Such conflict gives rise to uncertainty, insanity, confusion, disbelief, moral dilemma, solid ground for temporarily suspending one's judgment, and a host of other things too I'm sure.
As I understand it, a self-deceiver is confronted with a choice to pursue a difficult line of reasoning which he/she suspects (but does not know) might lead to reassessing a cherished belief, but instead of following that line of reasoning finds comforting, probably superficial, reasons for ignoring that line of reasoning and just continuing to maintain the cherished belief. The truth or falsity of the cherished belief might not necessarily matter, incidentally; maybe the belief that the self-deceiver cherishes is in fact true (constructing an example might be interesting - I'll have a think about it) but the self-deceiver is (arguably) at fault from the rational perspective for not having engaged in the ignored reasoning process. The thing about self-deception that is important is that at some level it is rationally blameworthy, there is something the self-deceiver should do but does not do, and this conception I am offering at least allows for the self-deceiver to be blamed in that way.
One cannot make a mistake intentionally. Mistakes are accidental. Blameworthiness doesn't belong here.
Laing wrote a book about the divided self; you may not agree with his psychology, but you really cannot rule it out a priori.
I don't even know what "ruling it out a priori" is supposed to mean. If it is impossible for one to deliberately misrepresent their own thought and belief to oneself, then any and all arguments which assume or validly conclude that are themselves based upon at least one false premiss.
Responsibility for consequences of mistakes belongs here (is in line with my position). Blameworthiness doesn't. Both are tangential enough to warrant another thread. This one is supposed to be about the notion of lying to oneself.
As I understand it, a self-deceiver is confronted with a choice to pursue a difficult line of reasoning which he/she suspects (but does not know) might lead to reassessing a cherished belief, but instead of following that line of reasoning finds comforting, probably superficial, reasons for ignoring that line of reasoning and just continuing to maintain the cherished belief. The truth or falsity of the cherished belief might not necessarily matter, incidentally; maybe the belief that the self-deceiver cherishes is in fact true (constructing an example might be interesting - I'll have a think about it) but the self-deceiver is (arguably) at fault from the rational perspective for not having engaged in the ignored reasoning process.
I think that this is a common form of self-deception (there are numerous different types). Let's suppose that I don't know with any degree of certainty that X is the case (perhaps someone just told me X is the case and I believed it). So I believe that X is the case though I have no reason to be certain about this. Over time I will forget that I am truly uncertain that X is the case, remembering only that I believe X is the case. In this frame of mind, I may perceive hints of evidence that X is really not the case, but I may not act to reassess that belief because I deceive myself by thinking that I would not hold the belief, X is the case, without properly assessing it in the first place.
In other words, I falsely believe that if I hold a belief, that belief must have already been properly justified. The deeper and more fundamental the belief (like a Wittgensteinian "hinge-proposition"), the deeper the self-deception that the belief is beyond doubt. So the self-deception involves telling oneself that such a deep-seated belief cannot be doubted when in reality the person knows that it can and ought to be doubted.
So creativesoul holds the belief that it is impossible for a person to self-deceive. But clearly this is a belief which can and ought to be doubted. Creative self-deceives by refusing to look at the vast evidence presented, believing only the prejudice, refusing to doubt what ought to be doubted, merely insisting over and over again that self-deception is impossible.
As I understand it, a self-deceiver is confronted with a choice to pursue a difficult line of reasoning which he/she suspects (but does not know) might lead to reassessing a cherished belief, but instead of following that line of reasoning finds comforting, probably superficial, reasons for ignoring that line of reasoning and just continuing to maintain the cherished belief. The truth or falsity of the cherished belief might not necessarily matter, incidentally; maybe the belief that the self-deceiver cherishes is in fact true (constructing an example might be interesting - I'll have a think about it) but the self-deceiver is (arguably) at fault from the rational perspective for not having engaged in the ignored reasoning process. The thing about self-deception that is important is that at some level it is rationally blameworthy, there is something the self-deceiver should do but does not do, and this conception I am offering at least allows for the self-deceiver to be blamed in that way.
Setting aside this notion of blameworthiness, but taking on the rest...
One has a deeply held belief. That is, one has unshakable conviction that something is true, or is the case, or some such. One is confronted with a line of reasoning that places that belief in question. The truth or falsity doesn't matter...
Full stop. What else about the belief would be reassessed?
So here we are considering an example where a line of reasoning is being ignored that would otherwise have led one to reassess a deeply held belief. It's being said here that it's not so much the truth/falsity of the belief that matters. What matters more, according to this purported notion of self-deception, is that the person ought to have done something that they did not.
So self deception is when one doesn't do what another thinks they ought?
:worry:
Metaphysician Undercover, July 22, 2018 at 02:17
Looks like a situation where all examples of self-deception only have one thing in common that makes them so... they are all called "self deception". Like Witt's "game" situation.
Is that so though?
I mean surely self-deception is something in and of itself, right? In that case there should be something or some things that make it qualify. Ahem... a criterion.
I mean surely self-deception is something in and of itself, right? In that case there should be something or some things that make it qualify. Ahem... a criterion.
Anyone here have one?
If I am mistaken about the fact that I am a good philosopher, and someone points out that my thinking is sloppy and my ideas confused, then on seeing the evidence I will correct the mistake. "I thought I was quite good at this, but I see I was wrong. No worries."
If, OTOH, I am deceiving myself that I am a good philosopher, and the same thing happens, I will resist, my feelings will be hurt, I will get angry and dismissive, I will attack the evidence, make excuses, and so on. Folks will commonly die to maintain a false image of themselves.
I gave this criterion a long way back- "commitment".
So self deception is when one doesn't do what another thinks they ought?
There is something wrong in being self-deceptive, one is doing something one should not be doing. Note that there is a difference between one person being deceitful to another and one person simply deceiving another (magicians deceive people, but when they do so, they are not being deceitful). What in general that is added to deceptive behaviour in order to make it deceitful is that some social norms of acceptable behaviour are being violated. Self-deception retains from deceitfulness that aspect of its being wrong, and since that is based on social norms it would follow that when one is deceiving oneself it involves going against what others believe one ought to be doing/have done. Solitary self-deception probably makes as little sense as solitary rule following.
As for a criterion, let me have a stab at one (there may be others): refusal to engage in a rational process that one is aware exists, that one can engage fully in and where that refusal is motivated by the fact that it may undermine a cherished belief (might need to add that an alternative rational process is engaged in which provides - perhaps superficial - support for the cherished belief).
And I do for the moment at least stand by the idea that the truth or falsity of the cherished belief need not be relevant as to whether a person is engaging in self-deception. Suppose John is accused of murdering Janet. John's mother believes that John did not murder Janet. The evidence is stacked up against John - video surveillance, fingerprints, motive, means, opportunity etc. Rather than examine the evidence for what it is, John's mother insists that she knows her Johnny, he's a gentle boy that she brought up and would not harm a soul, and so he did not murder that man-eating Janet. Now, suppose that John really did want to murder Janet, and on the day in question went with malice aforethought to her apartment to kill her. Arriving there, John finds Janet already butchered, so he flees the scene after accidentally leaving some top quality fingerprints, and is caught on camera entering and exiting the building. John's mother has a true belief that her son did not murder Janet, but she's still engaging in self-deception - at least arguably. Of course, unpacking the example might expose it as not showing what I think it shows, but examples have to start somewhere.
If I am mistaken about the fact that I am a good philosopher, and someone points out that my thinking is sloppy and my ideas confused, then on seeing the evidence I will correct the mistake. "I thought I was quite good at this, but I see I was wrong. No worries."
If, OTOH, I am deceiving myself that I am a good philosopher, and the same thing happens, I will resist, my feelings will be hurt, I will get angry and dismissive, I will attack the evidence, make excuses, and so on. Folks will commonly die to maintain a false image of themselves.
I gave this criterion a long way back- "commitment".
Commitment isn't enough though, Un. Necessary? Surely. Sufficient or adequate? Not even close. I think we agree.
We may get somewhere helpful thinking along these lines. I'm curious enough to see if I have this right(if I'm understanding your claims) and if the path will end up being a helpful one...
So, deceiving oneself is always being mistaken, but not the other way around. The difference between being mistaken and deceiving oneself is that one who is deceiving oneself takes being told that they're mistaken personally, so much so that they are incapable of correcting the mistake. This overly general parsing is good enough for now, I think.
On my view...
It is humanly impossible to knowingly believe a falsehood. We all have or have had false belief somewhere along the line. Belief systems (world-views) are self-contained. So, we cannot see our own mistakes. Correcting mistakes requires first seeing them. Seeing them requires an other. Thus, in order to even be able to correct our own mistakes, we must be capable of admitting our own fallibility (that we could be holding false belief), and we must recognize that an other is necessary. In addition, we have to place more confidence (trust) in an other than we do in our own thought and belief (the ones in question). One who cannot do this would meet your criterion for self-deception.
There is something wrong in being self-deceptive, one is doing something one should not be doing. Note that there is a difference between one person being deceitful to another and one person simply deceiving another (magicians deceive people, but when they do so, they are not being deceitful).
What in general that is added to deceptive behaviour in order to make it deceitful is that some social norms of acceptable behaviour are being violated. Self-deception retains from deceitfulness that aspect of its being wrong, and since that is based on social norms it would follow that when one is deceiving oneself it involves going against what others believe one ought to be doing/have done. Solitary self-deception probably makes as little sense as solitary rule following.
...what makes deception deceitful is when it breaks the rules of acceptable/unacceptable behaviour.
Install a real life scenario...
Politicians, in the States at least, are expected to lie to the people about their motives for holding elected office. That is to say, it is a social norm. Lots of Americans hold the view that all politicians lie.
Using the standard you've put forth, politicians that deliberately misrepresent their own thought and belief as a means to convince voters are not being deceitful.
Bullshit.
The fact that some deception is socially acceptable does not change the fact that it is deception. All deception is deceitful. That is precisely what makes it deception.
As for a criterion, let me have a stab at one (there may be others): refusal to engage in a rational process that one is aware exists, that one can engage fully in and where that refusal is motivated by the fact that it may undermine a cherished belief (might need to add that an alternative rational process is engaged in which provides - perhaps superficial - support for the cherished belief).
Rational process can involve putting certain kinds of logic to use. Paraconsistent logic qualifies. Paraconsistent logic holds that a statement can be both true and false at the same time and in the same sense. This logic has the ability to render any statement either true or false.
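For what it's worth, the post above can be made concrete. Here is a minimal sketch of one standard paraconsistent system, Priest's Logic of Paradox (LP); the numeric encoding and function names are my own illustration, not anything from the thread:

```python
# A minimal sketch of Priest's "Logic of Paradox" (LP), one common
# paraconsistent logic. Three truth values ordered F < B < T, where
# a sentence valued B ("Both") counts as true AND false at once.

T, B, F = 2, 1, 0  # True, Both, False

def neg(a):
    return 2 - a          # negation swaps T and F, leaves B fixed

def conj(a, b):
    return min(a, b)      # conjunction takes the lesser value

def designated(a):
    return a >= B         # both T and B count as "true enough"

# Explosion fails: a contradiction need not entail everything.
A, C = B, F               # A is both true and false; C is plain false
premise = conj(A, neg(A)) # A and not-A
print(designated(premise))  # True  -- the contradiction is designated
print(designated(C))        # False -- yet an arbitrary C does not follow
```

The point of the sketch is only the last two lines: with the value Both designated, "A and not-A" can hold without every other sentence following, which is the defining feature of a paraconsistent logic.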
It is humanly impossible to knowingly believe a falsehood.
Right, I think I get what you are saying here. If one says out loud, "I believe X and X is false." there is an obvious contradiction. But folks can get very close: consider the cliche "I'm not a racist but ..." where what follows the 'but' is some obviously racist belief. One can believe things that are contradictory, just as long as one does not notice the contradiction.
But thereafter, I stop agreeing. I might believe I can lift up that rock, and all it takes to change my belief is trying and failing. I don't need anyone else.
I think your 'knowingly believe' is doing too much work. That is to say, I do not know everything I believe, certainly not until I start looking. Consider prejudice. I repudiate prejudicial beliefs, and yet I find on reflection that I act on them. And when a man crosses the void on the bridge, that is stronger evidence that he believes it will support him, than any amount of confession.
It is humanly impossible to knowingly believe a falsehood.
— creativesoul
Right, I think I get what you are saying here. If one says out loud, "I believe X and X is false." there is an obvious contradiction. But folks can get very close: consider the cliche "I'm not a racist but ..." where what follows the 'but' is some obviously racist belief. One can believe things that are contradictory, just as long as one does not notice the contradiction.
But thereafter, I stop agreeing. I might believe I can lift up that rock, and all it takes to change my belief is trying and failing. I don't need anyone else.
I think your 'knowingly believe' is doing too much work.
Point taken. Some false belief can be recognized by the believer without an other. So...
Self-deception is not being able to correct one's mistaken belief.
That is to say, I do not know everything I believe, certainly not until I start looking. Consider prejudice. I repudiate prejudicial beliefs, and yet I find on reflection that I act on them. And when a man crosses the void on the bridge, that is stronger evidence that he believes it will support him, than any amount of confession.
Perhaps. It would require having considered whether or not the bridge would support him at some time or other though, wouldn't it? A lizard crosses the bridge, but that crossing is not strong evidence that it believes that the bridge will support it.
How does this tie into lying to oneself or self-deception?
Perhaps. It would require having considered whether or not the bridge would support him at some time or other though, wouldn't it? A lizard crosses the bridge, but that crossing is not strong evidence that it believes that the bridge will support it.
I don't think belief requires much consideration. Hear the bell, start salivating. I'm not the lizard whisperer, but the folks that I know of, cats, for example, are sometimes unsure whether something will support them or not, and sometimes surprised when it does not. [insert cute video here]
But humans. Humans are largely opaque to themselves. I find I can not know someone's name, even though I know that I know it. It's on the tip of my tongue... And certainly I believe and act upon all sorts of stuff that I never consider, and that the ground will support me, whether it is a bridge or a cutting, is a trivial example. That the nearest shop is right, left, and on the opposite corner at the crossroads... I have never really thought about it 'til now.
This is how I go on when I'm not philosophising, and since I can not know things I know I know, and know things without knowing and believe things I've never considered whether or not to believe them, it becomes really rather easy to deceive myself if I have reason to want to. And one reason I might want to deceive myself that all this is not the case is that I like to consider myself a philosopher, who is much more insightful.
I don't think belief requires much consideration. Hear the bell, start salivating. I'm not the lizard whisperer, but the folks that I know of, cats, for example, are sometimes unsure whether something will support them or not, and sometimes surprised when it does not. [insert cute video here]
Rudimentary belief requires no consideration. Not all belief is rudimentary.
But humans. Humans are largely opaque to themselves. I find I can not know someone's name, even though I know that I know it. It's on the tip of my tongue...
Forgetting and remembering. We either know something or we don't. I find that regularly employing self-contradictory language perpetuates itself.
And certainly I believe and act upon all sorts of stuff that I never consider, and that the ground will support me, whether it is a bridge or a cutting, is a trivial example. That the nearest shop is right, left, and on the opposite corner at the crossroads... I have never really thought about it 'til now. This is how I go on when I'm not philosophising, and since I can not know things I know I know, and know things without knowing and believe things I've never considered whether or not to believe them, it becomes really rather easy to deceive myself if I have reason to want to. And one reason I might want to deceive myself that all this is not the case is that I like to consider myself a philosopher, who is much more insightful.
I can't make much sense of this, which is unsurprising given the inherent self-contradiction.
Metaphysician Undercover, July 23, 2018 at 00:06
So, deceiving oneself is always being mistaken, but not the other way around. The difference between being mistaken and deceiving oneself is that one who is deceiving oneself takes being told that they're mistaken personally, so much so that they are incapable of correcting the mistake. This overly general parsing is good enough for now, I think.
Self deception comes in many forms. If you do what you know that you ought not do, and you have success, so that you later think that perhaps it's ok to do what you did, and now proceed to do this regularly, progressing to the point of having forgotten that you ought not do this, then you have deceived yourself. You have deceived yourself into thinking that it's OK to do what you knew that you ought not do.
So I use a snow blower. I know that when the chute gets plugged with wet snow I ought to shut off the machine before sticking my hand in there, to be sure to avoid injury. However, I realize that if I check the machine to make sure that it's not turning before I stick my hand in there, it's not a problem: I can do this without shutting off the machine and there's no injury. So I deceive myself into believing that I need not shut the machine off before sticking my hand in there, to avoid injury. You might think that this is not deception, that there really is no need to shut the machine off. But one time I mistakenly determined that the machine was not turning, when it really was, and there was injury. So I realized that I had deceived myself into believing that I didn't need to shut off the machine to avoid the possibility of injury.
Lots of Americans hold the view that all politicians lie.
Even those who hold the view that all politicians lie probably do not find it acceptable that they should do so, so the deceiving politician can still be deceitful on my account - your counterexample seems misguided.
Rational process can involve putting certain kinds of logic to use. Para-consistent logic qualifies. Para-consistent logic holds that a statement can be both true and false at the same time and in the same sense. This logic has the ability to render any statement either true or false.
Do you see the problem?
Not really, but then perhaps there isn't a problem to see. First, paraconsistent logics do not render any substantive non-logical statement either true or false; they are just formalisms of different types of logical consequence. Furthermore, so-called true contradictions, even if acceptable within a paraconsistent logic, are limited to a special range of propositions (involving vagueness and the use of the truth predicate, for instance). So the relevance of the existence of paraconsistent logics is unclear to me in the context of self-deception - particularly since on my account the truth or falsity of the belief concerned need not be relevant (as in the example I gave). Of course, if someone who appears to be self-deceiving were suddenly to start justifying their belief by quoting theorems from paraconsistent logics, then perhaps one would have to revisit the claim that they were deceiving themselves, but it would depend what belief they were trying to justify. You would need to give me a fleshed out example for me to see the real problem you are getting at.
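The claim that paraconsistent logics merely change the consequence relation, rather than making substantive statements true or false, can be made concrete. Here is a minimal sketch of my own (the names `lp_valid`, `neg`, `conj` and the numeric encoding are illustrative, not from the thread) using Priest's Logic of Paradox (LP), a standard paraconsistent logic: a contradiction P and not-P no longer entails an arbitrary Q, i.e. "explosion" is invalid, while ordinary inferences survive.

```python
# Sketch of Priest's "Logic of Paradox" (LP), a standard paraconsistent logic.
# Encoding and function names are my own illustration.
from itertools import product

T, B, F = 2, 1, 0            # true, both-true-and-false, false
DESIGNATED = {T, B}          # values that count as the premise "holding"

def neg(a):
    # LP negation swaps true and false, and fixes "both"
    return 2 - a

def conj(a, b):
    # LP conjunction takes the minimum of the two values
    return min(a, b)

def lp_valid(premise, conclusion):
    # Valid iff every valuation that designates the premise
    # also designates the conclusion
    return all(
        conclusion(p, q) in DESIGNATED
        for p, q in product((T, B, F), repeat=2)
        if premise(p, q) in DESIGNATED
    )

# Explosion: from P and not-P, infer Q. Counterexample: P = B, Q = F.
print(lp_valid(lambda p, q: conj(p, neg(p)), lambda p, q: q))  # False

# A harmless inference, P-and-Q therefore P, remains valid.
print(lp_valid(lambda p, q: conj(p, q), lambda p, q: p))       # True
```

So the formalism blocks the derivation of everything from a contradiction, but it assigns no truth value to any particular non-logical claim, which is the point being made above.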
So, deceiving oneself is always being mistaken, but not the other way around.
This might be right, but care needs to be taken to understand where the mistake lies. Deceiving yourself that some proposition P is true (or false) does not require that the mistake be about whether P is true (or false). In the example I gave John's mother believes that John did not murder Jane, and she is not mistaken about that because John really did not murder Jane, yet she is deceiving herself. If there is a role for mistake in that example it is her mistake of not taking the evidence stacked up against John seriously.
This might be right, but care needs to be taken to understand where the mistake lies. Deceiving yourself that some proposition P is true (or false) does not require that the mistake be about whether P is true (or false). In the example I gave John's mother believes that John did not murder Jane, and she is not mistaken about that because John really did not murder Jane, yet she is deceiving herself. If there is a role for mistake in that example it is her mistake of not taking the evidence stacked up against John seriously.
What makes it go from being mistaken to self-deception? For that matter, what makes it either?
Our belief system('mind', if you prefer) works in many ways that are autonomous. We've all experienced a gestalt, I imagine. We've all had something on the tip of our tongue. We've all remembered something 'out of the blue'. We've all forgotten things that we're sure we used to be able to remember, we used to know. We've all used poor reasoning at times. We've all been guilty of confirmation bias, cognitive dissonance, etc. Why call these things "self-deception"?
When we talk about deception, particularly when we talk about someone deceiving an other, there are elements which make it what it is. Those elements are non-existent in all the sensible, reasonable, and/or coherent examples of self-deception put forth. What is the ground for changing this criterion? It's been shown to lead to self-contradiction, incoherency, equivocation, or just plain nonsensical talk.
We can all imagine a stick insect, or a moth, or any other type of camouflaged critter. Our choices are saying that it is deceiving its predators (and changing the criterion or equivocating), or simply refraining from saying that it is deceiving its predators (it has no intent after all) and beginning to talk about it in better ways.
Squirrels...
Now either they are intentionally deceiving others, by pretending to hide nuts when others are watching and they know it, or they do this by pure accident and the behaviour itself increases the likelihood of their survival, so it has been a trait/behaviour that has proven beneficial, and hence has transcended the individual squirrels.
What makes it go from being mistaken to self-deception? For that matter, what makes it either?
Your opinion?
I mean, she was right after all.
Well, it certainly is my opinion that John's mother is deceiving herself, and the mistake she is making is not taking seriously evidence that ought to be taken seriously. You seem not to share that opinion, so perhaps - as in many cases of philosophical argument - we've arrived at a clash of intuitions. Of course, you might try to push the line that John's mother is not deceiving herself concerning the belief that John murdered Jane, but rather the belief that John could have murdered Jane, and then start giving some counter-factual analysis of self-deception in terms of possible worlds - for all I know David Lewis has already done this. So in the end, perhaps the truth or falsity of the belief involved in self-deception might be argued to be an important concern.
When we talk about deception, particularly when we talk about someone deceiving an other, there are elements which make it what it is
However, even if my intuition about truth or falsity being a side issue in self-deception is misguided, I still insist that self-deception is not correctly modelled along the lines of one person deceiving another (although it would not be too hard to think of an example of one person deceiving another into believing a truth). John's mother is doing something wrong, she is making a mistake - ignoring evidence - that she, as a rational person, ought not to have made. Self-deception, in this sense, is as much (if not more) a moral issue as it is a metaphysical one.
Well, it certainly is my opinion that John's mother is deceiving herself, and the mistake she is making is not taking seriously evidence that ought to be taken seriously...
So what does it take for her to deceive herself?
You saying so, or something that has nothing at all to do with you? I say, if the notion of self-deception is to be worth something, it has to be the latter.
She was right.
I think your example highlights the reasons why we ought carefully consider the sorts of evidence that are presented. You claim she didn't take it seriously enough, but it seems to me that you would've wrongfully convicted someone, and yet say that she was deceiving herself when she would've gotten it right.
:worry:
Doesn't what you've done here fit your own criterion of self-deception? It seems to me that it does, although on my view you're just simply mistaken(assuming sincerity).
Reply to creativesoul @jkg20 is arguing the same as I did that self-deception is a violation of our norms of rationality, often related to the treatment of evidence, sometimes related to inference or other elements of reasoning.
What you're still missing is that reasoning is a process without a pre-selected outcome. You can choose to futz with the process in various ways, and this is quite intentional, but it needn't ever lead to direct confrontation with the outcome -- that can be endlessly pushed aside and never arrived at. So there's no issue of at once assenting to and not assenting to some proposition. You just make sure that proposition never makes it to the floor for a vote. You do this deliberately. You find ways to rationalize doing it, reasons that have nothing to do with your real motivation, reasons that allow you to give what you're doing the color of rationality. No doubt when confronted you will be able to defend yourself at length and explain how every step you took or didn't take was thoroughly justified.
The natural competitor for describing such a process is simply "being mistaken". But this sort of thing doesn't look much like being mistaken to me.
That's an interesting take. Of course I'm reminded of conventional American politics. I'm not sure that I'd call what Senator Mitch McConnell does and what John Boehner did (which you've just described perfectly) either being mistaken or self-deception.
Reply to creativesoul
It's very hard to judge which politicians are lying to themselves and which are soul-less tools.
I'd rather not do more politics, but I wholeheartedly recommend this excellent piece of popular philosophy by the estimable John Scalzi: The Cinemax Theory of Racism. I think it's on point.
You find ways to rationalize doing it, reasons that have nothing to do with your real motivation, reasons that allow you to give what you're doing the color of rationality.
As I've described self-deception, rationalizing is one step in the process. It is the cover-up. This is when one knows that what one is doing or saying is wrong, but the individual produces a rationalization to make it appear as if it were right. Creativesoul insists that the only people who could be tricked, or fooled, by the cover-up are people other than the one producing the rationalization. But this is not the case, because all that is required is that the person who produces the rationalization becomes so focused on remembering and describing the cover-up, and so enamoured with the cover-up, that they forget the thing being covered up.
What creativesoul doesn't seem to allow for is the fact that self-deception is a process which takes time. So creativesoul presents a logical argument in which a person cannot both believe and disbelieve the same thing at the same time, and dismisses self-deception as impossible. But this does not properly represent the temporal process of self-deception in which one belief replaces another over a period of time. The rationalization, or reason for replacement, is known to be unjustified when introduced, but the process of repetition and attention to the details of the rationalization, ends with the fact that the rationalization is unjustified, having been forgotten.
The point to notice is that memory requires mental effort, it is not automatic. Therefore, as we say, memory is selective. So when we do not choose to remember certain things they are forgotten and this is what makes self-deception possible.
What must be the case in order to successfully lie to yourself?
Simple enough question. But hard to answer.
If by "successfully" you mean "genuinely", then I don't think it's possible to lie to oneself. You either believe A about X in situation Z or you don't. You can try to persuade or convince yourself to believe B about Y in situation Z even though you still genuinely believe A about X in situation Z. You can try to avoid thinking about or acknowledging a belief, or deny it, but that's not the same as lying.
Lying involves holding a view about X as true but presenting it as false.
half the people i lie to don't believe me either.
but i'm still lying to them - intentionally.
so are you really asking how to lie to yourself and believe it?
i don't think you can - intentionally.
i think when people say "you are lying to yourself" the case is actually that you are mistaken in your perception of the subject and genuinely believe the wrong concept.
they can see the truth so clearly that it stands to their reason that you should see it equally as clear, but since you cannot it appears that you are "lying to yourself."
a lie is simply something that is false. which means it's not correct.
if you form a believable opinion, based on inaccurate information, you are basically "lying to yourself" by believing false data.
I suppose it's possible that one continue offering the same lie about themselves to others so often that they themselves begin to believe the lie. Unintentionally. Deception requires intent. Being wrong is accidental. Being wrong about oneself is accidental.
So, there's two basic notions to consider with self-deception. The first is what is the or a 'self'. The second is what counts as deception.
There is no self without others. What one comes to think about oneself comes largely via language acquisition. We learn how to talk and thus think about the world and/or ourselves through language use. We adopt our first world-view in this way. However, it can be the case that one's innate personality/character is different in remarkable ways than what s/he has learned is acceptable within and to their immediate familial and/or social surroundings.
For example, one may be attracted to the same sex while being raised in a social setting where such a thing is condemned. All humans are interdependent social creatures by our very nature, and the need to feel accepted and/or fit into a larger peer group is evidently a strong one. If one's social group does not accept homosexuality, then one who is attracted to members of the same sex has to suppress any and all behaviours that may cause others to view them in a negative way.
Here, if the self is determined by others' moral values/beliefs about how things ought be, then one who would otherwise be prone to engage in homosexual behaviours and thoughts must alter the way they act and talk in order to conform and be accepted.
But is that who they are or who others think that they should be?
If one within such a situation were to intentionally suppress any and all homosexual thought as a means of self-discipline with the goal of fitting in, then their notion of self would be in conflict with who they would otherwise have been.
Sorry for the time delay on not tending the thread. I'm glad to see the discussion continue, though. There was the weekend, and family, and other things besides philosophy. But I'm back now.
I don't even know what "ruling it out a priori" is supposed to mean. If it is impossible for one to deliberately misrepresent their own thought and belief to oneself, then any and all arguments which assume or validly conclude that are themselves based upon at least one false premiss.
I'd say that "ruling it out a priori" means that you are ruling out the possibility that our minds are divided by means of some conceptual analysis of the concept of lying, or by declaring it to be impossible. Maybe you're not, but I'm not sure why it is impossible to deliberately misrepresent one's own belief to oneself.
I am fine with your notion of lying. So lying, rather than merely being mistaken, is when you deliberately misrepresent your own belief to yourself. Merely being mistaken is holding a false belief. Since falsity isn't in the notion of lying the two don't even have to relate. We may deliberately misrepresent some true or false belief to ourselves, just depending upon what we believe. By removing truth, in fact, there is a lot more wiggle room here -- the beliefs need not even have a factual component (EDIT: Or even be truth-apt). They merely need to be misrepresented to ourselves.
And such a thing would be possible -- conceptually speaking, here -- if the mind were in some sense divided. So let's just stick with @unenlightened's notion of commitment. I am committed to some belief. I come to believe something that is in conflict with this other belief. Here I can be honest with myself, realize that these two beliefs are not compatible, and try and think through that conflict and resolve it in some way. Or I can be dishonest with myself, act out of fear, and tell myself that the beliefs are not in conflict. However I might accomplish this -- it seems that this dishonesty is really what lying to yourself is all about. You aren't coming to terms with a conflict in beliefs, but rather accepting both beliefs in spite of having the niggling realization that they are in conflict. So you misrepresent your beliefs -- or meta-beliefs? -- by saying they can get along fine. Your commitment and your new belief that said commitment is somehow erroneous (not necessarily false) and your belief that they are not in conflict are all somehow simultaneously preserved. It seems a mental feat which would result in conflict of the self, and indeed I'd say that this is the case -- which really only makes sense if different parts of the self can actually be in conflict, which is easily understood if the mind is divided.
so are you really asking how to lie to yourself and believe it?
Bingo. Well, not exactly how I, personally, might do so -- I'm not after a step-by-step guide to lying to myself. But rather what would necessarily be true if it were possible to lie to yourself. So I'm not really assuming that it is possible to lie to yourself, even. I'm more interested in a conceptual analysis of lying to yourself -- exploring what is necessary under the assumption that it is true.
The benefit being that by so doing it might lead to a way of determining whether it is or is not true, but without simply assuming one way or the other.
All thought and belief is meaningful and presupposes truth. Mind consists in part at least(entirely on my view) of thought and belief(mental correlations). That's how truth and meaning belong here as well.
One cannot be tricked into believing something if they know both how they're being tricked, and that they're being tricked.
One who is performing the trickery knows both how and that they're doing it.
One cannot know how and that one is tricking him/herself and not know how and that one is tricking oneself(how and that it's being done).
The same applies to deliberately misrepresenting one's own thought and belief to oneself. It's just plain common sense. It's not at all difficult to grasp.
I am fine with your notion of lying. So lying, rather than merely being mistaken, is when you deliberately misrepresent your own belief to yourself. Merely being mistaken is holding a false belief. Since falsity isn't in the notion of lying the two don't even have to relate.
This is incorrect. Truth/falsity is in the notion of lying. It's just that the lie (what's being said, as opposed/compared to what's believed) can be either. Even on the face of it, it's more than obvious that being mistaken and lying are related. They both require belief.
We may deliberately misrepresent some true or false belief to ourselves, just depending upon what we believe.
That's what I'm rejecting, and have argued for without subsequent refutation. On my view, just saying it isn't enough. Can you argue for this?
By removing truth, in fact, there is a lot more wiggle room here -- the beliefs need not even have a factual component (EDIT: Or even be truth-apt). They merely need to be misrepresented to ourselves.
Removing truth from the notion of thought and belief? Cannot be done. All thought and belief presupposes its own truth somewhere along the line. The notion of being "truth-apt" is misleading at best. All thought and belief can be either. It's only because some are inadequately represented in speech that it seems like some are not.
I am committed to some belief. I come to believe something that is in conflict with this other belief. Here I can be honest with myself, realize that these two beliefs are not compatible, and try and think through that conflict and resolve it in some way. Or I can be dishonest with myself, act out of fear, and tell myself that the beliefs are not in conflict.
If you know that they are in conflict, then you cannot believe that they are not.
However I might accomplish this -- it seems that this dishonesty is really what lying to yourself is all about. You aren't coming to terms with a conflict in beliefs, but rather accepting both beliefs in spite of having the niggling realization that they are in conflict. So you misrepresent your beliefs -- or meta-beliefs? -- by saying they can get along fine. Your commitment and your new belief that said commitment is somehow erroneous (not necessarily false) and your belief that they are not in conflict are all somehow simultaneously preserved. It seems a mental feat which would result in conflict of the self, and indeed I'd say that this is the case -- which really only makes sense if different parts of the self can actually be in conflict, which is easily understood if the mind is divided.
The mind is divided. However, it is still one mind. It is divided in terms of having/holding conflicting beliefs. Your example is one of cognitive dissonance being ignored. Very common practice hereabouts and everywhere I've ever been.
Lying is dishonesty/insincerity. Being insincere is precisely what one is doing when being dishonest (when lying). Both point towards what's going on when one is deliberately misrepresenting one's own thought and belief. So, there's no difference on my view between 'being dishonest with oneself' and 'lying to oneself'. Both are poor uses of language stemming from misconceptions. They hamper understanding.
One cannot be tricked into believing something if they know both how they're being tricked, and that they're being tricked.
One who is performing the trickery knows both how and that they're doing it.
One cannot know how and that one is tricking him/herself and not know how and that one is tricking oneself(how and that it's being done).
The same applies to deliberately misrepresenting one's own thought and belief to oneself. It's just plain common sense. It's not at all difficult to grasp.
The mind is divided. However, it is still one mind. It is divided in terms of having/holding conflicting beliefs. Your example is one of cognitive dissonance being ignored. Very common practice hereabouts and everywhere I've ever been.
So if we can have or hold conflicting beliefs -- ignore cognitive dissonance, as you put it -- then we can both know that two beliefs are in conflict, and believe they are not in conflict. Because both of those beliefs, too, are in conflict, yet we can hold conflicting beliefs, so.... what's the problem?
It goes against common sense. But here it seems you're admitting that common sense is wrong?
Removing truth from the notion of thought and belief? Cannot be done.
I feel that's irritating.
"I feel that's irritating" is true. But is the feeling of irritation true? No. But it is a part of the mind. So if the entire mind is belief, then surely there are non-cognitive beliefs.
So if we can have or hold conflicting beliefs -- ignore cognitive dissonance, as you put it -- then we can both know that two beliefs are in conflict, and believe they are not in conflict. Because both of those beliefs, too, are in conflict, yet we can hold conflicting beliefs, so.... what's the problem?
It goes against common sense. But here it seems you're admitting that common sense is wrong?
Ignoring the self-contradiction is not believing there is none. It's neglecting to address it, or believing it's unimportant.
Reply to creativesoul But if we can be in self-contradiction, then we can also be in self-contradiction about our beliefs. So we might just ignore it, which is something like what I believe @jkg20 is saying. But we can also form a further belief, a belief that the two are not in self-contradiction. So we can believe that "A and B do not contradict" as well as believe that "A and B do contradict" -- since we can believe contradictory things.
"I feel that's irritating" is true. But is the feeling of irritation true? No. But it is a part of the mind. So if the entire mind is belief, then surely there are non-cognitive beliefs.
All thought and belief consist of correlations drawn between 'objects' of physiological sensory perception and/or oneself. The 'feeling' of irritation is no different. The statement sets out the connection to oneself(the irritation) and the 'object'(that).
But if we can be in self-contradiction, then we can also be in self-contradiction about our beliefs. So we might just ignore it, which is something like what I believe jkg20 is saying. But we can also form a further belief, a belief that the two are not in self-contradiction. So we can believe that "A and B do not contradict" as well as believe that "A and B do contradict" -- since we can believe contradictory things.
We can believe contradictory things. We cannot acknowledge that they are and believe that they are not in the same breath.
Reply to creativesoul Sure. I can go with that. That's why I thought a dimension of time was necessary, as well as some way of explaining how we shift from one part of the mind to another -- like having an awareness that shifts.
That's normal. If one holds contradictory belief, they usually do not face one another in one's thought. That usually takes an other. Always actually, but that would get into how thinking about thought and belief requires language.
Why call this lying to oneself if it shares nothing with lying to another? If you're going with my notion of what counts as lying, it just doesn't work...
One's beliefs change over time. This can certainly result in having contradictory beliefs that weren't there before. We don't think about all our beliefs at the same time. It(maintaining coherency) takes a lot of work when one begins to look at the world differently...
Let's consider the philosopher's friend, phantom limb pain. It seems I cannot be deceived or mistaken about my own pain, because it is experiential. But then I find that the leg I do not have hurts. Real pain, phantom leg. So neurobabble tells us that it is due to activity in a region of the brain that has a 'body-image' to which sensations are referred, and the leg image goes a bit mad from sensory deprivation. (They don't say that, I'm paraphrasing.)
There seems to be a certain sleight of leg going on - one might say that I am deceived by my nervous system into thinking I have a hurty leg, when I do not have a hurty leg. But then I have separated me, my body and my nervous system, and allowed one to deceive another, albeit unintentionally.
Or consider an anorexic, who considers herself grossly overweight as she is dying of starvation.
Or the poor philosopher, who reasons thus:
Men like football.
I am a man.
Therefore I like football.
Which is an example of conforming to an image. Is it impossible to convince oneself that one likes football, or gurls, or shaving, or fighting, because one more desperately wants to conform than to be 'true to oneself'? Surely, it happens all the time?
Reply to creativesoul Because you can intentionally tell yourself a lie, and then become unaware of said action. I'd say I agree with @unenlightened's examples above -- we can have an image we want to conform to, realize we are not like the image, and then tell ourselves "But really, deep down inside, I am like that image" and then have our awareness flip such that we are no longer aware that we intentionally deceived ourselves.
lie to your self ...we do it all the time , we compromise ethics , we convince ourselves it's their fault , you assume they don't like you .... all forms of lying to your self
now ... can you lie to YOU .... no .... deep down if the mind quiets there is never a single doubt as to action needed , so to pretend we do not know something that is internal is a lie in itself .....
the answer to your question depends which you we are speaking about , the self image/ego .... or the consciousness capable of observing thought without judgment .... one yes , the other one no
Which is an example of conforming to an image. Is it impossible to convince oneself that one likes football, or gurls, or shaving, or fighting, because one more desperately wants to conform than to be 'true to oneself'? Surely, it happens all the time?
Surely. I'm confident that neither 'self-deception' nor 'lying to oneself' is the best way to describe these situations though. I mentioned one earlier... the homosexual. We all adopt our initial worldview, and that includes much, if not most, of our own original 'self' image.
Here we see the inherent problem with the notion of self. I'm certain we all agree here. The self is largely delineated by others.
Because you can intentionally tell yourself a lie, and then become unaware of said action. I'd say I agree with unenlightened's examples above -- we can have an image we want to conform to, realize we are not like the image, and then tell ourselves "But really, deep down inside, I am like that image" and then have our awareness flip such that we are no longer aware that we intentionally deceived ourselves.
This is presupposing exactly what needs to be argued for. Care to address the arguments I've given for how and why one cannot deceive oneself and one cannot lie to oneself? I think I've directly addressed all you've said here. I certainly intended to.
Reply to creativesoul
Your argument appears to be:
1: Self-deception only makes sense if it makes sense for one person A to act deceitfully towards a person B, where A and B are the same person.
2: It does not make sense for one person A to act deceitfully towards person B, where A and B are the same person.
3: Therefore, modus tollens, self-deception does not make sense.
To support (2), the model of A being deceitful towards B regarding some proposition P goes along the following lines:
a) A knowingly believes that not-P.
b) B's own interests are best served by knowingly believing that not-P.
c) A knowingly believes that his (A's) intentions/wishes are best served by having B knowingly believe that P.
d) A acts intentionally in order that B should knowingly believe that P.
So we end up, if the deceitful behaviour is successful, with A knowingly believing that not-P and B knowingly believing that P at one and the same time. This is not possible where A and B are one and the same person.
(Note that we are talking about A being deceitful to B, not A merely deceiving B.)
That seems right, and so (2) is a true premise.
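The argument above can be compressed into symbols (my own shorthand, not the poster's: $B_x(Q)$ for "$x$ knowingly believes $Q$"):

```latex
% S: "self-deception makes sense"; D: "A can be deceitful toward B where A = B"
\[
(1)\; S \rightarrow D \qquad (2)\; \neg D \qquad \therefore\; (3)\; \neg S
\quad \text{(modus tollens)}
\]
% Support for (2): successful deceit about P requires, at one and the same time,
\[
B_A(\neg P) \wedge B_B(P), \qquad \text{so with } A = B:\qquad
B_A(\neg P) \wedge B_A(P),
\]
% a belief state the model rules out, hence no A can be deceitful toward itself.
```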
But this leaves conditional (1) undefended and @Srap Tasmaner and I have been suggesting that self-deception should not be modelled on one person being deceitful to another. That part of the argument you have not yet established. As far as I recall, it came down to intuitions about what you/I/Srap would or would not call cases of self-deception.
Being wrong is not equivalent to lying. Being wrong about oneself is not equivalent to lying to oneself.
The notion of 'self'-deception is nonsense. I've already adequately argued for that without subsequent valid criticism.
Have you? It seems to me that you've just declared it nonsense. (EDIT: I should note here I believe you're sincere, I'm just telling you my impression is all) You do so on the basis of saying that we can classify any intentional act of misrepresenting belief to oneself as something other than lying, because we haven't given the necessary and sufficient conditions that are up to your standard.
But is that an argument? We have given criteria that mark off simple error from self-deception. You've just said "OK, sure that's necessary, but not good enough" -- but then we do in fact have a means for distinguishing the two, and so what exactly is nonsensical here?
we can have an image we want to conform to, realize we are not like the image, and then tell ourselves "But really, deep down inside, I am like that image" and then have our awareness flip such that we are no longer aware that we intentionally deceived ourselves.
I don't think it works quite like that, in most cases. One needs a bit of psychology here. There is 'what I am', and there is 'what I think I am' (my self-image), and the latter is an aspect of the former. But inevitably, I think that what I am is what I think I am. So self-preservation becomes a matter of preserving the image.
Suppose I look at myself from a position of ignorance. It comes naturally, from this realisation that I am not who I think I am. Then I see there is the self-image I have, but I give it less importance, because it is incomplete at best. So I am ready to discover myself anew. Perhaps, after all I am not the wise philosopher I think I am; perhaps I am not the nice balanced social being I think I am. I will find out as I go - I will learn about myself in my relationship to the world, but it will always be learning, never knowing. This is too frightening for me as long as I still think I am what I think I am, and it seems that to change my image is to die.
I'm confident that neither 'self-deception' nor 'lying to oneself' is the best way to describe these situations though. I mentioned one earlier... the homosexual. We all adopt our initial worldview, and that includes much, if not most, of our own original 'self' image.
Here we see the inherent problem with the notion of self. I'm certain we all agree here. The self is largely delineated by others.
Well I'm not committed to a particular way of describing things, but when people use these terms, I think I know what is meant.
"You are naughty! Be a good boy for Mummy!"
On the one hand one discovers oneself in relationship, one is learning, and on the other, one is told what one is and what one must be. One must be good because one is naughty. And Santa will know which you are. Most people are naughty and being good, taught to live a lie in negation of the lie they have been taught.
I don't think it works quite like that, in most cases. One needs a bit of psychology here. There is 'what I am', and there is 'what I think I am' (my self-image), and the latter is an aspect of the former. But inevitably, I think that what I am is what I think I am. So self-preservation becomes a matter of preserving the image.
That's fair. I can get along with that -- I haven't really been thinking in terms of plausibility, psychology, or facts as much as just getting a basic and easy to recognize theory of lying-to-oneself across to @creativesoul
Suppose I look at myself from a position of ignorance. It comes naturally, from this realisation that I am not who I think I am. Then I see there is the self-image I have, but I give it less importance, because it is incomplete at best. So I am ready to discover myself anew. Perhaps, after all I am not the wise philosopher I think I am; perhaps I am not the nice balanced social being I think I am. I will find out as I go - I will learn about myself in my relationship to the world, but it will always be learning, never knowing. This is too frightening for me as long as I still think I am what I think I am, and it seems that to change my image is to die.
So for yourself 'lying to yourself' is much more subtle, really. It's almost like an approach to the world and the self -- whereas in one case we must be something we are not, or we believe we are this exact thing and it's a hill to die on, and in the other case we recognize that we are not this set of beliefs about ourself and are open to learning more -- it is exciting to change the image in the face of new information, rather than a death-threat.
it is exciting to change the image in the face of new information, rather than a death-threat.
Yes, exactly, learning without accumulation, one could say.
[quote=J.Krishnamurti] Learning through experience is one thing – it is the accumulation of conditioning – and learning all the time, not only about objective things but also about oneself, is something quite different. There is the accumulation which brings about conditioning – this we know – and there is the learning which we speak about. This learning is observation – to observe without accumulation, to observe in freedom. This observation is not directed from the past. Let us keep those two things clear.[/quote] https://www.jkrishnamurti.org/content/‘learning’/learning%20without%20accumulation
So my image is from the past, and the image does not observe; I observe in the present, unidentified with the image.
Metaphysician Undercover, July 28, 2018 at 12:10
On the one hand one discovers oneself in relationship, one is learning, and on the other, one is told what one is and what one must be. One must be good because one is naughty. And Santa will know which you are. Most people are naughty and being good, taught to live a lie in negation of the lie they have been taught.
Here we go ...the divided self. Can you describe or explain how the divided self, divided in this way, between what I want to do, and what I ought to do (mummy tells me so), relates to your description of that other division between "what I am" and "what I think I am"?
I would assume that what I am relates to what I want to do, and what I think I am relates to what I ought to do. But you say that what I think I am is an aspect of what I am, so how would what I ought to do become an aspect of what I am when it's only related to what I think I am, and separate from it? See, I'm divided because I perceive what I ought to do as an aspect of others, "mummy told me not to do this", and not really an aspect of myself at all. Maybe it's a third person perspective. But then it's impossible that what I ought to do can be an aspect of what I think I am, unless others are somehow controlling my thoughts. How would "what I ought to do" become a real aspect of "what I think I am"? It would seem like it could only be an aspect of "what I am".
Would you say that "what I am" is itself a deception, that there is only what I think I am, and what others think I am? There really is no self, only an image. If not, then what supports the assumption that there is such a thing as what I am? Is it necessary to assume a "what I am", in order to produce a divided self, to expose the possibility of self-deception, which is really an attribute of the one undivided "what I think I am"?
Attempting to find out what would be a bit more congenial to your tastes, @creativesoul --
I think awareness through time is doing most of the work in making the concept of lying to oneself coherent, for me. Just as I can flip my awareness in a moment from the thoughts I am having to my fingers, to my memories, to my feelings it seems to me that a flip in awareness could happen from two halves of myself. So where I do agree with you is that the part of myself that is lying could not misrepresent their own thoughts and trick themselves -- there is a need for some kind of a division for trickery to be successful on this model, because you have to be aware of the trick if you're setting out to trick someone. Like a three card monte player knows how to replace a card without someone observing, they couldn't do so to themselves.
So I'm tracking with you on that. For me the flip in awareness is what's important -- so at one point we are aware of the trick, and at the other point we are not. For something like three card monte, where we have concrete points of reference in our literal hands this would be pretty extreme, though maybe possible. But for something a bit more abstract, like knowledge of myself, it doesn't seem so extreme to me because we aren't perfectly transparent to ourselves.
Since we aren't perfectly transparent to ourselves it actually becomes rather easy to lie to ourselves because the trick lies in what is actually a very plausible belief: "I am not transparent to myself" -- so if I come across something that I'd term inconvenient for myself, all I need do is remind myself that I am not transparent to myself and suddenly what was inconvenient becomes questionable.
That's why it makes sense for me, at least. Where in this line of reasoning does something just balk as unacceptable to you? My guess is you'd just say this is not lying. But if I both believe P and ~P -- because I did, after all, come across something inconvenient -- then that seems to fit perfectly with the notion of lying, or tricking myself. In fact it seems like in order for me to intentionally trick myself I would have to believe both, since to be intentional about the lie I'd have to believe P and want myself to believe ~P, then convince myself of ~P -- without changing the original belief.
Whereas to be mistaken would just be to believe something that is false, or to believe something that is true but for bad reasons.
Would you say that "what I am" is itself a deception, that there is only what I think I am, and what others think I am? There really is no self, only an image. If not, then what supports the assumption that there is such a thing as what I am? Is it necessary to assume a "what I am", in order to produce a divided self, to expose the possibility of self-deception, which is really an attribute of the one undivided "what I think I am"?
Well I'm only talking, and only even talking about thought. Physically, there is no division, obviously. So there is a division in thought, and a mental conflict. Or perhaps there isn't in your case, it's for you to say.
'What I am' is writing a response - this one - (dasein?). This is real enough, I don't have to assume anything. And then you want a response that explains and justifies the writer in some way, and that is the image I am conveying in the writing that is not the writer, but an image of him. I don't see why you want to problematise this?
Metaphysician Undercover, July 28, 2018 at 16:54
'What I am' is writing a response - this one - (dasein?). This is real enough, I don't have to assume anything. And then you want a response that explains and justifies the writer in some way, and that is the image I am conveying in the writing that is not the writer, but an image of him. I don't see why you want to problematise this?
What I was asking is how do you relate this to the division between what I want to do, and what I ought to do. It's all thought, as you say, but suppose I am writing this response because I want to, but I am thinking that I ought not to be, because I have other responsibilities which I should be taking care of right now, instead of wasting my time doing this.
So in your response, you indicate that "what I am" is writing this response, despite the fact that I ought not to be writing this response. How do I get to the point where I can produce consistency between what I want to do, and what I ought to do, such that what I am is the same as what I ought to be, because I would be doing what I ought to be doing? Otherwise I see no reason to do what I ought to do, because it's simply not me.
How do I get to the point where I can produce consistency between what I want to do, and what I ought to do, such that what I am is the same as what I ought to be, because I would be doing what I ought to be doing?
You'll have to ask Jesus about that one, dude. All I can point out is that 'what I want' and 'what I ought' are images that often conflict, and 'what I am' is what happens as a matter of fact.
Metaphysician Undercover, July 28, 2018 at 20:24
Your argument appears to be:
1: Self-deception only makes sense if it makes sense for one person A to act deceitfully towards a person B, where A and B are the same person.
2: It does not make sense for one person A to act deceitfully towards person B, where A and B are the same person.
3: Therefore, modus tollens, self-deception does not make sense.
To support (2), the model of A being deceitful towards B regarding some proposition P goes along the following lines:
a) A knowingly believes that not-P.
b) B's own interests are best served by knowingly believing that not-P.
c) A knowingly believes that his (A's) intentions/wishes are best served by having B knowingly believe that P.
d) A acts intentionally in order that B should knowingly believe that P.
So we end up, if the deceitful behaviour is successful, with A knowingly believing that not-P and B knowingly believing that P at one and the same time. This is not possible where A and B are one and the same person.
(Note that we are talking about A being deceitful to B, not A merely deceiving B.)
That seems right, and so (2) is a true premise.
But this leaves conditional (1) undefended and Srap Tasmaner and I have been suggesting that self-deception should not be modelled on one person being deceitful to another. That part of the argument you have not yet established. As far as I recall, it came down to intuitions about what you/I/Srap would or would not call cases of self-deception.
That's close, but not quite the argument...
Lying to another is deliberately misrepresenting one's own thought and belief to another. Lying to oneself would be deliberately misrepresenting one's own thought and belief to oneself. One always knows if s/he believes something, doesn't believe something, or are uncertain whether or not s/he believes something. Thus, one cannot deliberately misrepresent one's own thought and belief to oneself.
Strictly speaking, I shouldn't say 'nonsense' if I mean incoherent, unintelligible, or self-contradictory. Whether or not something is sensible(in this context) isn't determined by being coherent. It's determined by common use. The notion of self-deception and lying to oneself is often used in common talk.
Attempting to find out what would be a bit more congenial to your tastes, creativesoul --
I think awareness through time is doing most of the work in making the concept of lying to oneself coherent, for me. Just as I can flip my awareness in a moment from the thoughts I am having to my fingers, to my memories, to my feelings it seems to me that a flip in awareness could happen from two halves of myself. So where I do agree with you is that the part of myself that is lying could not misrepresent their own thoughts and trick themselves -- there is a need for some kind of a division for trickery to be successful on this model, because you have to be aware of the trick if you're setting out to trick someone. Like a three card monte player knows how to replace a card without someone observing, they couldn't do so to themselves.
So I'm tracking with you on that. For me the flip in awareness is what's important -- so at one point we are aware of the trick, and at the other point we are not. For something like three card monte, where we have concrete points of reference in our literal hands this would be pretty extreme, though maybe possible. But for something a bit more abstract, like knowledge of myself, it doesn't seem so extreme to me because we aren't perfectly transparent to ourselves.
Since we aren't perfectly transparent to ourselves it actually becomes rather easy to lie to ourselves because the trick lies in what is actually a very plausible belief: "I am not transparent to myself" -- so if I come across something that I'd term inconvenient for myself, all I need do is remind myself that I am not transparent to myself and suddenly what was inconvenient becomes questionable.
That's why it makes sense for me, at least. Where in this line of reasoning does something just balk as unacceptable to you? My guess is you'd just say this is not lying. But if I both believe P and ~P -- because I did, after all, come across something inconvenient -- then that seems to fit perfectly with the notion of lying, or tricking myself. In fact it seems like in order for me to intentionally trick myself I would have to believe both, since to be intentional about the lie I'd have to believe P and want myself to believe ~P, then convince myself of ~P -- without changing the original belief.
Whereas to be mistaken would just be to believe something that is false, or to believe something that is true but for bad reasons.
It seems that this hinges upon the notion of being transparent to ourselves.
On my view, one always knows whether or not they believe what they say. That is, one always knows whether they believe it, do not believe it, or are uncertain.
Seems to me that you've offered an account of one developing contradictory beliefs.
So, it's basically the idea that the same person acts differently in different situations. This revolves around ethics. That is to say that how one acts in some situation or other, assuming the person has some notion of appropriate behaviour, ought be situation specific. We decide what counts as acceptable/unacceptable behaviour in different situations, and we hold others to that standard. I mean, we all know that some of what we do in our own homes, we would not do in public. In school, there are codes of conduct. At work, at the movies, in the theatre, in a restaurant, etc; all these places have slightly different codes of acceptable/unacceptable behaviour. So, we have different criteria and thus differing expectations for behaviour depending upon the situation we're in...
That's the groundwork. I want to tie this into the conversation here...
If it is the case that someone strongly believes that they must consciously alter their behavior as a means to conform to what they believe is expected, this could set up a situation that fulfills - I think - some of the notions expressed in this thread regarding a case of lying to oneself, and/or deceiving oneself.
When one acts in ways that satisfy what one thinks that others expect, and they do this intentionally and deliberately, then they think that that's how they ought act. If those ways include saying things that they do not believe, but rather they say them because they think that it's the best thing to say at the time because of the situation they're in, then we have everything needed for one to lose sight of what they actually believe...
Now, I am not saying that this must be the case or always is the case when one 'wears different hats', but rather that it has what it takes for one to lose sight of oneself, over a significant enough period of time.
What must be the case in order to successfully lie to yourself?
A lie needs a truth or unanswered question. If we consider the former then one can't possibly lie to oneself. If the latter then ignorance can always be replaced with a lie at random.
In fact I think the whole process of human thinking, the so-called hypothetico-deductive method, consists of entertaining possible explanations for a phenomenon and eliminating the lies to get to the truth. Looks like we do lie to ourselves but not for long IF we're rational.
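The elimination step described there can be sketched as a toy procedure (the names, hypotheses, and data below are illustrative only, not anyone's actual method): candidate explanations are kept only while they survive the evidence, and the flattering "lie" is the first to go.

```python
# Toy sketch of hypothetico-deductive elimination: start with candidate
# explanations (some of them "lies"), discard any that a new observation
# contradicts, and whatever survives is the best remaining account.

def eliminate(hypotheses, observations):
    """Keep only the hypotheses consistent with every observation."""
    surviving = list(hypotheses)
    for obs in observations:
        surviving = [h for h in surviving if h["predicts"](obs)]
    return surviving

# Candidate self-descriptions, each with a crude predictive test.
hypotheses = [
    {"name": "greatest athlete", "predicts": lambda obs: obs["wins"]},
    {"name": "decent athlete",   "predicts": lambda obs: obs["finishes"]},
]

# One inconvenient observation is enough to eliminate the flattering lie.
observations = [{"wins": False, "finishes": True}]

print([h["name"] for h in eliminate(hypotheses, observations)])
# → ['decent athlete']
```

The "but not for long IF we're rational" clause corresponds to actually running the loop: irrationality, on this toy picture, would be refusing to feed in the inconvenient observation.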
Victoria Nova, September 23, 2018 at 18:38
People tend to stick to ego rather than logic or sense. Example: a woman learned, accidentally, of an ethnic tradition of her in-law family to give relatives anything they like in her house. She's appalled that her in-law relatives used this tradition to ask for her alarm clock and take it. She later on gives away to strangers the only beautiful beads she had, out of spitefulness: let it not be her relatives who, she assumed, would ask for her beads as well; she won't have them anyway. Her ego erased any logic or real sense. Building relationships with relatives did not take an important place in her mind, but rather the need to be revengeful.
Pattern-chaser, September 24, 2018 at 15:15
"Self-deception" is a description applied by someone else, yes? It's a judgement that you make about me (for example). But what if you are wrong? Should we question the judgement of 'self-deception'? Maybe so. After all, *I* don't think I'm deceiving myself; it's you who thinks that. Or are we considering how someone would purposefully deceive themselves? I assume not.
how someone would purposefully deceive themselves? I assume not.
The latter is what I was considering. I'm not as interested, here, in determining how we might know others -- only the conditions under which one might lie to themselves. It could be that said conditions are not easily determinable due to practical considerations.
A highly inconvenient truth likely lurks around the corner which, if we were honest with ourselves, we would have to acknowledge. (1) But we don't acknowledge it because our reputation would suffer. (2) We regularly install false premises to protect ourselves from 'full disclosure' -- perish the thought! (3) We either can't escape our own self-delusions, or it is extraordinarily difficult for us to do so. We can't be our own exterior observers. (4)
We may also be engaged in deceiving other people. Effective deception requires the appearance of conviction, and in projecting conviction we may, as the saying goes, come to believe our own bullshit. (5) Successful con artists know they are deceiving others and manage their act. Most of us aren't that good at it. We believe it ourselves.
Other people do not always wish us well and say unkind things about us--some of which may be true, or may be false. True or false, we defend ourselves by denying what they say. (Believing all the negative things one hears about one's self might be quite self-destructive.) Rejecting negative feedback becomes a protective habit. (6)
And more!
Quoting Bitter Crank
Have to disagree. It is exactly by being our own exterior observers that self-deception becomes possible.
It hasn't happened yet. We are doomed to deceive ourselves, at least to some extent.
How does one attain this state of mind, oh wise one? :cool:
To SUCCESSFULLY lie to oneself ?
We could interpret that in such a way, that the lie brings us good fortune and wealth and esteem. That would mean that the lie was indeed successful.
But was it a lie ?
Let's say you say, "I am the greatest of all athletes " But your doubt says to you, "oh, come now, you are a little bit lazy now and again, you aren't the greatest ! The man down the street is greater than you!"
But, because we have told ourselves this "lie" over a period of days and weeks and months, one day we wake up and we ARE the greatest of athletes.
So now we have to say, that the lie has vanished like vapor. The words now stand true.
That could be an example of the successful lie.
And also -- to lie usually requires some kind of intent to deceive. So I mean this, rather than just mere confusion or delusion or something like that.
Definitely. I'm curious about this, first at a conceptual level and also as a phenomena. I think that if we could demonstrate somehow that we were successful at doing this it shows something about the mind that's important.
Is it enough to say that having two mutually exclusive beliefs at once is enough to count as a divided mind?
Well yes and no. :wink:
There's always the question, 'who is saying it?' So I am saying that the mind is more or less always divided, (except Jesus and Buddha), so it is the divided mind that is saying the mind is divided, but saying it as if it were undivided.
Quoting Bitter Crank
So this, by its own thesis is a self-deception too, because what is it but an exterior view? As soon as one talks about interior and exterior, or deceiver and deceived, or as soon as one talks about the divided mind in any other way, one has recourse to the speaker, the observer, the analyst, call it what you will - there is always a three way split.
That's one aspect of my yes and no, but the other aspect is that the division(s) themselves are fabrications.
So even before there are mutually exclusive beliefs, to say 'I have a belief' is already to have divided the mind into the believer and the belief. So having contradictory beliefs is more like having non-matching socks than something that creates a division.
Lately I've been conceiving "consciousness" as a kind of conglomerate or patch-work that is made up of many subsystems (mainly abstract predictive models learned and encoded in neural networks). By combining the right predictive models in response to stimulus (another layer of predictive modeling which learning neuronal networks can encode and optimize), accurate belief and "awareness" emerges as a result (not self-awareness, though that may be yet another layer of learned predictive modeling, but awareness in and of itself).
The end product that the conscious mind labels belief (the understanding it is aware of) is initially the result of bottom-up selection in and between neurons and neural networks, which optimizes them into predictive units (between which further bottom-up selection and optimization into compound predictive units occurs). The overall conscious experience is then a cascade of predictions, where we are more "aware" of those predictions/beliefs which are formed from selection between greater numbers and levels of predictive sub-networks, and we are less "aware" of those optimizations and selections which occur in and between lower-level and smaller numbers of abstract predictive models.
Relating this to self deception:
Natural selection between predictive models involves trial and error; when one predictive model is apt and accurate but is neglected in favor of a less apt and accurate model, then self-deception has conceivably occurred.
Self-deception doesn't exactly carry the same connotations as "lying to one's self" though. To lie to one's self implies intentional self-deception. But what is intent?
Intent, I reckon, is one of those executive components of mind and brain of which we are by definition more "aware". It is when we have a fully formed notion of something we desire and we employ the sub-networks of predictive models (of which we're less aware) to actually arrive at the thing we desire.
If, for instance, we desire to be somehow virtuous (intelligent, moral, successful, likeable, etc.) then we may ask ourselves whether or not it is already the case that we have such virtue. If the desire is strong enough (and the feeling that failure entails too harsh) then perhaps we bias ourselves in the course of consciously discriminating between groups of predictive models/understandings and arbitrarily ignore models which do not reinforce our higher-level preconceptions. In other words, when we assume that something is true we may fundamentally alter our predictive models to conform to that assumption. We may invent excuses that amount to predictive models which do not conform to reality, or we may ignore and negate predictive models which DO conform to reality.
In summary, we lie to ourselves when our high-level consciousness (the thing with the most "awareness"), which ideally is the more reliable product of complex selection (more accurate), plays fast and loose with the sub-components of mind of which we're less aware. It is an error that is reinforced via top-down selection in the hodgepodge that is the human mind and brain.
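A crude toy of that top-down error, under my own assumptions (nothing here models real neural machinery; all names and numbers are made up): score two "predictive models" on evidence, then add an evidence-free bonus for the flattering conclusion and watch the selection flip.

```python
# Toy sketch: two "predictive models" of the self, scored on evidence.
# Unbiased selection picks the more accurate model; a top-down bonus for
# the flattering conclusion flips the choice -- a crude analogue of
# self-deception as biased model selection.

evidence = [0, 0, 1, 0, 0]  # 1 = success, 0 = failure

models = {
    "I usually fail":    lambda: 0,  # predicts failure
    "I usually succeed": lambda: 1,  # predicts success (flattering)
}

def accuracy(predict):
    """Fraction of observations the model predicts correctly."""
    return sum(predict() == e for e in evidence) / len(evidence)

def select(bias_for=None, bonus=0.7):
    """Pick the model with the best (possibly biased) score."""
    scores = {name: accuracy(p) for name, p in models.items()}
    if bias_for is not None:
        scores[bias_for] += bonus  # top-down preference, not evidence
    return max(scores, key=scores.get)

print(select())                              # honest selection
print(select(bias_for="I usually succeed"))  # biased selection
```

The `bonus` term stands in for the "arbitrarily ignore models which do not reinforce our higher-level preconceptions" step: it rewards a conclusion for reasons unrelated to how well it predicts.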
"Don't try to tell me anything about teenagers, I have two of them. Let me tell you something about teenagers, when they open their mouths in the morning... the lies form!"
is effectively true for the majority.
Quite simply: motivation.
I want something, so I manufacture reasons why I should have it or why I should believe it.
When one lies, they are intentionally misrepresenting their own thought and belief. One cannot do that to oneself.
I've given my argument. What's yours?
What is it about Jesus and the Buddha that makes them have undivided minds? Do they simply believe, rather than say they have a belief? How does that work, in the sage-like mind? (ideally speaking -- the facts are gone to history)
Most of the time we are darkly ignorant of our real intentions. All we mostly want is pleasure.
None of us really wants the other guy to win. Not if he isn't on our side!
But we will string a narrative to convince ourselves that we are the good guys and those are the bad guys.
Isn't that a lie ?
Let's say that we are not one. If we are divided then it would seem that we could lie to our self -- from one self to another self. Not in some pathological or diagnostic sense, but rather this is something that the mind can and does often do -- it is "normal". Would it be possible, at that point, to lie to yourself?
I am interested in the possibility that this is impossible -- that "lying to yourself" is a turn of phrase. But I'm interested in what would be required, at a conceptual level, for it to mean more than just a turn of phrase -- whether or not we do so in fact. Mostly because it would provide a means for determining whether or not we can or do lie to ourselves.
I think desire plays a role, for sure. But it has to be a certain kind of desire. To use the virtue example above, if we really wanted to be virtuous then that desire would be more powerful than momentary shame at seeing who we are right now -- and we could begin working on ourselves, performing a kind of technological operation on our soul to begin changing to a certain degree.
But what is the structure of desire that makes one lie to oneself, as opposed to really desiring to be such and such?
Do you feel like an amalgamation of computations? I don't really. If it is true it's all "under the hood", so to speak.
A lie would be really hard to model just using computational models, I think -- even more so to lie to oneself. Or maybe not, maybe it's much the same thing -- just a mind divided.
But how would you computationally model a lie to another neural network?
Seems complicated and difficult.
I guess that depends on whether or not we really are the good guys or the bad guys. :D Though that sort of evaluation isn't exactly amenable to basic fact checking, since goodness and badness are not facts but judgments of value that we make or hold. So naturally we'd think we are the good guys, since these are relative to what we already hold to be good. But it is a bit circular.
You bring up real intentions, though. So there are real intentions and there are unreal ones (false ones?). And there is a kind of veil between what we believe our intentions to be and what the real ones are.
Though if that's the case then it seems we can still know that our intentions aren't what we'd like to believe they are. We know we mostly want pleasure. But somehow we believe we are good (or I am good?) -- but the knowledge goes to the wayside, like an abstract proposition.
What is this division between belief and actual intent? How do we know we mostly want pleasure, yet still believe we are good, and intend to do good? Or is this a sort of unveiling of a way of lying?
We want the lie to be so successful that we begin to believe it ourselves? :D Sounds like a good premise for a play.
I'm noticing that your examples seem to be of delusions of one sort or another. There is something inconvenient so we ignore it and come up with alternate beliefs to shield our awareness -- give it something else to fixate on -- and in a way are thus deluded. But is that lying, exactly?
Quoting Bitter Crank
I'd say this is just a way of coming to a false belief about ourselves through habits. What's going on is we hear something negative from a source we don't trust, so we just sort of tune it out on the basis that we've had negative things said about ourselves many times before and they weren't exactly true as much as expressions of how the other person felt.
Since lying is deliberately misrepresenting one's own thought and belief, and it is always done in situations when the speaker believes that they ought not allow others to know what they think and believe, it seems to me that one cannot lie to oneself.
Hold false belief? Sure. But it is humanly impossible to knowingly do that. Seems to me that lying to oneself only makes sense in light of an ill-conceived notion of lying. That is, when one holds that lies are always false.
We often choose to believe things despite an absence of rational support. Is that only a lie if for virtuous purposes? Is it never a lie?
What is a lie? I tend to consider it the deliberate telling of a known falsehood. Did Trump lie when he proclaimed the inauguration crowd biggest of all time, or did he actually believe it? Did Obama lie when he said you could keep your doctor under the ACA?
"Lie" may be too black and white a term.
What if we are of two thoughts?
I believe something good about myself. I know that it is false. These are in conflict with one another. So let's say we become aware of different beliefs at different times. I tell myself the good thing and I want to believe it, so I do. There's the part of me who lies, and the part of me who listens. And I stop being aware of the part of me who lies right after telling myself the lie. I know that I have to deceive to achieve the desired belief.
If we are of one mind then I don't think we could lie to ourselves. I agree with that -- that's why I thought @unenlightened made a good point in saying we'd have to have a divided mind in order for us to lie successfully, and not just be delusional or some such.
Quoting creativesoul
At least in a general sense I'd say that's what lying is -- to tell someone a falsehood while knowing it is true in order to deceive them. So I'd say that in the case of telling someone about my own thoughts then I'd be lying if I told them something I do not really think -- that this is a particular case of lying, but that lying doesn't have to be about my own thoughts. It could also be about whether I have the money for the bill.
I think we're in agreement here. We tell someone a falsehood we know to be true. Maybe there's a motivational component to this but that seems to be the bare minimum of what a lie is.
I don't think I'd say that believing such and such without rational justification counts as a lie. It may be irrational, but without justification we do not know, and if we do not know then we couldn't be telling ourselves a known falsehood.
Part of the difficulty in determining a lie is in being able to tell if someone really knew something or if they were just mistaken, delusional, or something along those lines. Usually we mean that the person lying both knows the truth and tells the opposite. With two people this is easy enough to understand -- one person knows, the other does not, and the person who knows believes that the falsehood is better to say than the truth (for whatever reason -- could be white lies, or nefarious. Could be to preserve feelings, or manipulative to get what one wants)
But with one person it seems strange to say. But it is a common turn of phrase to claim someone is lying to themselves. Hence the line of questioning -- perhaps it is just a turn of phrase, but what would it take for someone to lie to themself, to where it was more than just a turn of phrase?
Not just complicated; complex!
Quoting Moliere
Sure we don't feel like an amalgam of streaming information exchanges among and between learning neural networks, but there's too much evidence to ignore that it is so.
Quoting Moliere
General self-deception I would describe as existing in the fact that an erroneous or inapplicable sub-model is used in the formulation of a given belief. There's no difference between this description and simply being incorrect or mistaken about something, and the feedback we get from such mistakes is how we develop and optimize existing and new models; it's how we learn.
"Lying to one's self", if such a thing exists, must be more than just self-deception. As a guess at what it could be (or something like it) from a learning network perspective, I would say that it occurs when a consciously held belief (the higher level result of complex network interactions) happens to be erroneous, and causes related lower level models/networks to move toward an erroneous or de-optimized state.
When another person lies to us and we believe them, we may alter our fundamental understanding and cognitive models of the thing we're being deceived about. When we ourselves formulate erroneous beliefs and cling to them with conviction, our fundamental understandings which underpin them must be bent or negated to fit properly, to avoid dissonance.
The trouble with modeling such phenomena computationally is that we're unable to follow the rhymes and reasons of learning neural networks as they learn; we can create a learning machine that can become excellent at a specific task through experience, but the way it discovers and encodes patterns creates messy extended algorithms that are utterly illegible (too long, too raw, too abstract; imagine a human mind expressed in algebra).
As potentially recursive in various ways, these extended algorithms can alter themselves (for better or for worse, though the thing that makes them "learning" is that they tend to alter themselves for the better), so, when an error in one part of the algorithm is carried into its product, which then recursively alters sub-components of the algorithm toward greater error (and sustained error), we might say the algorithm has successfully lied to itself.
Put in the most uncomplicated terms I can think of, "lying to one's self" is like the opposite of learning truth; it's when we learn untruth not because of externally deceptive stimulus, but because of our own faulty minds. Bias of all kinds therefore fits the bill of lying to ourselves along with succumbing to any self-generated fallacious appeal!
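Just to make that caricature concrete, here's a toy sketch in Python. Nothing here models a real mind; `biased_update` and all of its numbers are invented purely for illustration. One learner revises its estimate from evidence in the ordinary error-driven way; the other heavily discounts any evidence that would lower its flattering self-estimate, and so sustains its initial error -- "learning untruth" from its own faulty update rule rather than from deceptive stimulus.

```python
import random

random.seed(0)
TRUE_VALUE = 0.0   # "reality": the quantity both learners are estimating

def biased_update(belief, obs, lr=0.1, discount=0.02):
    """Toy confirmation bias: evidence that would lower the belief is
    almost entirely discounted; flattering evidence gets full weight."""
    error = obs - belief
    if error < 0:            # inconvenient evidence...
        error *= discount    # ...is mostly ignored
    return belief + lr * error

unbiased = biased = 5.0      # both start with the same flattering misbelief
for _ in range(2000):
    obs = TRUE_VALUE + random.gauss(0, 0.5)   # noisy observations of reality
    unbiased += 0.1 * (obs - unbiased)        # ordinary error-driven learning
    biased = biased_update(biased, obs)       # self-serving learning

# The unbiased learner ends up hovering near the true value; the biased
# one typically settles well above it, sustaining its original error.
print(unbiased, biased)
```

The asymmetry in the update rule is the whole trick: the biased learner never refuses evidence outright, it just weights it self-servingly, which is enough to keep it permanently wrong.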
What evidence persuades you that you are a neural network?
****
I sort of feel like the computational approach has to abandon "belief" -- there is no belief formation, there are algorithms which optimize. There is nothing that a belief is about, there are models of math problems through logical switches. And the stream of electrons moves in accord with physical facts.
Similarly, a few levels up, we have algorithms optimizing and modifying themselves in light of some goal set for them. But do the algorithms lie to one another? Do they avoid dissonance? Or are they simply following instructions and giving us a good model for understanding (some of our) learning? It seems the latter to me.
In order for a lie to be successful, and not just count as a lie, it seems to me we have to rely upon some guesses as to how the person we are lying to will take the information. We have to imagine what it would be like to be them. So we have to have some sort of beliefs (model? Possibly if we make an art of lying) about the other person's mind, how they react to different sorts of information, presentation, and their general mood. That way we can craft something that sounds believable to the person we're talking to, even though we know it to be false.
Lying, as simple as it seems and as young as we learn how to do it, is actually a really complicated behavior.
Brain damage in various places can have corresponding effects on conscious experience and mental faculties. The taking of drugs for instance interacts with individual neurons and neuro-receptors in the brain and can cause drastic effects on what we think and feel. Artificial neural networks which have proven capable of a certain aspect of learning were inspired by biological neural networks, which in and of itself is enough to convince me that I am in large part a neural network (or at least the part of me that learns, which happens to be the best part :wink: ).
Quoting Moliere
Granted I cannot solve the hard problem of consciousness and explain how the outcome of an algorithm can seem like a "belief". As it sits physically in a network, a belief is abstract, and like raw data in a file it can only usefully be expressed when executed within the larger set of executive functions that cause "beliefs" to spill into our thoughts and out of our mouths. That this network can learn and alter itself on the most fundamental level is why it differs from an ordinary algorithm. When exposed to a world of varied and complex stimulus, there's no way of precisely predicting how such a network or algorithm will respond; it learns and evolves chaotically.
Quoting Moliere
To an algorithm, dissonance comes in the form of a weighted error value. The bigger the error, the greater the dissonance (and the more drastic the self-correction). End results such as intent and feeling are of the ineffable whole rather than a specific part, or so it seems; my brain doesn't feel, it reacts mechanically, but its abstract product - the mind with awareness - seems to.
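For what it's worth, the "bigger the error, the greater the self-correction" idea is exactly the textbook delta rule used to train a single artificial neuron. A minimal sketch (the function name and the numbers are mine, chosen only for illustration):

```python
def delta_update(w, x, target, lr=0.5):
    """Delta rule for one linear neuron: the weight correction is
    proportional to the prediction error (the 'dissonance')."""
    prediction = w * x
    error = target - prediction   # the weighted error value
    return w + lr * error * x     # bigger error -> bigger correction

# A large error produces a drastic correction; a small one, a gentle nudge.
w_small = delta_update(0.0, 1.0, 0.1)   # small dissonance
w_big = delta_update(0.0, 1.0, 2.0)     # large dissonance
print(w_small, w_big)   # 0.05 1.0
```

Scaled up across many interconnected units, that same proportional correction is roughly all that "learning" amounts to in these systems.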
Bearing in mind that you're asking me, unenlightened (surely a foolish move?), I think it is a matter of identification.
So, for example, there are facts about where I was born and what kind of passport I have, and then there is the identity of 'Englishman'. Or there are facts about what I have read and studied and thought over, and then there is the identity of 'philosopher'.
Identity is somehow more than the facts; it is a commitment to the facts; an investment in the significance of the facts. And this creates a separation, of a central self in the mind - I am an English philosopher. Something to protect against, well everything, including whatever else might be the facts of what I am.
Knowing that 'X' is false makes it impossible to believe 'X'. I believe 'X' about myself. I cannot do both, know that 'X' is false(about myself) and believe that 'X' is true(about myself).
As soon as we become aware that 'X' is false, we cannot possibly believe otherwise. That holds good in cases where 'X' is true, but we believe 'X' is false. If we believe 'X', then we believe 'X' is true; is the case; corresponds to fact/reality; is the way things are; etc. We cannot do both, believe 'X' and know that 'X' is not true; is not the case; does not correspond to fact/reality; is not the way things are; etc.
Quoting Moliere
Well, strictly speaking 'one' who has two minds is two... not one. We cannot be of two minds, strictly speaking... aside from having some sort of multiple personality disorder. These are common in cases of tremendous childhood trauma. It's a coping mechanism. Since the facts are too much for the one individual to bear, the one 'creates' an alternative persona as a means to 'split up' the burdens...
I see nothing wrong with saying that people of one mind can hold contradictory beliefs. I would wager that everyone does, at least during some period of their life. Some become aware of this and choose. Others become aware and suspend judgment. Others become aware and struggle to grasp what's going on, and thus chalk it up to being normal, or some other ad hoc explanation. Others never become aware.
There is some tremendous difficulty involved in becoming aware of one's own false belief, assuming one wants to correct the situation.
It is also quite common to be uncertain about something or other. These latest situations I've mentioned are often spoken of in terms of "being of two minds", and that makes perfect sense in everyday parlance.
Quoting Moliere
I think you mean to say that lying is -- to tell someone a falsehood while knowing it is false.
If you believe you have the money for the bill, and you state otherwise, then you've deliberately misrepresented your own thought and belief. If you do not believe that you have the money for the bill, and you state otherwise, then you've deliberately misrepresented your own thought and belief. Both are cases of lying.
Those lies could be true. Here's how...
You could be wrong about how much money you have. Thus, if you believed you had enough, and stated that you did not, you would be lying. Now, if by chance, you had forgotten how much money you'd spent over the past weekend, you would have less than you believed. So, the belief that you had enough would be false, and yet the statement(the lie) that you did not would be true.
Lying has less to do with truth, and more to do with thought and belief. That is, lies themselves consist of statements that can be either true or false, but the lie is always told by someone deliberately misrepresenting what they think and/or believe.
It's not that complicated. One always knows when they have just said something that they do not believe. An honest speaker will immediately correct themselves in an authentic accidental situation of misspeaking. The dishonest speaker will not, and will claim that they had misspoken if and when another calls them on it at a later date...
I think most people's favorite method is convincing themselves, persuading themselves that they know something which they do not. (Second place is probably convincing themselves that they do not know something which they damn well do.) I'd count that as lying.
When someone believes that they know something that they do not, they hold false belief about themselves. Holding false belief is neither necessary nor sufficient for being a lie. Being convinced that one does not know something when they do, is - once again - being mistaken about oneself. Again, not dishonest or insincere, but rather just plain 'ole being mistaken... holding false belief.
You missed the process part. Sometimes you cherry-pick the evidence, and you know you're cherry-picking, and you know you shouldn't, but you do it anyway. A sort of cognitive akrasia. With others, it's easier: you just say something you know to be false. With yourself, it usually takes a more sustained effort.
It's a foreign notion. Could you elaborate, so I can know more about this notion of how one can deliberately misrepresent their own thought and belief to themselves?
Quoting Srap Tasmaner
I do not see how this is anything other than one who knows that they are doing something that they should not. Eventually... what? They deliberately trick themselves into thinking it's ok?
Yes. People, for instance, buy lottery tickets.
Yeah, I'm not following you...
What counts as a lie? What is the criterion which, when met, counts as being a lie?
I agree, you must define what you mean by lying and in what context to be clear.
One is encouraged in some psychological quarters to seek to change the way one thinks. "Say to yourself, 'every day in every way, I'm getting better and better.'". By repetition, the theory goes, one becomes convinced of something one did not believe.
A sportsman will psyche himself up in this sort of way - 'I am the greatest', and it works, at least to an extent. Perhaps philosophers can do the same - try saying to yourself, "I am such a deep thinker, I can even appreciate unenlightened's posts." It might take a lot of repetitions, and it won't actually make either of us smarter, but don't tell yourself this, tell yourself that it really works, because it really works.
Forgetting is very real. When a person represents to oneself a memory, which is not really a memory, but something imagined, because the real thing has been forgotten, then that person is misrepresenting one's own thought and belief. This is actually very common, that a person represents something imaginary to oneself as a memory. And the person doing the "remembering" very quickly overlooks, and forgets the division between the aspects of the memory which are real, and which are imagined.
That is why two people can both say "I remember the event this way", when the two ways are contradictory. The two people will both argue sincerely that it must be my way because I remember it that way, when it is impossible that both ways are correct because they are contradictory. Are you married?
There is evidence that lying on your CV can get you a better job, as long as you don't get found out.
But my point is that there is a self-image, positive or negative, and the image acts. This is demonstrated by the fact that when my self-image changes, my actions change. Affirmations work! And they work in exactly the same way as compliments or insults coming from others do. They build an image and the image acts.
But to have an image that acts is to have a divided mind; it is to be running a simulation of oneself and letting that run one's life. One performs one's identity.
How does one stop acting from the image?
Just my two pennyworth. :wink:
Deliberately misrepresenting is not forgetting...
Heh. Well, I'm not exactly the wisest so I don't mind. :D
I suppose I'm trying to understand the notion of a split mind -- so I'm looking for something to contrast it with to make sense of it.
Quoting unenlightened
So a whole mind would be one without an identity, without a commitment to certain facts. It would accept all the facts about itself as relevant to itself, or would be committed to no facts about itself at all. A person with a whole mind would not have an identity to protect or project.
But confabulation is a little of each.
You know how it's impossible to walk any great distance** if your stride with one leg differs slightly from your stride with the other? Now tell yourself at each step that it's only a little different, and that can't make much difference. It's like that: you relax your cognitive standard just a bit, and indeed it does not make the inferential step you're taking invalid, but if you keep compounding this little compromise you end up in the wrong place. I'd call this a kind of lying to yourself and it's incredibly pervasive.
** in a straight line
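The compounding here is easy to put numbers on. A toy calculation (the 99% figure is arbitrary, purely for illustration): if each inferential step preserves only 99% of the reliability of the previous one, the compromise is invisible step by step but ruinous over a long chain.

```python
# Each step is "only a little different" -- 1% weaker than the last.
reliability = 1.0
per_step = 0.99
for step in range(200):
    reliability *= per_step

# After 200 steps, roughly 87% of the original reliability is gone.
print(round(reliability, 3))   # 0.134
```

No single multiplication looks like a lie; the drift only shows up in the destination.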
I didn't say that. I see, as usual, you didn't read my post, responding just to an out of context word.
I said that filling in the blanks with imagination, where memory leaves things out, and representing this to oneself as memory, is misrepresenting one's own thought. In recalling distant memories it is difficult to distinguish aspects of "true memory" from imagination because "the memory" changes over time. If one represents this to oneself as "true memory" when there are aspects of imagination which have been mixed in over time, this is misrepresentation.
If you didn't say that, then your example is irrelevant; a lie is deliberately misrepresenting one's own thought and belief.
Are you talking about cases where someone changes some standard they hold?
It's not forgetting which is deliberate misrepresentation of one's own thought and belief, it is remembering which can be such. This is the case when aspects of the event which has been remembered, have been forgotten and replaced by the imagination. These things which have been produced by the imagination are deliberately misrepresented as memories.
Looks like rhetorical muddle.
Which part is thought and belief?
I remember hearing years ago that it's common for emergency rooms to have a spike in admissions just before dawn. The explanation was people lying awake all night telling themselves "It's nothing" and eventually accepting that something was terribly wrong.
Are you really not familiar with any of these phenomena?
It's not so much that I haven't heard of such reports, it's that I'm questioning the reporting itself. I do not see how any of it qualifies as deliberately misrepresenting one's own thought and belief to oneself.
Yes, I understand that as far as you're concerned the phrase "lying to yourself" is just a contradiction. But it's a phrase we all use, so what are the options?
Stage magic and storytelling both include techniques that rely on our capacity for self-deception. Sometimes the magician, instead of trying to hide how a trick is done, can get the audience members themselves to dismiss the solution, and this is much more effective. A movie can present a character that's a little "off" but not make a big deal about it, and the viewers will mostly decide not to worry about him, until the third reel when it turns out he's the killer.
You could say these are cases of deception, but really it's just giving us the opportunity to deceive ourselves and most of us are generally quite prepared to do so.
These options offer little more than unnecessary and unhelpful restriction to the considerations here.
My notion of what counts as a lie need not exhaust all other sensible notions/uses of "lying" in order for it to be able to correctly and irrefutably set out that which we all agree is - most certainly - a lie(an insincere speech act).
It is humanly impossible to knowingly believe a falsehood. Deception - in and of itself - comes in many forms. One of which is lying to another. One cannot deceive oneself. That's pure unadulterated nonsense. Being tricked requires not knowing you're being tricked. Tricking another requires knowing you're tricking. One cannot both know they are tricking themself and not know that they're being tricked.
People say that it is possible to deceive ourselves. So what? Saying that that happens doesn't make it so. People use the term "truth" as a synonym for one's worldview(people conflate truth and belief). That has no bearing upon how we assess a much more disciplined and sensible use.
Calling a criterion for lying 'too narrow' implies that there are some lies that that criterion cannot account for. That's quite the specious claim. Looks strong until it is given some serious thought.
All the different conceptions sharing same name fail in some way or other to be able to account for the others. That is precisely how we arrive at different ones. That is not a flaw in my argument about lies. Rather, it is a necessary feature of all such arguments.
Self-discipline?
Well, who's asking? If it is not the image asking, then you have already stopped. So it must be the image asking how not to act, and then it is obvious that there is absolutely nothing that the image can do that is not the acting of the image. I think if one (it ought to be two, really) could completely grasp that the image can do nothing to help in this situation, that one is completely helpless, then one simply does stop. One gives up.
Being intentionally led by another to believe something is not self-deception.
I daresay you haven't had much practice. When I was your age, I always did it for half-an-hour a day. Why, sometimes I've believed as many as six falsehoods before breakfast.
My god, that's brutal. I had forgotten.
Chomsky said somewhere that his life's work was organized around two complementary problems that he called "Plato's Problem" and "Orwell's Problem". Plato's problem is: How do we know so much, given so little evidence? While Orwell's Problem is: How do we know so little, given so much evidence?
I'd say that, given a dimension of time, this could be overcome. So if right now I know "~X", I could later choose to believe "X" and forget or ignore "~X"
With a dimension of time we also have changes of awareness. So at different moments we can come to be aware of different things.
Quoting creativesoul
I think we're basically in agreement on lying. At least I'm most interested in this more robust theory of lying, as opposed to delusion, just because it's the more difficult case -- and you seem to agree that delusion is possible, just not lying.
So your main point of disagreement is really that being of two minds is not normal -- it would have to be a pathology of some kind at play in order for someone to lie to themselves.
Quoting creativesoul
I had in mind saying "I do not have the money" when "I have the money" is true -- but yeah, I was flipping the signs in my head. The former would be a falsehood, the later a truth, and you'd be saying the falsehood and not the truth.
Quoting creativesoul
I think this is a minor disagreement between us. I see what you mean, but I'd say that you'd have to know something to be true and then say its opposite, whereas you'd say that it comes down to belief -- so you believe "X" is true, but you say "~X".
Good enough for me. I think the split-mind disagreement is the stronger of the two. Yeah?
Yup! :D
So it's something, in your view, that happens along a chain of reasoning. So you might have the notion that this is going somewhere bad, and then come up with some reasons that you don't scrutinize too deeply to make it go somewhere good.
What's nonsensical about a split self? Is it any more nonsensical than a singular self?
What you are refusing to take account of, is the fact that people change as time passes, and their minds change as well. Deception is an act in which the act of the deceiver is prior in time to the falsity being believed as true by the deceived, the result of the deception. One is the cause, the other the effect. So the deceiver hands a falsehood and the receiver takes it and is deceived.
There is no logical reason to conclude that one cannot deceive oneself. What the person at an earlier time knew as a falsity, is represented to oneself at that time as a truth. The same person at a later time, having forgotten the act of deception, believes the falsity as a truth. Never in this whole process does the person "knowingly believe a falsehood" as you insist is necessary for self-deception. The person at one time tells oneself that a falsity is the truth, not actually believing it is the truth. The same person at a later time believes it to be the truth without remembering that at one time it was not believed to be the truth.
Not really. I disagree with the framework itself. I acknowledge that many, if not most, folk talk in ways that lead to self-contradiction and/or incoherence. I acknowledge that these are meaningful ways to talk about stuff. My point is that we can be wrong about some stuff, particularly anything and everything that exists in its entirety prior to our becoming aware of it. Our mental ongoings are precisely such things. Thought and belief are mental ongoings. We can get such things wrong. If we work from an ill-conceived notion of thought and belief, our notion of lying will suffer the consequences along with all else we say about ourselves.
Deliberately misrepresenting one's own thought and belief is always a lie.
Much talk about lying involves talk about "telling the truth" as well. This has all sorts of problems too. If one must only say what's true in order to be "telling the truth", then the only way that one could do such a thing is if their entire belief system is infallible. Truth cannot be false. That's the problem. We all have false belief. "Telling the truth" doesn't require omniscience. It requires honesty in speech. It requires saying what one believes to be true. "Tell the truth, the whole truth, and nothing but the truth" is rubbish on its face. It is either an impossible criterion to meet, or it conflates truth and belief. It's a prima facie example of language based upon gross misunderstanding of how thought, belief, meaning, and truth work together long before we become aware of our own mental ongoings...
p1 Being tricked requires not knowing you're being tricked.
p2 Tricking another requires knowing you're tricking.
C1 Tricking oneself requires knowing that one is tricking oneself, and not knowing that one is being tricked.
p3 One cannot do both, know s/he is tricking him/herself, and not know that s/he is being tricked.
C2 One cannot trick oneself.
What's wrong with saying that they believed nothing was wrong, but after all night passing without change in their condition, they began to believe that something was wrong. That is, they changed their belief, as compared/contrasted to misrepresenting it to themselves.
Well, I once held that view as well...
Joe watches a car accident happen. A brown car ran a red light and crashed into a blue one. Joe knows the driver of the brown car, and he wants to help her avoid fault charges, so he lies to Brian about what he saw. Brian believes Joe; he also knows both drivers and does not like the driver of the brown car. So, even though Brian did not witness the accident, he claimed he did, and when asked whose fault the accident was, Brian states something that he does not believe to be true, but is, when he says "the brown car"...
If being a lie requires knowing something to be true, and saying it's opposite, then Joe lied, but Brian only lied when he said that he too saw it. Both deliberately misrepresented their own thought and belief.
On my view, they both lied. Joe once, and Brian twice. Joe's lie was false. Brian's first one was false, but his second was true...
Because that happens too, and it's a different phenomenon. What you're missing is that self-deception is usually strongly motivated and irrational.
Here's a salient example: relationships. Self-deception often involves manipulation of evidence, but when it comes to figuring out what other people think and feel, a lot of that evidence is subtle and ephemeral. We're good at picking up on these tiny tells, almost unnoticeable variations in inflection, expression, eye movement and focus, tone of voice -- all of that stuff we process without usually being consciously aware of it. We just know.
My point is this: it's particularly easy to get away with fooling yourself in this context because your "judgment" was arrived at automatically based on "evidence" you probably couldn't articulate. And that makes it all too easy to dismiss. You don't want to believe something's bothering your spouse? No problem: there's not much you could really point to as evidence anyway. (It was just a feeling you had.) But anyone who's ever done this knows they were fooling themselves.
Or, from the other side, want to believe that cute girl in your homeroom, or at work, or making your coffee, is into you? You can probably find something to count as "evidence". For most of us, enough contrary evidence arrives and quickly enough that a restraining order is unnecessary.
A weak case can be built upon the consumption of fictional entertainment, where immersion causes a suspension of disbelief. While we're never fully deceived by fiction (and it is the fictional work doing the deceiving), the intent to become immersed in the first place (which requires some, albeit weak or pseudo-level, change of belief) can constitute a form of intentional self-deception. "Escapism" highlights the difference between this kind of intentional self-deception and the mere consideration of hypotheticals.
A strong case for the intentional deception of our future selves exists in the case of intentional mood-altering practices (though they do not necessarily come with specific belief changes, mood changes can easily cause changes in belief). It may not be "lying" to instigate a general mood change, but it's definitely intentional self-manipulation to purposefully alter one's mood by means of substances, physical activities, or other practices. Mood changes would most directly affect emotionally contingent opinions, but they can and do also impact actual beliefs. For instance, someone who lacks confidence might try to consciously project confident body language (i.e., smile more) in hopes that it will impact how confident they actually feel, and in turn change their beliefs which are in part dependent upon their confidence (i.e., what they can accomplish, the moral nature of the average person, etc.).
Drinking in order to forget or become distracted from an unpleasant reality seems to at least fit the description of "lying by omission". If an omission can be a lie, then at least while inebriated we are possibly being lied to by our former selves...
Is it though? Seems to be a different explanation of the same phenomenon; one with fewer entities, no self-contradiction, equal explanatory power, and just as much plausibility. Why opt otherwise?
Quoting Srap Tasmaner
Yeah, and the explanation is self-contradictory, and thus furthering the irrationality.
Quoting Srap Tasmaner
Are we though? On my view, there are multiple underlying reasons for what is otherwise the same outward behaviour(s). Misattribution of meaning regarding these subtle behaviours that we purportedly 'just know' runs rampant. That is particularly the case between people from vastly differing cultural, familial, and otherwise socially influenced norms.
Anyone who's ever thought that something was bothering their spouse, based upon subtle behaviours, but didn't want to get into it and so avoided addressing their own 'feeling' and their own belief that something might be wrong, wasn't fooling themselves into believing otherwise. They were rationalizing their own decision not to talk about it. In this case, the person is deliberately not representing their own thought and belief, because they are intentionally avoiding talking about it.
Here again, how is this lying to oneself? One either believes that another is into them or not. The belief may very well be unfounded, but it is a belief nonetheless. One is not lying to oneself; rather, one just believes things without good enough reason.
I'm at a loss here...
One can most certainly change the way that they look at the world by virtue of changing the way they talk about it and/or themselves. One can do this deliberately. One can deliberately change the way that they behave as a means to change the way they feel. This can, in turn, change one's belief.
How is that deception?
Because it's intentional belief altering via coercive/irrational means. Is deception the intent to conceal or manipulate or is it the successful concealment/manipulation of objective truth? I'll satisfy both:
Let's say I'm at a singles bar looking for a date, and I know that statistically my chances of being successful are low... Consuming alcohol can take me from believing it is true that I will likely fail to either forgetting that or believing the opposite, even while it remains true that I will likely fail despite the statistical benefits alcohol may confer.
If there are two... who/ what is the other?
It is hard to answer, because in saying anything, I am going to be making an image. But as near as I can get, compare :
1. I am an English philosopher. (identity, image)
2. I am writing a post. (activity, fact)
If I make an identity of 'poster', then I am a poster even when I am not posting, or a philosopher when I am not philosophising, or English when I live in France.
You're mixing present and past tense "being tricked". "Having been tricked" requires not knowing that you've been tricked. And this is fulfilled when you forget that you've tricked yourself.
Why not?
The reason I say that lying being based on knowledge or belief is a minor disagreement is because I'm willing to go along with your theory of lying. I'm not so interested in justification, meaning, truth, or belief as much as I am in a theory of mind. So sure, it's a disagreement, but I'm fine with setting the stage as you say -- that lying is the intentional misrepresentation of one's own belief. That fits well enough for me.
It seems to me that if we are of a split mind, we could still accomplish this -- adding a dimension of time and some notion of awareness would resolve any sort of conflict. And if this could be demonstrated to be non-pathological, it would even be a possible normal event ("possible" just because that seems a more empirical question that I do not have an answer to).
EDIT: (Relates to the above)
Quoting creativesoul
I don't see what makes a singular self better than a divided self. For that matter I'm not sure what would make a divided self better than a singular self, at this point.
There wouldn't need to be literally two selves within a single mind. I think merely having a divided mind -- of whatever kind -- is enough to count as lying as you define lying. Some part of the mind can deliberately misrepresent a belief to another part of the mind, and our awareness can shift from the one to the other through time.
But what would make either notion a better notion?
That seems along the right lines to me. The "splitting of selves" approach (I think it goes by the term "psychological partitioning" in the literature) only makes sense if one tries to force self-deception into the model of one person being deceitful to another. In those cases the key point is that the deceitful person both believes/knows something to be the case and intends that the other should believe the opposite is the case. Self-deception does not seem like that to me; it is more like having a suspicion that something you wish to be true may not be true, but rather than pursuing the chain of reasoning that will decide the issue for you, you give yourself (perhaps bad) reasons for not pursuing that chain of reasoning.
This is a point that should have been made earlier. Beliefs are almost always best thought of as partial, as confidences. You believe you're unlikely to be successful and that you have a chance of being successful. Alcohol either suppresses the former completely, or just futzes with the numbers, so that your chances look better with every drink. You're not going from believing P to believing ~P or something, because you believe both, partially, from the start. Typical self-deception is deliberately mis-calibrating your confidences.
But what makes this plausible description something which actually annuls the act of lying to oneself?
By "splitting of selves" I just mean it generically -- like, I can see multiple ways you or others might parse what that means. Partitioning, or having a tripartite division of mind such as Plato's or Freud's, or as I've been saying just having an awareness which can move from different parts of the mind, or as un has been saying between the image, the self, and the speaker of the sentence making the division. I'm sure there are other ways it could be parsed.
In some sense there is a factual aspect that would need to be investigated, and it could even be case-by-case. But investigating the facts of a mind is something of a tricky business, and deserving of some philosophical scrutiny to understand how a fact might be significant one way or another. And in a sense I think it's worthy to note that it may not be just the facts -- as un points out, there could also be commitments of one kind or another in making an identity, which are over and above the facts.
And then even more generally speaking -- what would make this singular self picture a better picture than a split self picture? It must be more than the facts because we could probably reconcile facts either way.
Something that @VagabondSpectre's approach does make me think of, explicitly at least, is that there could also be a difference between a self and a mind. So the self has a seemingly singular quality to it -- we always feel like we're the same person and can entertain, at least in the clear and distinct way philosophers tend to like, only one thought at a time. But the mind can be much wider than the self, and it may not be just the self that lies but the mind.
For some purposes we ignore what's going on under the hood. You, the single individual person, are responsible for what you say, and for the consequences of your decisions. Looking under the hood provides a more nuanced description, but it's really changing the subject.
Absolutely. In fact, since posting it occurs to me that the concept of "lying" belongs to one level -- the person level, where we hold individuals responsible for their words -- while "self deception" belongs to another level, where we try to understand how we and others think.
I think that's probably right, but we've become so sophisticated that now we hold people responsible for fooling themselves. Which is not completely crazy -- as I said above, I think there are related norms in play here. Both lying and self-deception are violations; they're just not exactly the same violation of exactly the same norm.
My apologies.
I'm at a momentary loss here. I presumed that that was obvious. I am rather prone to mistakenly assuming that my interlocutor has read me for quite some time, and thus is already amidst the same stream of thought that I've been following for quite some time now.
This ought to help make it more so...
One cannot be tricked into believing something if they know both how they're being tricked, and that they're being tricked.
One who is performing the trickery knows both how and that they're doing it. That is because it's being done purely for the sake of doing so. Intentionally tricking oneself is impossible.
One cannot know how and that one is tricking him/herself and not know how and that one is tricking oneself.
------------------------------------------------------------------------------------------------------------------------
Quoting Moliere
All mind consists entirely of thought and belief on my view. All thought and belief is meaningful and itself presupposes truth... lies notwithstanding. Without introducing meaning, truth, and belief into the mix whatever theory of mind discussed will be utterly incomplete, wouldn't you agree?
Quoting VagabondSpectre
Hey, I'm going to deceive you by telling you what I'm going to do, how I'm going to do it, and when I begin doing so.
You wouldn't be deceived, and neither would I.
The example above could also be explained as follows...
People become temporarily inebriated. Poor judgment is an effect of inebriation. If I know that getting drunk increases my confidence level, and if some women like guys with confidence more than guys without, and I entertain these thoughts in close proximity to one another, I could then deliberately do something with a clear purpose and may very well increase my chances of success.
One cannot intentionally misrepresent their own thought and belief to themself. That is, one cannot lie to oneself.
If a notion permits a meaningful account of "self-deception", it does so by virtue of either a lack of intention, or a plurality of self/mind. The former is acceptable in light of the criterion above; the latter is not, for the two are negations of one another. Intentionally misrepresenting one's thought and belief requires a plurality, as adequately argued heretofore. One is not a plurality.
:wink:
On second thought, neither case of self-deception is acceptable in light of the criterion for lying that we're working with.
I think we're in agreement here.
We know what it is to deceive another. We know what it takes. We know some things that must be both present and not in order for it to happen. That's what makes it deception. This has been argued for heretofore. As before... it takes a plurality. One is not.
One cannot deliberately misrepresent their own thought and belief to oneself. Self-deception requires that. One cannot deceive oneself.
We do indeed talk ourselves into and out of things. No one makes a mistake on purpose. Using bad reasoning is a mistake.
One cannot deliberately misrepresent their own thought and belief to oneself. Deceiving another requires doing so. Deceiving oneself is impossible if we hold to the criterion for lying that I've been working from...
A heap is not a plurality of grains? A mind is not a plurality of thoughts? A brain is not a plurality of neurones? A body is not a plurality of cells?
It makes intentional self-deception impossible.
One heap is a plurality of grains. One mind is a plurality of thoughts. One brain is a plurality of neurones. One body is a plurality of cells.
One mind is not a plurality of minds.
One heap is not a plurality of heaps. A plurality of grains is not one grain. One mind is not a plurality of minds. A plurality of thoughts is not one thought. One brain is not a plurality of brains. A plurality of neurones is not one neuron. One body is not a plurality of bodies. A plurality of cells is not one cell.
Why? The process I described looks intentional, but does not seem to involve any contradictions.
The process you described is nothing more than using inadequate reasoning. Using inadequate reasoning is a mistake. Mistakes are accidental. That which is accidental cannot be intentional.
So the example is neither intentional nor a case of lying to oneself.
What makes it self-deception again?
But it might be a plurality of awarenesses, a plurality of intentions, or one element of several of a person. You seem to be ruling out a division on the ground of calling it 'one'. As if someone called 'Honesty' cannot be dishonest.
Laing wrote a book about the divided self; you may not agree with his psychology, but you really cannot rule it out a priori.
Non sequitur.
As if someone called 'Honesty' cannot be a plurality called 'Honesty'. Surely there are plenty of sensible ways to divide up one mind...
Ahem...
One mind is a plurality of thoughts... belief... emotions.
Can these be in conflict with one another? Sure.
Such conflict gives rise to uncertainty, insanity, confusion, disbelief, moral dilemma, solid ground for temporarily suspending one's judgment, and a host of other things too I'm sure.
One cannot make a mistake intentionally. Mistakes are accidental. Blameworthiness doesn't belong here.
I don't even know what "ruling it out a priori" is supposed to mean. If it is impossible for one to deliberately misrepresent their own thought and belief to oneself, then any and all arguments which assume or validly conclude that are themselves based upon at least one false premiss.
I think that this is a common form of self-deception (there are numerous different types). Let's suppose that I don't know with any degree of certainty that X is the case (perhaps someone just told me X is the case and I believed it). So I believe that X is the case though I have no reason to be certain about this. Over time I will forget that I am truly uncertain that X is the case, remembering only that I believe X is the case. In this frame of mind, I may perceive hints of evidence that X is really not the case, but I may not act to reassess that belief, because I deceive myself by thinking that I would not hold the belief that X is the case without having properly assessed it in the first place.
In other words, I falsely believe that if I hold a belief, that belief must have already been properly justified. The deeper the belief, the more fundamental it is, (like a Wittgensteinian "hinge-proposition") the deeper the self-deception is, that the belief is beyond doubt. So the self-deception involves telling oneself that such a deep seated belief cannot be doubted when in reality the person knows that it can and ought to be doubted.
So creativesoul holds the belief that it is impossible for a person to self-deceive. But clearly this is a belief which can and ought to be doubted. Creative self-deceives by refusing to look at the vast evidence presented, believing only the prejudice, refusing to doubt what ought to be doubted, merely insisting over and over again that self-deception is impossible.
Setting aside this notion of blameworthiness, but taking on the rest...
One has a deeply held belief. That is, one has unshakable conviction that something is true, or is the case, or some such. One is confronted with a line of reasoning that places that belief in question. The truth or falsity doesn't matter...
Full stop. What else about the belief would be reassessed?
So here we are considering an example where a line of reasoning is being ignored that would have otherwise led one to reassess a deeply held belief. It's being said here that it's not so much the truth/falsity of the belief that matters. What matters more, according to this purported notion of self-deception is that that person ought to have done something that they did not.
So self deception is when one doesn't do what another thinks they ought?
:worry:
It's called "denial", refusing to consider the evidence.
What is the difference between being mistaken and self-deception?
Is that so though?
I mean surely self-deception is something in and of itself, right? In that case there should be something or some things that make it qualify. Ahem... a criterion.
Anyone here have one?
If I am mistaken about the fact that I am a good philosopher, and someone points out that my thinking is sloppy and my ideas confused, then on seeing the evidence I will correct the mistake. "I thought I was quite good at this, but I see I was wrong. No worries."
If OTOH, I am deceiving myself that I am a good philosopher, and the same thing happens, I will resist, my feelings will be hurt, I will get angry and dismissive, I will attack the evidence, make excuses, and so on. Folks will commonly die to maintain a false image of themselves.
I gave this criterion a long way back- "commitment".
There is something wrong in being self-deceptive, one is doing something one should not be doing. Note that there is a difference between one person being deceitful to another and one person simply deceiving another (magicians deceive people, but when they do so, they are not being deceitful). What in general that is added to deceptive behaviour in order to make it deceitful is that some social norms of acceptable behaviour are being violated. Self-deception retains from deceitfulness that aspect of its being wrong, and since that is based on social norms it would follow that when one is deceiving oneself it involves going against what others believe one ought to be doing/have done. Solitary self-deception probably makes as little sense as solitary rule following.
As for a criterion, let me have a stab at one (there may be others): refusal to engage in a rational process that one is aware exists, that one can engage fully in and where that refusal is motivated by the fact that it may undermine a cherished belief (might need to add that an alternative rational process is engaged in which provides - perhaps superficial - support for the cherished belief).
And I do for the moment at least stand by the idea that the truth or falsity of the cherished belief need not be relevant as to whether a person is engaging in self-deception. Suppose John is accused of murdering Janet. John's mother believes that John did not murder Janet. The evidence is stacked up against John - video surveillance, fingerprints, motive, means, opportunity etc. Rather than examine the evidence for what it is, John's mother insists that she knows her Johnny, he's a gentle boy that she brought up and would not harm a soul, and so he did not murder that man-eating Janet. Now, suppose that John really did want to murder Janet, and on the day in question went with malice aforethought to her apartment to kill her. Arriving there, Janet is already butchered, so John flees the scene after accidentally leaving some top quality fingerprints, and is caught on camera entering and exiting the building. John's mother has a true belief that her son did not murder Janet, but she's still engaging in self-deception - at least arguably. Of course, unpacking the example might expose it as not showing what I think it shows, but examples have to start somewhere.
Commitment isn't enough though Un. Necessary? Surely. Sufficient or adequate? Not even close. I think we agree.
We may get somewhere helpful thinking along these lines. I'm curious enough to see if I have this right(if I'm understanding your claims) and if the path will end up being a helpful one...
So, deceiving oneself is always being mistaken, but not the other way around. The difference between being mistaken and deceiving oneself is that one who is deceiving oneself takes being told that they're mistaken personally, so much so that they are incapable of correcting the mistake. This overly general parsing is good enough for now, I think.
On my view...
It is humanly impossible to knowingly believe a falsehood. We all have, or have had, false belief somewhere along the line. Belief systems (world-views) are self-contained. So, we cannot see our own mistakes. Correcting mistakes requires first seeing them. Seeing them requires an other. Thus, in order to even be able to correct our own mistakes, we must be capable of admitting our own fallibility (that we could be holding false belief), and we must recognize that an other is necessary. In addition, we have to place more confidence (trust) in an other than we do in our own thought and belief (the ones in question). One who cannot do this would meet your criterion for self-deception.
Agree?
So not all deception is deceitful and...
Quoting jkg20
...what makes deception deceitful is when it breaks the rules of acceptable/unacceptable behaviour.
Install a real life scenario...
Politicians, in the States at least, are expected to lie to the people about their motives for holding elected office. That is to say, that is a social norm. Lots of Americans hold the view that all politicians lie.
Using the standard you've put forth, politicians that deliberately misrepresent their own thought and belief as a means to convince voters are not being deceitful.
Bullshit.
The fact that some deception is socially acceptable does not change the fact that it is deception. All deception is deceitful. That is precisely what makes it deception.
Rational process can involve putting certain kinds of logic to use. Paraconsistent logic qualifies. Paraconsistent logic holds that a statement can be both true and false at the same time and in the same sense. This logic has the ability to render any statement either true or false.
Do you see the problem?
Quoting creativesoul
Right, I think I get what you are saying here. If one says out loud, "I believe X and X is false." there is an obvious contradiction. But folks can get very close: consider the cliche "I'm not a racist but ..." where what follows the 'but' is some obviously racist belief. One can believe things that are contradictory, just as long as one does not notice the contradiction.
But thereafter, I stop agreeing. I might believe I can lift up that rock, and all it takes to change my belief is trying and failing. I don't need anyone else.
I think your 'knowingly believe' is doing too much work. That is to say, I do not know everything I believe, certainly not until I start looking. Consider prejudice. I repudiate prejudicial beliefs, and yet I find on reflection that I act on them. And when a man crosses the void on the bridge, that is stronger evidence that he believes it will support him, than any amount of confession.
Point taken. Some false belief can be recognized by the believer without an other. So...
Self-deception is not being able to correct one's mistaken belief.
Perhaps. It would require having considered whether or not the bridge would support him at some time or other though, wouldn't it? A lizard crosses the bridge, but that crossing is not strong evidence that it believes that the bridge will support it.
How does this tie into lying to oneself or self-deception?
I don't think belief requires much consideration. Hear the bell, start salivating. I'm not the lizard whisperer, but the folks that I know of, cats, for example, are sometimes unsure whether something will support them or not, and sometimes surprised when it does not. [insert cute video here]
But humans. Humans are largely opaque to themselves. I find I can not know someone's name, even though I know that I know it. It's on the tip of my tongue... And certainly I believe and act upon all sorts of stuff that I never consider, and that the ground will support me, whether it is a bridge or a cutting, is a trivial example. That the nearest shop is right, left, and on the opposite corner at the crossroads... I have never really thought about it 'til now.
This is how I go on when I'm not philosophising, and since I can not know things I know I know, and know things without knowing and believe things I've never considered whether or not to believe them, it becomes really rather easy to deceive myself if I have reason to want to. And one reason I might want to deceive myself that all this is not the case is that I like to consider myself a philosopher, who is much more insightful.
Rudimentary belief requires no consideration. Not all belief is rudimentary.
Forgetting and remembering. We either know something or we don't. I find that regularly employing self-contradictory language perpetuates itself.
I can't make much sense of this, which is unsurprising given the inherent self-contradiction.
Self deception comes in many forms. If you do what you know that you ought not do, and you have success, so that you later think that perhaps it's ok to do what you did, and now proceed to do this regularly, progressing to the point of having forgotten that you ought not do this, then you have deceived yourself. You have deceived yourself into thinking that it's OK to do what you knew that you ought not do.
So I use a snow blower. I know that when the chute gets plugged with wet snow I ought to shut off the machine before sticking my hand in there, to be sure to avoid injury. However, I realize that if I check the machine to make sure that it's not turning before I stick my hand in there, it's not a problem; I can do this without shutting off the machine and there's no injury. So I deceive myself into believing that I need not shut the machine off before sticking my hand in there, to avoid injury. You might think that this is not deception, that there really is no need to shut the machine off. But one time I mistakenly determined that the machine was not turning, when it really was, and there was injury. So I realized that I had deceived myself into believing that I didn't need to shut off the machine to avoid the possibility of injury.
But surely, self-contradiction is impossible?
Even those who hold the view that all politicians lie probably do not find it acceptable that they should do so, so the deceiving politician can still be deceitful on my account - your counterexample seems misguided.
Not really, but then perhaps there isn't a problem to see. First, paraconsistent logics do not render any substantive non-logical statement either true or false; they are just formalisms of different types of logical consequence. Furthermore, so-called true contradictions, even if acceptable within a paraconsistent logic, are limited to a special range of propositions (involving vagueness and the use of the truth predicate, for instance). So the relevance of the existence of paraconsistent logics is unclear to me in the context of self-deception - particularly since on my account the truth or falsity of the belief concerned need not be relevant (as in the example I gave). Of course, if someone who appears to be self-deceiving were suddenly to start justifying their belief by quoting theorems from paraconsistent logics, then perhaps one would have to revisit the claim that they were deceiving themselves, but it would depend what belief they were trying to justify. You would need to give me a fleshed-out example for me to see the real problem you are getting at.
This might be right, but care needs to be taken to understand where the mistake lies. Deceiving yourself that some proposition P is true (or false) does not require that the mistake be about whether P is true (or false). In the example I gave, John's mother believes that John did not murder Janet, and she is not mistaken about that because John really did not murder Janet, yet she is deceiving herself. If there is a role for mistake in that example, it is her mistake of not taking the evidence stacked up against John seriously.
Not at all. I've never claimed otherwise. Self-contradiction is not self-deception.
Quoting jkg20
What makes it go from being mistaken to self-deception? For that matter, what makes it either?
Your opinion?
I mean, she was right after-all.
When we talk about deception, particularly when we talk about someone deceiving an other, there are elements which make it what it is. Those elements are non-existent in all the sensible, reasonable, and/or coherent examples of self-deception put forth. What is the ground for changing this criterion? It's been shown to lead to self-contradiction, incoherency, equivocation, or just plain nonsensical talk.
We can all imagine a stick insect, or a moth, or any other type of camouflaged critter. Our choices are saying that it is deceiving its predators (and changing the criterion or equivocating) or simply refraining from saying that it is deceiving its predators (it has no intent after all), and beginning to talk about it in better ways.
Squirrels...
Either they are intentionally deceiving others, by pretending to hide nuts when others are watching and they know it, or they do this by pure accident and the behaviour itself increases the likelihood of their survival, so it has been a trait/behaviour that has proven beneficial, and hence has transcended the individual squirrels.
Well, it certainly is my opinion that John's mother is deceiving herself, and the mistake she is making is not taking seriously evidence that ought to be taken seriously. You seem not to share that opinion, so perhaps - as in many cases of philosophical argument - we've arrived at a clash of intuitions. Of course, you might try to push the line that John's mother is not deceiving herself concerning the belief that John murdered Janet, but rather the belief that John could have murdered Janet, and then start giving some counterfactual analysis of self-deception in terms of possible worlds - for all I know David Lewis has already done this. So in the end, perhaps the truth or falsity of the belief involved in self-deception might be argued to be an important concern.
However, even if my intuition about truth or falsity being a side issue in self-deception is misguided, I still insist that self-deception is not correctly modelled along the lines of one person deceiving another (although it would not be too hard to think of an example of one person deceiving another into believing a truth). John's mother is doing something wrong; she is making a mistake - ignoring evidence - that she, as a rational person, ought not to have made. Self-deception, in this sense, is as much (if not more) a moral issue as it is a metaphysical one.
So what does it take for her to deceive herself?
You saying so, or something that has nothing at all to do with you? I say, if the notion of self-deception is to be worth something, it has to be the latter.
She was right.
I think your example highlights the reasons why we ought to consider carefully the sorts of evidence that are presented. You claim she didn't take it seriously enough, but it seems to me that you would've wrongfully convicted someone, and yet you say that she was deceiving herself when she would've gotten it right.
:worry:
Doesn't what you've done here fit your own criterion of self-deception? It seems to me that it does, although on my view you're just simply mistaken (assuming sincerity).
@jkg20 is arguing the same as I did that self-deception is a violation of our norms of rationality, often related to the treatment of evidence, sometimes related to inference or other elements of reasoning.
What you're still missing is that reasoning is a process without a pre-selected outcome. You can choose to futz with the process in various ways, and this is quite intentional, but it needn't ever lead to direct confrontation with the outcome -- that can be endlessly pushed aside and never arrived at. So there's no issue of at once assenting to and not assenting to some proposition. You just make sure that proposition never makes it to the floor for a vote. You do this deliberately. You find ways to rationalize doing it, reasons that have nothing to do with your real motivation, reasons that allow you to give what you're doing the color of rationality. No doubt when confronted you will be able to defend yourself at length and explain how every step you took or didn't take was thoroughly justified.
The natural competitor for describing such a process is simply "being mistaken". But this sort of thing doesn't look much like being mistaken to me.
Hmmm...
That's an interesting take. Of course I'm reminded of conventional American politics. I'm not sure that I'd call what Senator Mitch McConnell does and what John Boehner did (which you've just described perfectly) either being mistaken or self-deception.
I'll think about this a bit. Got more?
It's very hard to judge which politicians are lying to themselves and which are soul-less tools.
I'd rather not do more politics, but I wholeheartedly recommend this excellent piece of popular philosophy by the estimable John Scalzi: The Cinemax Theory of Racism. I think it's on point.
Read the article if you haven't before. I just reread it and it is as good as I remember.
Right? And it's a solid piece of philosophy, to my mind.
Yeah, pretty good.
As I've described self-deception, rationalizing is one step in the process. It is the cover-up. This is when one knows that what one is doing or saying is wrong, but produces a rationalization to make it appear as if it were right. Creativesoul insists that the only people who could be tricked or fooled by the cover-up are people other than the one producing the rationalization. But this is not the case, because all that is required is that the person who produces the rationalization becomes so focused on remembering and describing the cover-up, and so enamoured of it, that they forget the thing being covered up.
What creativesoul doesn't seem to allow for is the fact that self-deception is a process which takes time. So creativesoul presents a logical argument in which a person cannot both believe and disbelieve the same thing at the same time, and dismisses self-deception as impossible. But this does not properly represent the temporal process of self-deception, in which one belief replaces another over a period of time. The rationalization, or reason for replacement, is known to be unjustified when introduced, but through repetition and attention to the details of the rationalization, the fact that it is unjustified ends up forgotten.
The point to notice is that memory requires mental effort; it is not automatic. Therefore, as we say, memory is selective. So when we do not choose to remember certain things, they are forgotten, and this is what makes self-deception possible.
If by "successfully" you mean "genuinely", then I don't think it's possible to lie to oneself. You either believe A about X in situation Z or you don't. You can try to persuade or convince yourself to believe B about Y in situation Z even though you still genuinely believe A about X in situation Z. You can try to avoid thinking about or acknowledging a belief, or you can deny it, but that's not the same as lying.
Lying involves holding a view about X as true but presenting it as false.
i just lied to myself.
easy.
do i believe it?
no.
half the people i lie to don't believe me either.
but i'm still lying to them - intentionally.
so are you really asking how to lie to yourself and believe it?
i don't think you can - intentionally.
i think when people say "you are lying to yourself" the case is actually that you are mistaken in your perception of the subject and genuinely believe the wrong concept.
they can see the truth so clearly that it stands to their reason that you should see it equally as clear, but since you cannot it appears that you are "lying to yourself."
a lie is simply something that is false, which means it's not correct.
if you form a believable opinion based on inaccurate information, you are basically "lying to yourself" by believing false data.
So, there's two basic notions to consider with self-deception. The first is what is the or a 'self'. The second is what counts as deception.
There is no self without others. What one comes to think about oneself comes largely via language acquisition. We learn how to talk, and thus think, about the world and/or ourselves through language use. We adopt our first world-view in this way. However, it can be the case that one's innate personality/character differs in remarkable ways from what s/he has learned is acceptable within one's immediate familial and/or social surroundings.
For example, one may be attracted to the same sex while being raised in a social setting where such a thing is condemned. All humans are interdependent social creatures by our very nature, and the need to feel accepted and/or fit into a larger peer group is evidently a strong one. If one's social group does not accept homosexuality, then one who is attracted to members of the same sex has to suppress any and all behaviours that may cause others to view them in a negative way.
Here, if the self is determined by others' moral values/beliefs about how things ought to be, then one who would otherwise be prone to engage in homosexual behaviours and thoughts must alter the way they act and talk in order to conform and be accepted.
But is that who they are or who others think that they should be?
If one within such a situation were to intentionally suppress any and all homosexual thought as a means of self-discipline with the goal of fitting in, then their notion of self would be in conflict with who they would otherwise have been.
Quoting creativesoul
Belief, sure. I'm not so certain about meaning or truth, though.
Quoting creativesoul
Quoting creativesoul
Quoting creativesoul
I'd say that "ruling it out a priori" means that you are ruling out the possibility that our minds are divided by means of some conceptual analysis of the concept of lying, or by declaring it to be impossible. Maybe you're not, but I'm not sure why it is impossible to deliberately misrepresent one's own belief to oneself.
I am fine with your notion of lying. So lying, rather than merely being mistaken, is when you deliberately misrepresent your own belief to yourself. Merely being mistaken is holding a false belief. Since falsity isn't in the notion of lying, the two don't even have to relate. We may deliberately misrepresent some true or false belief to ourselves, just depending upon what we believe. By removing truth, in fact, there is a lot more wiggle room here -- the beliefs need not even have a factual component (EDIT: or even be truth-apt). They merely need to be misrepresented to ourselves.
And such a thing would be possible -- conceptually speaking, here -- if the mind were in some sense divided. So let's just stick with @unenlightened's notion of commitment. I am committed to some belief. I come to believe something that is in conflict with this other belief. Here I can be honest with myself, realize that these two beliefs are not compatible, and try and think through that conflict and resolve it in some way. Or I can be dishonest with myself, act out of fear, and tell myself that the beliefs are not in conflict. However I might accomplish this -- it seems that this dishonesty is really what lying to yourself is all about. You aren't coming to terms with a conflict in beliefs, but rather accepting both beliefs in spite of having the niggling realization that they are in conflict. So you misrepresent your beliefs -- or meta-beliefs? -- by saying they can get along fine. Your commitment and your new belief that said commitment is somehow erroneous (not necessarily false) and your belief that they are not in conflict are all somehow simultaneously preserved. It seems a mental feat which would result in conflict of the self, and indeed I'd say that this is the case -- which really only makes sense if different parts of the self can actually be in conflict, which is easily understood if the mind is divided.
Quoting Uniquorn
Bingo. Well, not exactly how I, personally, might do so -- I'm not after a step-by-step guide to lying to myself. But rather what would necessarily be true if it were possible to lie to yourself. So I'm not really assuming that it is possible to lie to yourself, even. I'm more interested in a conceptual analysis of lying to yourself -- exploring what is necessary under the assumption that it is true.
The benefit being that by so doing it might lead to a way of determining whether it is or is not true, but without simply assuming one way or the other.
It's been explained and argued for several times over. I'm sure if you really want to know why I hold that view, you'll go back and see for yourself.
One cannot be tricked into believing something if they know both how they're being tricked, and that they're being tricked.
One who is performing the trickery knows both how and that they're doing it.
One cannot know how and that one is tricking him/herself and not know how and that one is tricking oneself (how and that it's being done).
The same applies to deliberately misrepresenting one's own thought and belief to oneself. It's just plain common sense. It's not at all difficult to grasp.
I've already argued for why commitment alone is inadequate.
Quoting Moliere
This is incorrect. Truth/falsity is in the notion of lying. It's just that the lie (what's being said, as opposed/compared to what's believed) can be either. On the face of it, it's more than obvious that being mistaken and lying are related. They both require belief.
That's what I'm rejecting, and have argued for without subsequent refutation. On my view, just saying it isn't enough. Can you argue for this?
Removing truth from the notion of thought and belief? Cannot be done. All thought and belief presupposes its own truth somewhere along the line. The notion of being "truth-apt" is misleading at best. All thought and belief can be either. It's only their being inadequately represented in speech that makes it seem like some are not.
If you know that they are in conflict, then you cannot believe that they are not.
The mind is divided. However, it is still one mind. It is divided in terms of having/holding conflicting beliefs. Your example is one of cognitive dissonance being ignored. Very common practice hereabouts and everywhere I've ever been.
Lying is dishonesty/insincerity. Being insincere is precisely what one is doing when being dishonest (when lying). Both point towards what's going on when one is deliberately misrepresenting one's own thought and belief. So, there's no difference on my view between 'being dishonest with oneself' and 'lying to oneself'. Both are poor uses of language stemming from misconceptions. They hamper understanding.
Quoting creativesoul
Quoting creativesoul
So if we can have or hold conflicting beliefs -- ignore cognitive dissonance, as you put it -- then we can both know that two beliefs are in conflict, and believe they are not in conflict. Because both of those beliefs, too, are in conflict, yet we can hold conflicting beliefs, so.... what's the problem?
It goes against common sense. But here it seems you're admitting that common sense is wrong?
Quoting creativesoul
I feel that's irritating.
"I feel that's irritating" is true. But is the feeling of irritation true? No. But it is a part of the mind. So if the entire mind is belief, then surely there are non-cognitive beliefs.
Ignoring the self-contradiction is not believing there is none. It's neglecting to address it, or believing it's unimportant.
All thought and belief consist of correlations drawn between 'objects' of physiological sensory perception and/or oneself. The 'feeling' of irritation is no different. The statement sets out the connection to oneself(the irritation) and the 'object'(that).
We can believe contradictory things. We cannot acknowledge that they are and believe that they are not in the same breath.
Why call this lying to oneself if it shares nothing with lying to another? If you're going with my notion of what counts as lying, it just doesn't work...
There seems to be a certain sleight of leg going on - one might say that I am deceived by my nervous system into thinking I have a hurty leg, when I do not have a hurty leg. But then I have separated me, my body and my nervous system, and allowed one to deceive another, albeit unintentionally.
Or consider an anorexic, who considers herself grossly overweight as she is dying of starvation.
Or the poor philosopher, who reasons thus:
Men like football.
I am a man.
Therefore I like football.
Which is an example of conforming to an image. Is it impossible to convince oneself that one likes football, or gurls, or shaving, or fighting, because one more desperately wants to conform than to be 'true to oneself'? Surely, it happens all the time?
now ... can you lie to YOU .... no .... deep down if the mind quiets there is never a single doubt as to action needed , so to pretend we do not know something that is internal is a lie in itself .....
the answer to your question depends which you we are speaking about , the self image/ego .... or the consciousness capable of observing thought without judgment .... one yes , the other one no
Surely. I'm confident that neither 'self-deception' nor 'lying to oneself' is the best way to describe these situations though. I mentioned one earlier... the homosexual. We all adopt our initial worldview, and that includes much, if not most, of our own original 'self' image.
Here we see the inherent problem with the notion of self. I'm certain we all agree here. The self is largely delineated by others.
This presupposes exactly what needs to be argued for. Care to address the arguments I've given for how and why one cannot deceive oneself and one cannot lie to oneself? I think I've directly addressed all you've said here. I certainly intended to.
The notion of 'self'-deception is nonsense. I've already adequately argued for that without subsequent valid criticism.
Your argument appears to be:
1: Self-deception only makes sense if it makes sense for one person A to act deceitfully towards a person B, where A and B are the same person.
2: It does not make sense for one person A to act deceitfully towards person B, where A and B are the same person.
3: Therefore, modus tollens, self-deception does not make sense.
To support (2), the model of A being deceitful towards B regarding some proposition P, goes along the following lines
a) A knowingly believes that not-P.
b) B's own interests are best served by knowingly believing that not-P.
c) A knowingly believes that his (A's) intentions/wishes are best served by having B knowingly believe that P.
d) A acts intentionally in order that B should knowingly believe that P.
So we end up, if the deceitful behaviour is successful, with A knowingly believing that not-P and B knowingly believing that P at one and the same time. This is not possible where A and B are one and the same person.
(Note that we are talking about A being deceitful to B, not A merely deceiving B.)
That seems right, and so (2) is a true premise.
But this leaves conditional (1) undefended and @Srap Tasmaner and I have been suggesting that self-deception should not be modelled on one person being deceitful to another. That part of the argument you have not yet established. As far as I recall, it came down to intuitions about what you/I/Srap would or would not call cases of self-deception.
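The modus tollens structure reconstructed above can be put formally. Here is a minimal sketch in Lean; the proposition names `SD` and `AB` are my own illustrative labels for the premises, not terms from the thread:

```lean
-- SD: "self-deception makes sense"
-- AB: "one person A being deceitful to person B, where A and B are
--      the same person, makes sense"
-- The names are hypothetical labels chosen for this sketch.
example (SD AB : Prop)
    (h1 : SD → AB)   -- premise 1: SD only makes sense if AB does
    (h2 : ¬AB)       -- premise 2: AB does not make sense
    : ¬SD :=         -- conclusion, by modus tollens
  fun hsd => h2 (h1 hsd)
```

The formal step is uncontroversial; as noted above, the live dispute is over premise 1 (whether self-deception must be modelled on interpersonal deceit at all), which the formalization takes as an assumption.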
Have you? It seems to me that you've just declared it nonsense. (EDIT: I should note here I believe you're sincere, I'm just telling you my impression is all) You do so on the basis of saying that we can classify any intentional act of misrepresenting belief to oneself as something other than lying, because we haven't given the necessary and sufficient conditions that are up to your standard.
But is that an argument? We have given criteria that distinguish simple error from self-deception. You've just said "OK, sure that's necessary, but not good enough" -- but then we do in fact have a means for distinguishing the two, and so what exactly is nonsensical here?
I don't think it works quite like that, in most cases. One needs a bit of psychology here. There is 'what I am', and there is 'what I think I am' (my self-image), and the latter is an aspect of the former. But inevitably, I think that what I am is what I think I am. So self-preservation becomes a matter of preserving the image.
Suppose I look at myself from a position of ignorance. It comes naturally, from this realisation that I am not who I think I am. Then I see there is the self-image I have, but I give it less importance, because it is incomplete at best. So I am ready to discover myself anew. Perhaps, after all I am not the wise philosopher I think I am; perhaps I am not the nice balanced social being I think I am. I will find out as I go - I will learn about myself in my relationship to the world, but it will always be learning, never knowing. This is too frightening for me as long as I still think I am what I think I am, and it seems that to change my image is to die.
Quoting creativesoul
Well I'm not committed to a particular way of describing things, but when people use these terms, I think I know what is meant.
"You are naughty! Be a good boy for Mummy!"
On the one hand one discovers oneself in relationship, one is learning, and on the other, one is told what one is and what one must be. One must be good because one is naughty. And Santa will know which you are. Most people are naughty and being good, taught to live a lie in negation of the lie they have been taught.
That's fair. I can get along with that -- I haven't really been thinking in terms of plausibility, psychology, or facts as much as just getting a basic and easy to recognize theory of lying-to-oneself across to @creativesoul
So for yourself 'lying to yourself' is much more subtle, really. It's almost like an approach to the world and the self -- whereas in one case we must be something we are not, or we believe we are this exact thing and it's a hill to die on, and in the other case we recognize that we are not this set of beliefs about ourself and are open to learning more -- it is exciting to change the image in the face of new information, rather than a death-threat.
Yes, exactly, learning without accumulation, one could say.
[quote=J.Krishnamurti] Learning through experience is one thing – it is the accumulation of conditioning – and learning all the time, not only about objective things but also about oneself, is something quite different. There is the accumulation which brings about conditioning – this we know – and there is the learning which we speak about. This learning is observation – to observe without accumulation, to observe in freedom. This observation is not directed from the past. Let us keep those two things clear.[/quote] https://www.jkrishnamurti.org/content/‘learning’/learning%20without%20accumulation
So my image is from the past, and the image does not observe; I observe in the present, unidentified with the image.
Here we go ...the divided self. Can you describe or explain how the divided self, divided in this way, between what I want to do, and what I ought to do (mummy tells me so), relates to your description of that other division between "what I am" and "what I think I am"?
I would assume that what I am relates to what I want to do, and what I think I am relates to what I ought to do. But you say that what I think I am is an aspect of what I am, so how would what I ought to do become an aspect of what I am when it's only related to what I think I am, and separate from it? See, I'm divided because I perceive what I ought to do as an aspect of others, "mummy told me not to do this", and not really an aspect of myself at all. Maybe it's a third person perspective. But then it's impossible that what I ought to do can be an aspect of what I think I am, unless others are somehow controlling my thoughts. How would "what I ought to do" become a real aspect of "what I think I am"? It would seem like it could only be an aspect of "what I am".
Would you say that "what I am" is itself a deception, that there is only what I think I am, and what others think I am? There really is no self, only an image. If not, then what supports the assumption that there is such a thing as what I am? Is it necessary to assume a "what I am" in order to produce a divided self, to expose the possibility of self-deception, which is really an attribute of the one undivided "what I think I am"?
I think awareness through time is doing most of the work in making the concept of lying to oneself coherent, for me. Just as I can flip my awareness in a moment from the thoughts I am having to my fingers, to my memories, to my feelings, it seems to me that a flip in awareness could happen between two halves of myself. So where I do agree with you is that the part of myself that is lying could not misrepresent their own thoughts and trick themselves -- there is a need for some kind of division for trickery to be successful on this model, because you have to be aware of the trick if you're setting out to trick someone. Just as a three-card monte player knows how to replace a card without someone observing, they couldn't do so to themselves.
So I'm tracking with you on that. For me the flip in awareness is what's important -- so at one point we are aware of the trick, and at the other point we are not. For something like three card monte, where we have concrete points of reference in our literal hands this would be pretty extreme, though maybe possible. But for something a bit more abstract, like knowledge of myself, it doesn't seem so extreme to me because we aren't perfectly transparent to ourselves.
Since we aren't perfectly transparent to ourselves it actually becomes rather easy to lie to ourselves because the trick lies in what is actually a very plausible belief: "I am not transparent to myself" -- so if I come across something that I'd term inconvenient for myself, all I need do is remind myself that I am not transparent to myself and suddenly what was inconvenient becomes questionable.
That's why it makes sense for me, at least. Where in this line of reasoning does something strike you as unacceptable? My guess is you'd just say this is not lying. But if I believe both P and ~P -- because I did, after all, come across something inconvenient -- then that seems to fit perfectly with the notion of lying to, or tricking, myself. In fact it seems that in order for me to intentionally trick myself I would have to believe both, since to be intentional about the lie I'd have to believe P and want myself to believe ~P, then convince myself of ~P -- without changing the original belief.
Whereas to be mistaken would just be to believe something that is false, or to believe something that is true but for bad reasons.
Well I'm only talking, and only even talking about thought. Physically, there is no division, obviously. So there is a division in thought, and a mental conflict. Or perhaps there isn't in your case, it's for you to say.
'What I am' is writing a response - this one - (dasein?). This is real enough, I don't have to assume anything. And then you want a response that explains and justifies the writer in some way, and that is the image I am conveying in the writing that is not the writer, but an image of him. I don't see why you want to problematise this?
What I was asking is how do you relate this to the division between what I want to do, and what I ought to do. It's all thought, as you say, but suppose I am writing this response because I want to, but I am thinking that I ought not to be, because I have other responsibilities which I should be taking care of right now, instead of wasting my time doing this.
So in your response, you indicate that "what I am" is writing this response, despite the fact that I ought not to be writing this response. How do I get to the point where I can produce consistency between what I want to do, and what I ought to do, such that what I am is the same as what I ought to be, because I would be doing what I ought to be doing? Otherwise I see no reason to do what I ought to do, because it's simply not me.
You'll have to ask Jesus about that one, dude. All I can point out is that 'what I want' and 'what I ought' are images that often conflict, and 'what I am' is what happens as a matter of fact.
Got his phone number?
That's close, but not quite the argument...
Lying to another is deliberately misrepresenting one's own thought and belief to another. Lying to oneself would be deliberately misrepresenting one's own thought and belief to oneself. One always knows if s/he believes something, doesn't believe something, or is uncertain whether or not s/he believes something. Thus, one cannot deliberately misrepresent one's own thought and belief to oneself.
Strictly speaking, I shouldn't say 'nonsense' if I mean incoherent, unintelligible, or self-contradictory. Whether or not something is sensible(in this context) isn't determined by being coherent. It's determined by common use. The notion of self-deception and lying to oneself is often used in common talk.
It seems that this hinges upon the notion of being transparent to ourselves.
On my view, one always knows whether or not they believe what they say. That is, one always knows whether they believe it, do not believe it, or are uncertain.
Seems to me that you've offered an account of one developing contradictory beliefs.
So, it's basically the idea that the same person acts differently in different situations. This revolves around ethics. That is to say that how one acts in some situation or other, assuming the person has some notion of appropriate behaviour, ought be situation specific. We decide what counts as acceptable/unacceptable behaviour in different situations, and we hold others to that standard. I mean, we all know that some of what we do in our own homes, we would not do in public. In school, there are codes of conduct. At work, at the movies, in the theatre, in a restaurant, etc; all these places have slightly different codes of acceptable/unacceptable behaviour. So, we have different criteria and thus differing expectations for behaviour depending upon the situation we're in...
That's the groundwork. I want to tie this into the conversation here...
If it is the case that someone strongly believes that they must consciously alter their behavior as a means to conform to what they believe is expected, this could set up a situation that fulfills - I think - some of the notions expressed in this thread regarding a case of lying to oneself, and/or deceiving oneself.
When one acts in ways that satisfy what one thinks that others expect, and they do this intentionally and deliberately, then they think that that's how they ought to act. If those ways include saying things that they do not believe, but rather say them because they think it's the best thing to say at the time, given the situation they're in, then we have everything needed for one to lose sight of what they actually believe...
Now, I am not saying that this must be the case or always is the case when one 'wears different hats', but rather that it has what it takes for one to lose sight of oneself, over a significant enough period of time.
A lie needs a truth or unanswered question. If we consider the former then one can't possibly lie to oneself. If the latter then ignorance can always be replaced with a lie at random.
In fact I think the whole process of human thinking, the so-called hypothetico-deductive method, consists of entertaining possible explanations for a phenomenon and eliminating the lies to get to the truth. Looks like we do lie to ourselves but not for long IF we're rational.
The latter is what I was considering. I'm not as interested, here, in determining how we might know others -- only the conditions under which one might lie to themselves. It could be that said conditions are not easily determinable due to practical considerations.