You are viewing the historical archive of The Philosophy Forum.

Artificial intelligence, humans and self-awareness

TheMadFool June 12, 2018 at 09:37 13525 views 91 comments
What distinguishes a human from a computer?

Not logical thinking. Computers are logic machines.

Not creativity. Creativity is randomness and can be replicated.

The only thing left is self-awareness - the observer realizes her own existence and role as an observer of both the self and the world outside.

But...we're not completely self-aware. We don't know what's happening in our brains or liver or our skin and so on. It's like consciousness or self-awareness is limited to the organism as a whole but not its parts. Therefore, we are NOT fully self-aware.

Now imagine a being x who is completely self-aware in every respect from the atomic realm to the macroscopic world we're familiar with. Such a being is what I call truly self-aware.

What about computers? They seem to be able to do logic flawlessly. However, they don't display any evidence of self-awareness as we humans do. Artificial intelligence attempts to replicate human-level self-awareness. I don't know if that's even possible, but what I want to present is a comparative analysis of

1. computers
2. humans
3. being x I described above.

Let's put these on a line that represents the spectrum of self-awareness from completely oblivious (like a stone) to completely aware (like being x).

Would we be closer to the computer or being x?

Comments (91)

tom June 12, 2018 at 09:54 #187202
Quoting TheMadFool
1. computers
2. humans
3. being x I described above.

Let's put these on a line that represents the spectrum of self-awareness from completely oblivious (like a stone) to completely aware (like being x).

Would we be closer to the computer or being x?


What makes you think a computer could ever be aware? Only software can do that.
TheMadFool June 12, 2018 at 10:16 #187211
Quoting tom
What makes you think a computer could ever be aware? Only software can do that.


Software needs hardware, right? Anyway, I'm questioning the basic premise that humans are self-aware. I think that's not true, at least not to the extent of the being x I described in my OP.
tom June 12, 2018 at 10:22 #187214
Quoting TheMadFool
Software needs hardware, right? Anyway, I'm questioning the basic premise that humans are self-aware. I think that's not true, at least not to the extent of the being x I described in my OP.


Software and hardware are not the same thing. Your brain is hardware, your mind is software.
gloaming June 12, 2018 at 16:09 #187273
"... Creativity is randomness and can be replicated..."

??? On the face of it, this is self-contradictory. Random is patternless, meaning it can never be precisely the same twice as an intended end.
BC June 12, 2018 at 17:27 #187283
Quoting TheMadFool
What distinguishes a human from a computer?


Biology -- and the HUGE everything that biology implies.

Quoting TheMadFool
Computers are logic machines


Computers are contraptions that carry out logical operations designed by humans. On their own they are just a pile of metal and plastic.

So we aren't fully aware. So what?
aporiap June 12, 2018 at 18:06 #187287
Reply to TheMadFool
What about self-monitoring programs? Ones that can modify behavior or output when certain conditions are met -- e.g. a robot that self-corrects its walking trajectory when it is not going in the proper direction? I think that involves some amount of self-awareness.
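A minimal sketch of the kind of self-correcting loop described here: a controller that monitors its own heading and corrects deviations from a target. All names are hypothetical, chosen for illustration; the robot analogy is reduced to plain feedback control.

```python
# A controller that compares its own state against a target and corrects
# part of the deviation on each step -- the "self-monitoring" at issue.

class HeadingController:
    def __init__(self, target_heading, gain=0.5):
        self.target = target_heading   # desired direction, in degrees
        self.heading = 0.0             # current direction
        self.gain = gain               # fraction of the error corrected per step

    def step(self):
        """Measure the deviation from the target, then correct part of it."""
        error = self.target - self.heading
        self.heading += self.gain * error
        return self.heading

controller = HeadingController(target_heading=90.0)
for _ in range(20):
    controller.step()
# After repeated corrections the heading converges on the target.
```

Whether the monitoring step counts as self-awareness is exactly the question under discussion; the loop itself is just feedback control.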
Arne June 13, 2018 at 08:28 #187429
The deeper issue is why we continually seek some criterion for establishing a qualitative and normative ontological priority of human being.
Arne June 13, 2018 at 09:30 #187434
And in distinguishing human from computer we grab onto things that are not uniquely human. For example, while it is true that I am flesh and blood and a computer is metal and plastic, the same can be said regarding my dog and a computer. Similarly, we also grasp onto and use terms referring to entities/ideas/processes as if the terms represented a complete understanding of that to which they refer. For example, sub-conscious is a term that refers to a process that we do not understand and may never understand. Yet how often do we think we have answered a question by reference to the "sub-conscious"? The same could be said of the Marxist term "false consciousness" when referring to behavior significantly inconsistent with the interest of one's economic class. There is simply no reason to believe that the term describes adequately, let alone correctly, every incident of behavior to which it is applied. Claims to the contrary are a matter of faith.

And are we not doing the same with the term "self aware"? That I am aware that I am aware of the redness of a car does not establish that such second order awareness is anything significant, unique to humans, understood in any meaningful way, or even useful. And most important of all within this context, most people treat the notion of self awareness as beyond the computer programming ability of beings who are in fact aware that they are aware. And if and when they do succeed in programming awareness of awareness, will we then distinguish the human from computer by talking about being aware of being aware that we are aware? If second order awareness such as self awareness is being aware of awareness, then wouldn't third order awareness simply be awareness of self awareness?
TogetherTurtle June 13, 2018 at 22:34 #187638
This has been one of my points of interest for a long time. Let me put my cards on the table.

Self awareness is hard to define. It is easiest to describe it as "what we have and the beasts do not" so yes, by definition, we have self awareness.

We could also break down the word. It is awareness of the self. I am aware that I exist, what about you? A dog is not aware that it exists. It is just the culmination of biological processes and chemical reactions. It can feel, but not ask why it feels. While we are made of the same material as them, we have self awareness because we are evolved enough to have such a thing.

What you described as "self awareness" is more the level of awareness a god would possess. I suppose we could label this "deity awareness".

Computers (software, at least) are definitely capable of that, eventually. We simply don't have the hardware to support them, the knowledge to create them, or the energy to maintain them. That can change soon.

I assume you have heard of the Turing test. If not: essentially, it is a test where you are put in a room with a monitor and a keyboard. In one window on the monitor you are talking to a human, and in another window you are speaking to a computer. If you can't tell the difference, the computer passes the test and has "self awareness". You may ask, how can this be? It has just learned to mimic human interactions incredibly well! And I may interject: how is that any different from how you or I interact with people? You had to learn how to interact with people from a young age. Self-awareness is not some tangible end goal; it is something that evolves over time until you finally comprehend your world.
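The imitation game described above can be sketched as a tiny harness. The players and judge here are trivial stand-ins, and every name is hypothetical; the point of the sketch is only the structure of the test, not a real interrogation strategy.

```python
import random

# A toy skeleton of the imitation game: a judge sees the transcripts of two
# unseen respondents and must guess which one is the machine.

def human_player(question):
    return "I'd have to think about that."

def machine_player(question):
    return "I'd have to think about that."   # a perfect mimic, by construction

def imitation_game(judge, questions):
    players = {"A": human_player, "B": machine_player}
    transcripts = {label: [(q, play(q)) for q in questions]
                   for label, play in players.items()}
    guess = judge(transcripts)   # the judge names the label it thinks is the machine
    return guess == "B"          # True if the machine was caught

# Facing identical transcripts, a judge can only guess at random.
def coin_flip_judge(transcripts):
    return random.choice(list(transcripts))

caught = imitation_game(coin_flip_judge, ["What is it like to see red?"])
```

Because the mimic is perfect by construction, no judge here can do better than chance, which is the point being made in the text: passing rests on seeming human, not on exhibiting some separate inner ingredient.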

One could say babies are not self-aware. They haven't developed object permanence; they can't speak or write. However, you consider the baby more human than the computer on your desk that can do all of the above?

To be human (or self aware) is not biological, in fact, you don't even have to be biological to be a human. You just have to be very good at seeming like a human. Consciousness is an illusion, but a very very good one. This is why most people would consider Superman a human, even though he is in fact extraterrestrial.
raza June 14, 2018 at 02:40 #187727
Reply to tom You seem to assume you are aware of a stone while that stone is not aware of you.

However you can only be aware of what you are aware of. You cannot be aware of what the stone is aware of or not aware of.

You even think, it appears, that you are aware of you.

However, you (or what arises as a thought of you) is merely what arises in what we define as "awareness".

The stone also arises within this same awareness as the thought "me". The "me" which comes into contact with the stone.

"me", however (as with "you") is merely an arising thought. A thought which arises in awareness.

TheMadFool June 14, 2018 at 07:42 #187788
Quoting tom
Software and hardware are not the same thing. Your brain is hardware, your mind is software.


All I'm saying is that consciousness or self-awareness isn't an attribute humans can rightly claim. We're NOT completely self-aware.

Quoting Bitter Crank
So we aren't fully aware. So what?


So, are we more like computers, or are we, in terms of awareness, very near to an entity that is completely self-aware?

I ask because if we're more like computers then it changes the whole idea of what it means to be human. We're more like machines than what would lie at the opposite end - total self-awareness.

Quoting gloaming
??? On the face of it, this is self-contradictory. Random is patternless, meaning it can never be precisely the same twice as an intended end.


Take a look at how we really think. Personally, I have many random thoughts going on in my mind. Not totally random, I agree. I guess these thoughts arise out of association, e.g. when I think of rain I think of an umbrella, etc. But then I get to choose which thoughts to dwell on, and that is random most of the time, unless I'm pursuing some goal. That's what I mean.

Quoting aporiap
What about self-monitoring programs? Ones that can modify behavior or output when certain conditions are met -- e.g. a robot that self-corrects its walking trajectory when it is not going in the proper direction? I think that involves some amount of self-awareness.


Yes. That can be correctly classified as some level of self-awareness. This leads me to believe that most of what we do - walking, talking, thinking - can be replicated in machines (much like worms or insects). The most difficult part is, I guess, imparting sentience to a machine. How does the brain do that? Of course, that's assuming it's better to have consciousness than not. This is still controversial in my opinion. Self-awareness isn't a necessity for life and I'm not sure if the converse is true or not.

Quoting Arne
And if and when they do succeed in programming awareness of awareness, will we then distinguish the human from computer by talking about being aware of being aware that we are aware? If second order awareness such as self awareness is being aware of awareness, then wouldn't third order awareness simply be awareness of self awareness?


This logic fails, I think. It doesn't make sense to say "I'm aware that I'm aware that I'm aware..." After a few iterations we can't grasp the meaning of such statements. Anyway, the base of any such ordered awareness begins at "I am self-aware". We can try to achieve that first for computers.

What I'm really interested in is showing that we're more like machines than we think. We can imagine an entity x who possesses complete self-awareness of its being, from the atomic to the macroscopic, and we humans don't possess that level of consciousness, do we?

Quoting TogetherTurtle
To be human (or self aware) is not biological, in fact, you don't even have to be biological to be a human. You just have to be very good at seeming like a human. Consciousness is an illusion, but a very very good one. This is why most people would consider Superman a human, even though he is in fact extraterrestrial.


Consciousness is an illusion you say but what is that which experiences this illusion?

Arne June 14, 2018 at 10:00 #187810
Quoting TogetherTurtle
Self awareness is hard to define. It is easiest to describe it as "what we have and the beasts do not" so yes, by definition, we have self awareness.


It rests upon the unstated presumption that you have something beasts do not. If all H's (Humans) have A's and only A's and all B's (Beasts) have A's and only A's, then the statement that SA = that which H's have but B's do not produces a null set. And even if you could establish some sort of qualitative and/or quantitative difference between the awareness Humans have and the awareness Beasts have, that difference would not necessarily be a difference in a degree of awareness regarding awareness, i.e., self-awareness. Couldn't such a difference simply be a difference in awareness of how or how much? For example, if all Humans were aware to some degree as the result of having a visual sense of entities while all Beasts were aware to some degree as the result of having a sonic sense of entities, then visually sensing entities would be an awareness the Humans have and the Beasts do not, and would therefore count, under your formulation, as self-awareness.

Seriously, I share your interest in the subject matter. But I maintain the deeper issue is why some seem so insistent upon reserving to or creating for human (and only human?) some sort of unique normative ontological priority. This apparent need to preserve, reserve, and/or create a significant normative specialness for human is quite fascinating.
tom June 14, 2018 at 10:57 #187824
Quoting Arne
Seriously, I share your interest in the subject matter. But I maintain the deeper issue is why some seem so insistent upon reserving to or creating for human (and only human?) some sort of unique normative ontological priority. This apparent need to preserve, reserve, and/or create a significant normative specialness for human is quite fascinating.


If animals possess qualia - i.e. they can create "what-it-is-like" knowledge, then what is to stop them from creating, as humans do, any kind of knowledge?

Humans are quite unique. We are the only known objects in the universe that create explanatory knowledge. In order to achieve this remarkable feat, there are strong arguments to the effect that we require at least two features: computationally universal hardware, and software that is not genetically determined. There is absolutely no evidence that animals possess either of these.

Animals have been on the planet a lot longer than humans, and none of them has developed a language, literature, culture or science. You may be fascinated that certain philosophers think this sets humans apart from other animals, but are you also fascinated that we don't ascribe morality to animals and put them on trial for their misdemeanors? Surely we have to prosecute them to escape the charge of "significant normative specialness"?
tom June 14, 2018 at 10:59 #187825
Quoting TheMadFool
All I'm saying is that consciousness or self-awareness isn't an attribute humans can rightly claim. We're NOT completely self-aware.


What does it mean to be "completely self aware" as opposed to just self aware?
tom June 14, 2018 at 11:00 #187826
Quoting raza
You seem to assume you are aware of a stone while that stone is not aware of you.


It's not an assumption, it is a consequence of known physics. Stones are not and cannot be self aware, or even aware.
raza June 14, 2018 at 11:33 #187830
Quoting tom
It's not an assumption, it is a consequence of known physics. Stones are not and cannot be self aware, or even aware.


You make a distinction between aware and self-aware.

At what point is one "aware" without some sense of self?

What I am getting at is: is there not just general moment-to-moment experience, which may as well be called "awareness"?

After all, we cannot be aware of anything other than our immediate experience - the very same experience within which our self is assumed to arise.

So the so-called "self" cannot really be other than every other apparent object which arises as the experience. Sometimes this experience may consist of what is defined as a stone.

During your experience of a stone the stone must also be you. You can never be outside of your experience.

So if you are "aware" then you as the experience is "aware", and the "experience" is everything within that.

You cannot exist outside of experience and you do not exist within experience.

There is just Experience.

Try having an experience where you are not. It is just not possible. So we should just be dealing with reality rather than what we think may be real.

Thinking there is a you that exists that then goes about having an experience (awareness) is merely some idea.

To consider such an idea as real is essentially absurd.

Awareness must be whatever arises. Sometimes this will be a stone. Consequently a stone is equally awareness.

raza June 14, 2018 at 11:37 #187832
Quoting tom
it is a consequence of known physics


Where is one when conducting a physics experiment?

One cannot be other than whatever the experience happens to be, in this case the physics experiment experience.



Arne June 14, 2018 at 11:51 #187834
Reply to tom that we are unique is not the issue. I am quite confident that there are many species that are unique in their own way. The deeper issue is to behave as if our "uniqueness" justified a normative superiority vis-à-vis other species. You are obviously aware of the now decades-old claim that "unlike humans, computers can't X" where X is continually replaced once the computer is then programmed to do X. They cannot do X (beat a grandmaster at chess), they cannot do X (display emotions), and now they cannot do X (display self-awareness). We somehow want to claim that the universe is a better place for all because humans and only humans can do X while the truth appears more likely to be that the universe is a better place only for humans because humans and only humans can do X. Do you not see the pattern here, as well as the desperation to perceive an indifferent universe as somehow better off because of our presence? What in the world is that all about?
tom June 14, 2018 at 12:34 #187842
Quoting Arne
The deeper issue is to behave as if our "uniqueness" justified a normative superiority vis-à-vis other species.


Other species don't possess qualia, so we are different. Animals don't possess computationally universal brains, so we are different.

Quoting Arne
We somehow want to claim that the universe is a better place for all because humans and only humans can do X


That is false and contrary to known physics.

Quoting Arne
Do you not see the pattern here, as well as the desperation to perceive an indifferent universe as somehow better off because of our presence? What in the world is that all about?


There is no pattern except you making assumptions.

Humans are the only objects in the universe known to create explanatory knowledge and possess qualia. You are assuming that I claimed humans were the only objects that could do these things, when I did not. It's a pattern.


gurugeorge June 14, 2018 at 13:20 #187854
Reply to TheMadFool I think it's always useful as a grounding maneuver to think of the development of thought as something that was opened up by the possibility of lying.

IOW, signalling of internal states (emotion) initially evolved as a co-ordination mechanism for social creatures, but at some point the neat trick of lying about internal states for some kind of advantage was discovered, then the possibility of holding the falsehood (the counter-factual) and the truth in mind at the same time, and then we were off to the races.

So it's not just self-awareness as such (a machine can self-monitor) but it's more to do with an interpersonal game (something Turing was aware of with the Turing test - i.e. detecting intelligence would be closely connected to detecting cheating).

That's why I think that AI people, if they're really aiming at intelligence proper as we humans understand it, and not just at expert systems and machine learning systems, probably need to think more about intelligence as a function of sociality. Not "an AI" but a community of AIs. All the most intelligent animals (with the odd exception of the octopus) are social - corvids, parrots, wolves, humans, etc.

(Another way of saying this might be that intelligence probably requires a limbic system analogue - there has to be some sense of something at stake, something mattering, to the AI. But then at that point, there's the danger of losing the crisp cleanliness that we associate with computers, and getting into the murky, shifty complexity that is genuine intelligence, so it hardly seems worth it to try and create a genuine artificial intelligence.)
Arne June 14, 2018 at 13:31 #187858
1. I am certain I clearly said I had no issue with whether we are unique. We are, after all, the only species that has ever intentionally killed others over a disagreement regarding the transubstantiation of a piece of bread. I suspect it does not get more unique than that. My point was and still is that uniqueness is not a synonym for superiority;

2. I made no claims inconsistent with known physics. I stated we want to claim that the universe is better because of our presence. Apparently you agree with me that such a claim is absurd.

3. I did not assume that people claimed computers will never be able to beat a Grandmaster at chess. I was there when skeptics of AI made the claim. You can look it up. However, once Deep Blue did beat a Grandmaster, the claim then became that computers will never be able to display emotions. That is not an assumption on my part. I was also there when AI skeptics made that claim. And now the new X is that computers will be unable to display awareness of self. That is not an assumption on my part. Every time computers are able to do the X that the skeptics say they will never be able to do, the skeptics come up with a new X. If you want to argue that X1 replaced by X2 replaced by X3 is not a pattern, then good luck with that. I have obviously failed in my attempt to persuade you to see the deeper issue regarding self-awareness as pregnant with our need to treat uniqueness as a synonym for superiority. That failure is on me. I am done now.
Belter June 14, 2018 at 14:22 #187867
Quoting TheMadFool
What distinguishes a human from a computer?


Computer is a mathematical concept. Alan Turing defined it (the Turing machine) and also its limits (the halting problem). Human is the biological species of Alan Turing.
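The concept invoked here can be made concrete with a toy machine. The sketch below uses a common textbook-style encoding, not any specific historical formulation, and all names are illustrative. The max_steps guard is a practical nod to the halting problem: in general, no procedure can decide in advance whether an arbitrary machine will halt, so simulators cap the number of steps.

```python
# A minimal Turing machine: a head moving over a tape, driven by a rule table.

def run_turing_machine(tape, rules, state="start", head=0, max_steps=1000):
    """rules maps (state, symbol) -> (new_symbol, move, new_state)."""
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == "halt":
            # Read the tape back, dropping trailing/leading blanks.
            return "".join(cells[i] for i in sorted(cells)).strip("_")
        symbol = cells.get(head, "_")                   # "_" is the blank symbol
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    raise RuntimeError("no halt within max_steps")      # cf. the halting problem

# Rules for a machine that flips every bit, then halts on the first blank.
flip_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

result = run_turing_machine("1011", flip_rules)   # → "0100"
```

Everything a physical computer does can be described by a rule table of this shape, which is why "computer" is, first of all, a mathematical object.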
Anthony June 14, 2018 at 14:35 #187870
Quoting TheMadFool
Would we be closer to the computer or being x?


Being x. It's always rather odd to me people want to focus on computer models (computer as model) as representing intelligence or awareness instead of, say, the integrated processes (mind) of an old growth forest. My style of consciousness and components of mind communicate in a way unimpeachably closer to the minute feedback systems you find in the "cognitive network" of an ecologically complex superorganism (forests). Living compost on a forest floor is far more impressive and complex in its self-awareness than a computer could ever be (interspecies communication requires species; is a computer a species? Nope). Yet this is only a small, local slice of what's going on "information-processing"-wise in an organic superorganism, like any robust sylvan environment. Mycelial mats connect plants and trees and search for feedbacks that then determine what they will do in bringing balance to players of the network locally, and non-locally. Mycelial mats can connect thousands of acres of forest in this way. This is very much like a neural network of nature.

Honestly, taking computers to be intelligent, or most absurdly, at all self-aware, and not nature, tends to gore my ox...so I'm apt to wax too emotional here, but perhaps I'll be back with some cool examples as to why computers cannot be self-aware compared to the far more self-aware "being x" that can be found in nature (of which I'm a part way more than a computer). That is to say, my self-awareness is far more an extension of the order and processes going on in the superorganism of a forest than anything in silicon. We can understand (a priori); computers don't understand anything. We are aware of our limitations; computers are not. Because we are aware of our limitations thanks to nature's gift of metacognition (note I'm not saying a computer's gift of metacognition), we can ask questions about how we are limited, such as boundaries the subconscious puts on conscious awareness. You can even ask sci-fi questions about computer sentience thanks to nature's vouchsafing of self-awareness. Somehow, self-awareness is a part of having a mind that is informed nonlocally by interminably incomplete information. A machine only has to handle incompleteness according to its implied programming or manufacturing: algorithms and egos are very much alike, and both are chokingly narrow-minded, unreasoning. Seeing as the human brain-mind isn't invented or programmed and doesn't do calculations, and is likely nonlocally entangled with the universe in a way that remains forever incomplete (unless perhaps in deep sleep or dead), we think about thought and have thought about thinking, emote about emotions and have emotions about emoting: nothing is more sublime than wondering about wonder, however. I wonder if computers could ever wonder? What about the utterly unreasonable idea that a computer could have emotions reliant on programming...laughable. Reminds me of someone, having missed the punchline, laughing at a joke just because he knows it's supposed to be funny.
tom June 14, 2018 at 17:51 #187906
Quoting Arne
I am certain I clearly said I had no issue with whether we are unique. We are, after all, the only species that has ever intentionally killed others over a disagreement regarding the transubstantiation of a piece of bread. I suspect it does not get more unique than that. My point was and still is that uniqueness is not a synonym for superiority;


Null sets, though.

Quoting Arne
It rests upon the unstated presumption that you have something beasts do not. If all H's (Humans) have A's and only A's and all B's (Beasts) have A's and only A's, then the statement that SA = that which H's have but B's do not produces a null set.


Quoting Arne
And even if you could establish some sort of qualitative and/or quantitative difference between the awareness Humans have and the awareness Beasts have, that difference would not necessarily be a difference in a degree of awareness regarding awareness, i.e., self-awareness.


Humans have something other animals do not - a computationally universal brain, and a self aware mind.


Arne June 14, 2018 at 18:13 #187910
Reply to tom I am done. I can be no clearer.
tom June 14, 2018 at 18:15 #187912
Quoting Arne
I am done. I can be no clearer.


Done? You mean hoist by your own petard.
Arne June 14, 2018 at 18:20 #187916
seriously, don't flatter yourself.
tom June 14, 2018 at 18:44 #187924
Quoting Anthony
It's always rather odd to me people want to focus on computer models (computer as model) as representing intelligence or awareness instead of, say, the integrated processes (mind) of an old growth forest.


It has been proved that, according to known physics, a universal computer can emulate any physical system exactly. It's not odd, it's reality.
BC June 14, 2018 at 19:06 #187927
Quoting TheMadFool
So, are we more like computers, or are we, in terms of awareness, very near to an entity that is completely self-aware?


We are not like computers, at all.

tom June 14, 2018 at 19:36 #187934
Quoting Bitter Crank
We are not like computers, at all.


Our brains cannot be more than computers, according to physics.
TogetherTurtle June 14, 2018 at 19:54 #187940
Reply to Arne I believe your fallacy is that you feel a need to justify a non-centrist view. You don't think that humans are special, because to be so would seem... self-centered. I agree that is how it seems; throughout history humans have had a tendency to think they are the center of the universe, figuratively and literally. However, completely disregarding that we could be special at all is absurd. We have very clearly defined differences from animals. We have complex architecture, we have many written languages, we have been to space. We have created art that may be indistinguishable from photographs because it's so good. We use electricity to power our lights and the personal secretaries we carry on our phones. Humanity is certainly special. The mistake is thinking that we are biblically special. We have no divine purpose, and there is no evil to banish from the universe. We are here by coincidence, and what we have made out of our circumstances is what makes us special. That is what "separates us from the beasts," as I said earlier. We are special because of our luck, our intelligence, and our desire to improve.
TogetherTurtle June 14, 2018 at 20:05 #187941
Reply to TheMadFool Why, you experience the illusion, of course. Self-consciousness is the mind making sense of its world. There is no reason why your brain sees the color red as specifically that color. It could assign it any other shade, and in fact, it often does. Color blindness is the result of the brain's inability to differentiate between two colors that to it seem the same, and to the rest of us are different. Mental illness also interferes with self-awareness. Seeing monsters that aren't real is just the brain playing tricks on the mind.

The things that you see are more or less accurate, but some things are chosen arbitrarily. Color is just waves of the light spectrum being reflected into your eyes. If you were able to see those particles outside of the human mind, they wouldn't be red or green or blue or yellow. Color is made up, but it is useful, and that's why we have it. Color is a very good example of the brain processing the illusion of consciousness for the mind to observe and decide what to do next. Imaginary shades of color seem so real to us, but simply don't exist outside our minds.
Anthony June 14, 2018 at 20:34 #187945
Quoting tom
It has been proved that, according to known physics, a universal computer can emulate any physical system exactly. It's not odd, it's reality.


What is a universal computer? I've heard of the Cosmology Machine and was taken aback at the level of hubris. It's amazing the "science" (pseudoscience) of meteorology continues to claim it can forecast, when all it does is update based on essentially current conditions. Don't meteorologists rely on computer simulations? Their computers, then, fail miserably in attempting to compute nonlinear conditions. Now, how in the world would I believe, if the weather forecast is always wrong for regions of our planet, it would ever be possible for a machine to simulate the physical conditions of the entire cosmos?


tom June 14, 2018 at 20:49 #187946
Quoting Anthony
How does the brain depend on computations?


What does that question even mean?

Quoting Anthony
I've thought it, computations, the hammer of mathematicians which treats of everything in existence like its nails.


Nails? Computational universality has nothing to do with mathematicians, or nails.

Quoting Anthony
Even though many phenomena can be analyzed mathematically doesn't mean math was required to bring them into existence.


You've lost me.

Arne June 14, 2018 at 20:52 #187947
Reply to TogetherTurtle you and that other guy are arguing with yourselves. I have never denied that we are "unique." So stop thinking you need to persuade me that we are "unique." And now in addition to "unique" you are claiming that we are "special". Fine, we are "unique" and "special". We are "unique" and we are "special" and therefore. . . . . . . . . . . . WHAT? When are you going to fill in the therefore. . .? Your own examples are absurd. We are so good at art that we can paint a picture that looks almost as real as the machine we built to take pictures. We can go to concerts and listen to musicians play music that sounds almost as good as their latest studio album. Did it ever occur to you that we are so "unique" and "special" that we could actually create a being that is more "unique" and "special" than we are? We are "unique" and "special" and therefore WHAT??? Make a freaking argument!!
Wayfarer June 14, 2018 at 20:58 #187948
Quoting Arne
we could actually create a being


There's only one way known for humans to actually 'create another being', and that is by reproduction.

Computers are devices, by definition. They're manufactured artefacts that in essence are large arrays of switches that are able to emulate or model various cognitive and computational processes.

But I have never seen anything to persuade me that a computer is a subject of experience.
Anthony June 14, 2018 at 20:59 #187949
Quoting tom
Computational universality has nothing to do with mathematicians


Mathematicians are human computers...or mentats, if you like. Once there was only the abacus for a computer.

Quoting tom
What does that question even mean?


You said, see below, that humans have a computationally universal brain. Maybe I'm one of those jugheaded laymen that needs an explanation here. Perhaps I'll look it up. Apologies.

Quoting tom
Humans have something other animals do not - a computationally universal brain, and a self aware mind.


I didn't know human brains differed that much from other mammals' brains, functionally? The human mind is what differs most patently, not the brain. Why we are so self-aware compared to other organisms is a question we should be very careful about limiting to any sort of computation.

Btw, I redacted my previous post.

apokrisis June 14, 2018 at 21:02 #187950
Quoting tom
Our brains cannot be more than computers, according to physics.


Are algorithms physical? In what sense are you using the term physics to mean a scientific model of both hardware and software?

You can start with a Turing machine if you like.

I see the simple gate mechanism that switches the state of a symbol. I see the infinite length of tape on which those marks are recorded. I try hard not to mention the problems the second law creates for this imagined material device.

But then this machine needs someone to write it some rules, supply it some data, understand the results.

In what sense are you saying that all that rather mental stuff is reduced to the same materialistic physics used to imagine the hardware?
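The pieces apokrisis lists (a finite rule table, a tape of symbols, a head that reads and writes) are exactly the parts of a textbook Turing machine. As a toy illustration only, they can be sketched in a few lines of Python; the bit-flipping rule set below is an arbitrary example, not anyone's actual model:

```python
def run_tm(rules, tape, state="scan", blank="_", max_steps=10_000):
    """Drive a one-tape Turing machine: the rules are pure data, the tape is the store."""
    tape, head = list(tape), 0
    for _ in range(max_steps):
        if state == "halt":
            break
        if head == len(tape):          # extend the "infinite" tape on demand
            tape.append(blank)
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# The rule set is supplied from outside the machine, which is apokrisis's point:
# this one just inverts every bit and halts at the first blank.
flip_bits = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "halt"),
}

print(run_tm(flip_bits, "1011"))  # -> 0100
```

The machine itself is trivial; all the interesting content lives in the rule table and the input, which someone outside the machine must write and interpret.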

Arne June 14, 2018 at 21:04 #187951
Quoting Wayfarer
There's only one way known for humans to actually 'create another being', and that is by reproduction.


Definition of being. 1 a : the quality or state of having existence.

Seriously, what am I, chopped liver?

and please define "experience".

I will wait here.


Wayfarer June 14, 2018 at 21:14 #187954
Quoting Arne
Definition of being. 1 a : the quality or state of having existence.


That is only a partial definition. Really, defining or coming to an understanding of the meaning of fundamental terms like 'being' and 'existence' is a basic task in philosophy. Many people - presumably including yourself - simply assume that it is obvious what the word 'being' refers to, and that computers and beings are pretty much the same kind of thing. But when you analyse such beliefs, they rest on many unjustifiable assumptions.

For instance, the word 'being', in this context, can be used either as a noun, i.e. 'human being', or as the present participle of the verb 'to be'. What I'm saying is that computers are not 'beings' in the sense conveyed by the former. And in fact we don't refer to them as such. If you were standing outside a burning building and were to ask, 'Is there anyone in that building?', you wouldn't be asking 'Are there any computers in that building?' If you said (although it would be a strange turn of phrase) 'Are there living beings in there?', then it could be taken to be asking: are there humans, rats or pigeons in there?

So, the definition of 'being' as 'something that exists' doesn't capture something distinctive about living beings. And in fact, I am of the view that the words 'to be' and 'to exist' are not strictly synonymous, but I will leave that aside for now. But part of the meaning of 'being' in this context is precisely that beings are 'subjects of experience'. It can be said of humans, rats and pigeons, but not of artefacts or devices.
Arne June 14, 2018 at 21:22 #187956
Quoting Wayfarer
Many people - presumably including yourself - simply assume that it is obvious what the word 'being' refers to, and that computers and beings are pretty much the same kind of thing.


Seriously, you are going to presume that I have a shallow understanding of being?

You are the one whose understanding of being was shallow to the point that you presumed that by being I meant human being.

If you want to give it another try, I will continue to wait here.
Arne June 14, 2018 at 21:24 #187957
Reply to Wayfarer Being is that upon the basis of which human being renders intelligible the always already existing world into which it is thrown. And that is my definition.
Arne June 14, 2018 at 21:44 #187958
Your 19 minutes are up. Going forward, I do believe it would be a bit more philosophical if you were to ask some questions regarding my understanding of something rather than presume my understanding is shallow.
Heiko June 14, 2018 at 21:56 #187961
read Heidegger...
There seems to be something special about it if one can draw a distinction between "a human" and "a human being". What is it that gets emphasized? This is not to say your definition was wrong, nor do I think this should really make a difference in this context (aside from the sake of the argument).
Wayfarer June 14, 2018 at 22:07 #187964
Quoting Arne
I do believe it would be a bit more philosophical if you were to ask some questions regarding my understanding of something rather than presume my understanding is shallow.


My entry was simply based on the post that I was commenting on.

Quoting Arne
Being is that upon the basis of which human being renders intelligible the always already existing world into which it is thrown. And that is my definition.


I wouldn't disagree, but it is a very broad definition which raises further questions. But in relation to the question at hand, what does it say about the question of whether computers are conscious subjects of experience? Because I take that question to be central to the OP.
Heiko June 14, 2018 at 22:28 #187967
I guess the Turing test is much too technical. Amid all the procedure, the main point gets forgotten: there is no self-conscious AI until it proves itself to be one.
TogetherTurtle June 14, 2018 at 23:07 #187974
Reply to Arne Quoting Arne
you and that other guy are arguing with yourselves. I have never denied that we are "unique." So stop thinking you need to persuade me that we are "unique." And now in addition to "unique" you are claiming that we are "special". Fine, we are "unique" and "special". We are "unique" and we are "special" and therefore. . . . . . . . . . . . WHAT? When are you going to fill in the therefore. . .? Your own examples are absurd. We are so good at art that we can paint a picture that looks almost as real as the machine we built to take pictures. We can go to concerts and listen to musicians play music that sounds almost as good as their latest studio album. Did it ever occur to you that we are so "unique" and "special" that we could actually create a being that is more "unique" and "special" than we are? We are "unique" and "special" and therefore WHAT??? Make a freaking argument!!


Well, to start, I don't really know who you mean by the other guy. I guess someone else found the fallacy as well.

The point to us being special is that, yes we have self awareness, and yes, anything we make that has it as well is also special.

Therefore, yes, we have self awareness, yes, machines can be self aware, and no, animals are not self aware. That was my argument from the beginning and it seems that is the argument I will have at the end as well. You are simply being unreasonable at this point. While it seems wrong, we are special, we are different, we are self aware, and animals are not. To be frank, you should have more pride in being human. We have built every civilization on this planet and made all of its scientific breakthroughs. If you really don't think we are aware of ourselves, you are sorely mistaken.
Arne June 14, 2018 at 23:34 #187981
Reply to Wayfarer Quoting Wayfarer
what does it say about the question of whether computers are conscious subjects of experience? Because I take that question to be central to the OP.


I disagree for two reasons:

1. The original post posits self awareness as the issue rather than conscious[ness]. And I have no reason to presume the poster chose his terms carelessly. And though one could carelessly consider them synonymous, that would be a tough argument to make. All reasonable people would agree that my dog and I are both conscious beings. Yet I doubt all reasonable people would agree that my dog has a sense of self awareness. And if all beings who are conscious are not necessarily self aware, then consciousness and self-awareness cannot be synonymous. So absent a reason to believe the original poster meant something other than what he said, it would be anti-philosophical to presume the central issue is other than self-awareness; and

2. It is where "human" stands on the spectrum of self-awareness relative to the computer that is the central question. As the poster clearly asks "Would we be closer to the computer or being x?" Again and with all due respect to the poster, it would be anti-philosophical to suggest a different question is "central to the OP."

……………………………………………………..? <---- HUMAN ----> ?

Rock --------------------------Computer---------------------------------------------------------------Being X

And in an attempt to advance the issue, I suggest that Human is closer to the Computer. However, I suspect that Human is unlikely to move significantly (if at all) closer to Being X but that the Computer certainly will move closer to Being X. If that is the case, then the deeper issue becomes whether Computer will move past Human on the spectrum of self awareness.

Further, self-awareness rather than consciousness strikes me as an interesting twist to this now age old debate. In order for there to be self-awareness, there must be awareness. If we call awareness "AL1" (Awareness Level1) and self awareness "AL2" (Awareness Level2) and awareness of self awareness "AL3" (Awareness Level3), are we not already at AL3? And at what AL(x) is Being X?

Finally and most important of all, is this simply a more grown up version of the "I know" game?

I know
I know you know
I know you know I know.

I am aware
I am self aware
I am aware that I am self aware

Because that is the way it feels every time contemporary programming achieves that which yesterday's learned skeptics said it would never achieve. If we ever had a working definition of "conscious" (which we do not) and coders were able to represent it, you can bet your bottom dollar we would promptly change the definition.
Arne June 14, 2018 at 23:40 #187982
Quoting TogetherTurtle
Well, to start, I don't really know who you mean by the other guy. I guess someone else found the fallacy as well.


Wrong.

You may rest assured that the other guy's mistakes are not as "unique" and "special" as yours.

How fallacious of me to expect people to actually make arguments in support of their claims.

When will I ever learn?
Wayfarer June 14, 2018 at 23:54 #187989
Quoting Arne
I doubt all reasonable people would agree that my dog has a sense of self awareness


I agree that animals aren't reflexively self-conscious to the same degree that humans are. But I still say that a dog is a subject of experience.

Actually, your post made me go back and read the OP again - when I jumped in previously, it was in respect of a general view of the question of the difference between computers and sentient beings prompted by this remark:

Quoting Arne
Did it ever occur to you that we are so "unique" and "special" that we could actually create a being that is more "unique" and "special" than we are? We are "unique" and "special" and therefore WHAT???


So my point was simply that, 'beings' are of a different order to 'devices', including computers. And that furthermore, there is no instance of humans ever having 'created a being' other than by the act of procreation, if that counts as 'creation'. So you're correct in saying I wasn't really addressing the OP. And going back to the OP again, I would single out this paragraph:

Quoting TheMadFool
Now imagine a being x who is completely self-aware in every respect from the atomic realm to the macroscopic world we're familiar with. Such a being is what I call truly self-aware.


I think this is problematical, as I think that 'complete self awareness' of that kind is a logical impossibility. So the hypothetical 'being X' is not something that could ever exist, which renders the entire OP rather pointless, in my opinion. So, nothing further to add, at this point.
Arne June 15, 2018 at 00:18 #187996
Quoting Wayfarer
Now imagine a being x who is completely self-aware in every respect from the atomic realm to the macroscopic world we're familiar with. Such a being is what I call truly self-aware. — TheMadFool
I think this is problematical, as I think that 'complete self awareness' of that kind is a logical impossibility. So the hypothetical 'being X' is not something that could ever exist, which renders the entire OP rather pointless, in my opinion. So, nothing further to add, at this point.


Perhaps you and the poster have a different understanding of imagine. It never occurred to me that imagination must be limited to the logically possible. Oh well.

I am going to bed now.
Arne June 15, 2018 at 00:24 #187999
Quoting TogetherTurtle
To be frank, you should have more pride in being human.


Seriously? Perhaps you should place your pride in who you are rather than what species you were born into. The former depends entirely upon your choices while the latter has absolutely nothing to do with anything you have ever done.
TogetherTurtle June 15, 2018 at 02:29 #188019

Reply to Arne Quoting Arne
Seriously? Perhaps you should place your pride in who you are rather than what species you were born into. The former depends entirely upon your choices while the latter has absolutely nothing to do with anything you have ever done.


Quite to the contrary. The species I was born into is the whole reason I can be who I am. The human intellect is unmatched. If I were a dog, I would not be here typing this, I assure you. In fact, I just asked my dog if she would like to defend herself. She met me with annoyed silence, as she was trying to sleep. I won't go as far as to blame my dog's lack of sleep on you, but I will tell you this. You have a distinct smell of arrogance around you and your posts. I refuse to resort to name-calling, and however hostile your response may be, I will not. However, I will give some examples of your assholery.

Quoting Arne
Wrong.

You may rest assured that the others guy's mistakes are not as "unique" and "special" as yours.

How fallacious of me to expect people to actually make arguments in support of their claims.

When will I ever learn?


This one is interesting because you still never explain why you thought I saw someone else's argument against you, you continue to ignore the fact that everyone who has responded to you is trying to explain your argument's faults and is making an argument, and you decide to add a sarcastic stinger on the end. If this were "Snarky Teenager Forum" I would applaud you. However, this is not such a place.

Quoting Arne
Perhaps you and the poster have a different understanding of imagine. It never occurred to me that imagination must be limited to the logically possible. Oh well.

I am going to bed now.


I don't think the writer ever implied that imagination had to stay within the realm of logic. Again, you like to end your posts with some kind of statement meant to irritate and provoke. It's almost as if you want attention?

Quoting Arne
I am done. I can be no clearer.


You mean someone doesn't understand your idea? That couldn't be evidence that you are spouting nonsense and refuse to reason could it?

Quoting Arne
and please define "experience".

I will wait here.


He of course meant the experience of living, of seeing, feeling, hearing, touching, tasting. Have you ever heard the phrase "I experienced ____"? It's really the only way you can take that. If I'm wrong I would gladly take an alternate explanation, but I know you wouldn't, so I'll stop here. If anyone reads this far, this man is a lunatic. Give him no more attention; he only thrives on it.
Arne June 15, 2018 at 02:44 #188022
Quoting TogetherTurtle
Quite to the contrary. The species I was born into is the whole reason I can be who I am. The human intellect is unmatched. If I was a dog, I would not be here typing this I assure you.


the fact that you take pride in your ability to type only proves my point. Your standards are too low. And stop with the type/token stuff. The human intellect may be unmatched, but it is clear that cannot be said of yours.

.Quoting TogetherTurtle
This one is interesting because you still never explain why you thought I saw someone else's argument against you


because the only difference in your equally ridiculous arguments is that he used the word "unique" while you used the word "special". Another mystery solved.

Quoting TogetherTurtle
He of course meant the experience of living, of seeing, feeling, hearing, touching, tasting. Have you ever heard of the term "I experienced ____". It's really the only way you can take that. If I'm wrong I would gladly take an alternate explanation, but I know you wouldn't, so I'll stop here. If anyone reads this far, this man is a lunatic. Give him no more attention, he only thrives on it.


Listen to Mr. Fallacy talk about wanting attention. You may rest assured, I would be more than happy with a little less attention from you. And how wonderfully philosophical of you to speak for others and to direct them how to respond to me. I am sure they appreciate that.

Dude, this ain't facebook.

TogetherTurtle June 15, 2018 at 02:57 #188027
Reply to Arne I know I said I was done but I just think it's really funny how you didn't respond to this



I am done. I can be no clearer.
— Arne

You mean someone doesn't understand your idea? That couldn't be evidence that you are spouting nonsense and refuse to reason could it?


I believe it is customary to tell you that I'm "Going to bed now"

Also if I ever talk to you again, I'm calling you "Mr. No Clearer"
tom June 15, 2018 at 06:18 #188055
Quoting apokrisis
Are algorithms physical?


I argued in another thread that algorithms are not physical - they are logical. Of course, their instantiation must be physical, but given that this is arbitrary, the instantiation and the algorithm are different things. An identical algorithm may be instantiated on Babbage's Analytical Engine or on a yet-to-be-constructed quantum computer. The instantiations will be subject to quite different physical laws, one effectively classical, the other quantum, but the algorithm itself is not subject to the laws of physics.

Quoting apokrisis
In what sense are you using the term physics to mean a scientific model of both hardware and software?


Usually when I use the term "physics" I am referring to that body of knowledge relating to the fundamental structure of reality.

Quoting apokrisis
You can start with a Turing machine if you like.


Why would I do that? Turing machines don't exist, they are mathematical abstractions.

Quoting apokrisis
In what sense are you saying that all that rather mental stuff is reduced to the same materialistic physics used to imagine the hardware?


Pretty sure I made no such claim.
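The algorithm/instantiation distinction tom draws above can be made concrete in code. As a hedged sketch (the function names are mine, not standard terminology): Euclid's step rule is written once as an abstract rule, then executed by two structurally different "machines", and both compute the same gcd.

```python
import math

# The algorithm proper: a single abstract step rule, written once.
def euclid_step(a, b):
    return b, a % b

# Instantiation 1: a recursive evaluator.
def run_recursive(step, a, b):
    return a if b == 0 else run_recursive(step, *step(a, b))

# Instantiation 2: an iterative evaluator with explicit mutable state.
def run_iterative(step, a, b):
    while b != 0:
        a, b = step(a, b)
    return a

# Different "hardware", identical results.
print(run_recursive(euclid_step, 48, 18))  # -> 6
print(run_iterative(euclid_step, 48, 18))  # -> 6
assert run_recursive(euclid_step, 48, 18) == math.gcd(48, 18)
```

The two evaluators obey different mechanical constraints (call stack versus loop state), yet the rule they execute is the same logical object, which is the sense in which the algorithm is not identical with any of its instantiations.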
tom June 15, 2018 at 06:22 #188056
Quoting TogetherTurtle
Therefore, yes, we have self awareness, yes, machines can be self aware, and no, animals are not self aware. That was my argument from the beginning and it seems that is the argument I will have at the end as well.


Did you give an argument that animals are not self aware, or did you just assert it?
apokrisis June 15, 2018 at 11:39 #188100
Quoting tom
I argued in another thread that algorithms are not physical - they are logical.


So we agree that physics doesn’t account for that part of the structure of reality that is an algorithm?

Great.

Now what is it that says an algorithm is logical as such? The universe of randomly produced rule sets would be infinite. What would select among all those to create ones we would call a logical system?

Then I guess your algorithms have to have data to work on. Again, how would the input get selected so that it had physically relevant meaning?
TogetherTurtle June 15, 2018 at 13:10 #188119
Reply to tom My argument is that animals are not self aware because they simply aren't aware of themselves. Many animals are smart, and most have emotions, but they don't ask why they are smart or why they have emotions. That is the primary distinction between humans and animals, asking why. They lack the mental capacity to ask why things happen, only to investigate what something was or where it is. I don't know if you own any pets, but from every pet I've ever owned, it is clear that they lack this ability. My cat has a frequent problem with going to the bathroom on dirty clothes. If he had asked, "Why am I doing this?" I would like to think he wouldn't do it. It would be much easier on him and me if he had just gotten my attention and allowed me to take him outside. I dearly love my cat, but I understand that there is a difference between the inquisitive man and the curious cat.
tom June 15, 2018 at 13:15 #188127
Quoting TogetherTurtle
My argument is that animals are not self aware because they simply aren't aware of themselves.


That's not a particularly convincing argument.



TogetherTurtle June 15, 2018 at 17:43 #188177
Reply to tom My line of thinking is that humans are self-aware because they can distinguish themselves from everything else. They are aware that they are themselves. Animals more or less exist within their ecosystem.

Of course, at the subatomic level, we are nothing but particles bound together but never touching. We are all just pieces of the primordial soup the universe is made of, and will return to it someday. Animals and plants are one step above that in terms of scale. They are aware to some extent that they are just one thing, but are still very connected to the world. They don't seem to fully grasp that they are alive. When their lives are in danger, they fear primarily out of instinct, not because of losing their ability to experience the world.

Often when a person is on their deathbed or bleeding out, you hear them talking about what they are leaving behind, what they never got to do, wondering whether there is anything after death. A dog or cat fears death because it is biologically hardwired to fear things that can kill it, same with humans, but when you see a man on his deathbed, you come to realize that there is much more than just a mechanism for self-preservation at play.

While humans are biologically similar to animals in their scale and reliance on biology, it is the complexity of the human brain that brings the next level; it is why man is aware that he is a separate entity from the universe, or at least his mind is. While you need to consume resources from the universe to retain that self-awareness in the form of biological life, you can recognize yourself as a separate thing entirely.

Or I'm wrong. There's only two possibilities right? What do you think?
tom June 15, 2018 at 18:14 #188180
Quoting TogetherTurtle
Or I'm wrong. There's only two possibilities right? What do you think?


I think that most people, when confronted with the idea that animals are not sentient, do not possess qualia, don't even know they exist, etc., find that notion repulsive and experience various degrees of emotional outrage.

However, I gave an outline of various hints and arguments that this is indeed the case. There is a computational and epistemological argument that they cannot know anything beyond what they are programmed to know, and they are not programmed to be self-aware or other-aware, because they, lacking appropriate hardware, cannot be.

Another argument comes from the impressive work of the psychologist R. W. Byrne. Animals learn by behaviour parsing, not by understanding.
http://pages.ucsd.edu/~johnson/COGS260/Byrne2003.pdf

For some reason we find the notion that animals don't suffer horrifying, when it is in fact a blessing.
Heiko June 15, 2018 at 18:52 #188189
AI is exciting only when one cannot foresee what it will do.
TogetherTurtle June 16, 2018 at 02:17 #188267
Reply to tom Quoting tom
However, I gave an outline of various hints and arguments that this is indeed the case. There is a computational and epistemological argument that they cannot know anything beyond what they are programmed to know, and they are not programmed to be self-aware or other-aware, because they, lacking appropriate hardware, cannot be.


In that we are in agreement. Self awareness is simply the result of superior hardware and software.

Quoting tom
For some reason we find the notion that animals don't suffer horrifying, when it is in fact a blessing.


I think it would be impossible for animals not to have emotions. They are a product of evolution, and are useful in the wild. If they didn't have them, it would be better for us, but I don't really buy that my cat is faking it when he's glad to see me.

While I don't think that it is right to treat animals poorly on purpose, some killing is inevitable. Meat and its consumption is deeply ingrained into the culture of almost every people on earth. We are omnivores after all. Animals feel emotion, but in the human world, we overlook feelings for the greater good, so why wouldn't we apply that to animals as well? Death is simply the end of life, destined to happen from birth. Animals are our friends, and we should treat them well, but in the end, that's just how things are on our planet. Food chains and all. There is no reason to fear the facts of life.

As the more intelligent beings, I would like to believe it should be our responsibility to see to it that the life we are so closely related to and so dependent upon is treated well for as long as we can afford to let it live. Someday we will know enough to gift them with the blessings nature has given us naturally, and we will be able to very easily create identical copies of their meat just by having the elements that make them up. Today however, is not that day.
TheMadFool June 16, 2018 at 09:38 #188350
Quoting Wayfarer
I think this is problematical, as I think that 'complete self awareness' of that kind is a logical impossibility.


I fail to see a contradiction in the idea of complete self-awareness. Think of hunger, thirst, pain and the senses etc. These sensations are a form of awareness of the chemical and physical states of the body or the environment.

Why do you think total self-awareness is an impossibility?
TheMadFool June 16, 2018 at 09:45 #188352
Quoting Bitter Crank
We are not like computers, at all.


We're NOT computers, I agree. But are we machines, just of a higher order? That's what I want to know.
TheMadFool June 16, 2018 at 10:23 #188353
Quoting TogetherTurtle
Why, you experience the illusion of course.


I mean there must be an x for which consciousness or whatever else is an illusion. Is this x real or also an illusion?

Are you saying there is no such thing as consciousness?
TheMadFool June 16, 2018 at 10:28 #188354
Reply to gurugeorge :up:

So you think social existence contributes towards intelligence. I think so too, but what about the "fact" that geniuses are usually depicted in culture as socially inept? Is this just one of those myths that have spawned out of movies and literature, or is there some truth in it?

I suppose genius-social-misfits aren't completely normal.
TheMadFool June 16, 2018 at 10:34 #188355
Quoting Anthony
Being x. It's always rather odd to me people want to focus on computer models (computer as model) as representing intelligence or awareness instead of, say, the integrated processes (mind) of an old growth forest. My style of consciousness and components of mind communicate in a way unimpeachably closer to the minute feedback systems you find in the "cognitive network" of an ecologically complex superorganism (forests). Living compost on a forest floor is far more impressive and complex in its self-awareness than a computer could ever be (interspecies communication requires species; is a computer a species? nope). Yet this is only a small, local slice of what's going on "information-processing"-wise in an organic superorganism, like any robust sylvan environment. Mycelial mats connect plants and trees and search for feedbacks that then determine what they will do in bringing balance to players of the network locally, and non-locally. Mycelial mats can connect thousands of acres of forest in this way. This is very much like a neural network of nature.

Honestly, taking computers to be intelligent, or most absurdly, at all self- aware, and not nature, tends to gore my ox...so I'm apt to wax too emotional here, but perhaps I'll be back with some cool examples as to why computers cannot be self-aware compared to the far more self-aware "being x" that can be found in nature (of which I'm a part way more than a computer). That is to say, my self-awareness is far more an extension of the order and processes going on in the superorganism of a forest than anything in silicon. We can understand (a priori), computers don't understand anything. We are aware of our limitations, computers are not. Because we are aware of our limitations thanks to nature's gift of metacognition (note I'm not saying a computer's gift of metacognition), we can ask questions about how we are limited, such as boundaries the subconscious puts on conscious awareness. You can even ask sci-fi questions about computer sentience thanks to nature's vouchsafing of self-awareness. Somehow, self-awareness is a part of having a mind that is informed nonlocally by interminably incomplete information. A machine only has to handle incompleteness according to its implied programming or manufacturing: algorithms and egos are veeery much alike, and both are chokingly narrow-minded, unreasoning. Seeing as the human brain-mind isn't invented or programmed and doesn't do calculations, and that it is likely nonlocally entangled with the universe in a way that remains forever incomplete (unless perhaps in deep sleep or dead), we think about thought and have thought about thinking, emote about emotions and have emotions about emoting: nothing is more sublime than wondering about wonder, however. I wonder if computers could ever wonder? What about the utterly unreasonable idea that a computer could have emotions reliant on programming...laughable. Reminds me of someone, having missed the punchline, laughing at a joke just because he knows it's supposed to be funny.


Perhaps we've already achieved the greatest thing possible - duplicating rationality - with computers. What remains of our mind, its irrationality, self-awareness, and creativity, aren't as important as we think they are.
TheMadFool June 16, 2018 at 10:36 #188356
Quoting tom
What does it mean to be "completely self aware" as opposed to just self aware?


Complete self-awareness would be knowing the position, function and state of every atom within our bodies, together with knowledge of our subconscious.

In a way we're not actually free unless we know these things.
gurugeorge June 16, 2018 at 13:12 #188397
Reply to TheMadFool Yeah I think it's two things overlapping. Sociality sets the stage for the development of intelligence, but perhaps with the neural mechanisms that make for intelligence, beyond a certain point other factors take over to make super-high intelligence out of balance with other factors.

Like, suppose intelligence evolved to require the co-operation of A, B, C, D, E genes, with the total contributing to intelligence level, and the set being roughly in balance with most people, but then suppose in some people the E factor is much more heavily weighted than the other factors. That would produce a super-high intelligence. But what if the E factor happens to clash with other aspects of the total personality, making the person inhibited or socially inept?

Another possibility: human beings and animals generally are like these Heath Robinson contraptions, stuck together with duct tape, sticks and glue, that "pass muster" in the circumstances they evolved in for the bulk of their evolution, but don't necessarily function so well outside those conditions. For example, sociality in our ancestral environment would have meant knowing, say, about 20 people quite well, and half a dozen really well. What happens when a creature designed for that type of environment is crammed cheek by jowl with millions of strangers in a modern conurbation? Maybe they withdraw into themselves, or whatever.

Lots of possibilities here, of course one would have to know the science and investigate to figure out what's really going on.
BC June 16, 2018 at 18:21 #188517
Quoting TheMadFool
We're NOT computers, I agree. But are we machines, just of a higher order? That's what I want to know.


We are not machines, either. We are organisms, and more, beings. We are born, not manufactured. Our biological design incorporates a billion years of evolution. Life exists without any designing agent: no owners, no designers, no factories, etc. Life is internally directed; machines are made, and have no properties of beings or organisms.

Machines are our human creations; we like our machines, and identify with the cleverness of their design and operation. Our relationship to the things we make was the subject of myth for the ancient Greeks: Pygmalion, a king of Cyprus, carved and then fell in love with a statue of a woman, which Aphrodite brought to life as Galatea (the same theme as George Bernard Shaw's play Pygmalion, and the musical My Fair Lady). We pour our thoughts into our computers; they deliver interesting viewing material to us, none of it comprehended or created by our machine computers.

That there are "biological mechanisms" like DNA replication, respiration, oxidation, etc. doesn't in any way make us "machines", because "biological mechanism" is itself a metaphor borrowed from machines. We're victims of our language here. Because we call the body a machine (levers, pulleys, engines, etc.), it's an easy leap to grant body status to things like office copiers and computers, ships, cars, etc.

So... No, we are not machines, not computers, not manufactured, not hardware, not software.

TogetherTurtle June 16, 2018 at 18:49 #188523
Reply to TheMadFool Quoting TheMadFool
Are you saying there is no such thing as consciousness?


In a way, yes. It isn't a tangible thing. Consciousness is more a culmination of our senses, in a way that makes sense to us and that we can question. Consciousness is just the brain translating for the mind, so to speak. I guess the question really is: how do you know you are conscious? You can think internally; you can see, hear, smell, feel, taste. I would argue a computer can do all of those things through various peripherals, so a computer of sufficient hardware and software capabilities could be conscious. If all else fails, you could build a human brain out of synthetic materials, and I would argue that would be conscious.

So I guess you are the x. All of your brain cells and your eyes and ears and mouth: they collect information, and that is the illusion. If we had more senses, there would be more of an illusion. All of this information is brought together in the brain, which decides what chemicals to send through your body, and what results is consciousness.
Arne June 16, 2018 at 20:49 #188548
Quoting tom
That's not a particularly convincing argument.


Touché.

That is funny.

Wayfarer June 16, 2018 at 22:34 #188577
Quoting TheMadFool
I think this is problematical, as I think that 'complete self awareness' of that kind is a logical impossibility.
— Wayfarer

I fail to see a contradiction in the idea of complete self-awareness. Think of hunger, thirst, pain and the senses etc. These sensations are a form of awareness of the chemical and physical states of the body or the environment.

Why do you think total self-awareness is an impossibility?


This is a difficult point and I'm not claiming that I am correct in what follows. But one of the principles that I have learned from Vedanta is expressed aphoristically as 'the eye cannot see itself, but only another; the hand cannot grasp itself, but only another'. So I take from that that what we are aware of appears to us as an object, or the 'other'. It seems to me to be inherent in the nature of awareness itself.

Now obviously I can be aware of my internal states, like hunger or lust or depression, and so on. But even in all of those cases, the psyche is the recipient of sensations like the feeling of hunger, or is thinking about its circumstances, and so on. But the psyche cannot turn its gaze on itself, as it is the subject of experience, not the object of perception. And that subject-object relationship seems fundamental to the nature of awareness.

There's a wikipedia entry on Kant's Transcendental Apperception which I think comes very close to expressing this same idea:

Transcendental apperception is the uniting and building of coherent consciousness out of different elementary inner experiences (differing in both time and topic, but all belonging to self-consciousness). For example, the experience of the passing of time relies on this transcendental unity of apperception, according to Kant.

There are six steps to transcendental apperception:

1. All experience is the succession of a variety of contents (per Hume).
2. To be experienced at all, the successive data must be combined or held together in a unity for consciousness.
3. Unity of experience therefore implies a unity of self.
4. The unity of self is as much an object of experience as anything is.
5. Therefore, experience both of the self and its objects rests on acts of synthesis that, because they are the conditions of any experience, are not themselves experienced.
6. These prior syntheses are made possible by the categories. Categories allow us to synthesize the self and the objects.


Now, number 5 is crucial here *: we're actually not aware of the 'act of synthesis' which underlies and indeed comprises conscious experience; that is what 'the eye not seeing itself' means. Which stands to reason, as I think these correspond to the role of the unconscious and sub-conscious. That is the process of 'world-making' which the mind is continually engaged in; it is in this sense that reality is 'constructed' by the subliminal activities of consciousness into what appears as a coherent whole. This kind of understanding is characteristic of the philosophy of Kant and Schopenhauer.

But it also has some similarities with Vedanta and Buddhism, which are also aware of the sense in which 'mind creates world'. But to say that in the context of secular Western culture is to invariably be misunderstood (at least in my experience), as the formative myth of secular culture is the so-called 'mind-independent' nature of the world. But what this precisely has lost sight of, is the role of the mind in the construction of reality. In fact the very idea is taboo (as explained in Alan Watts' 'The Book: On the Taboo against Knowing who you Are').

So - as to whether any intelligence can be 'completely self-aware', then in light of this analysis, it seems unlikely. And in fact I read somewhere not that long ago about it being understood in Eastern Orthodox theology, that even God does not know himself, that He is a complete mystery to Himself (although I suspect I won't be able to find the reference.)

--------
* I'm not at all sure I agree with 4 but it's not important for this analysis.
TheMadFool June 19, 2018 at 08:15 #189230
Quoting Wayfarer
from Vedanta is expressed aphoristically as 'the eye cannot see itself, but only another. The hand cannot grasp itself, but only another'.


Brilliant point. The way I understand this is that each level of what I call existence is separated from the others, and awareness, as in knowledge of, may not be able to cross the boundaries between these levels. For instance, the individual cells in our bodies don't know what ''love'' means. To know what ''love'' means requires different experiences and environments than the cell is exposed to, not to mention that the cell lacks the machinery to comprehend it.

That said, I do see a way in which cells may become aware of ''love'' by way of hormones, adrenaline, etc. And the process works in reverse too - cells in a low glucose environment signal hunger. Of course ''total self-awareness'' is a far cry yet.
TheMadFool June 19, 2018 at 08:19 #189231
Quoting TogetherTurtle
In a way, yes. it isn't a tangible thing.


I can't make sense of telling myself that I'm an illusion. Are you a Buddhist bringing up anatta here?

TheMadFool June 19, 2018 at 08:25 #189232
Quoting Bitter Crank
So... No, we are not machines, not computers, not manufactured, not hardware, not software.


Well, I think we're more like machines than we think. Biology = chemistry+physics.
TheMadFool June 19, 2018 at 08:32 #189234
Quoting gurugeorge
Yeah I think it's two things overlapping. Sociality sets the stage for the development of intelligence, but perhaps with the neural mechanisms that make for intelligence, beyond a certain point other factors take over to make super-high intelligence out of balance with other factors.

Like, suppose intelligence evolved to require the co-operation of A, B, C, D, E genes, with the total contributing to intelligence level, and the set being roughly in balance with most people, but then suppose in some people the E factor is much more heavily weighted than the other factors. That would produce a super-high intelligence. But what if the E factor happens to clash with other aspects of the total personality, making the person inhibited or socially inept?

Another possibility: human beings and animals generally are like these Heath Robinson contraptions, stuck together with duct tape, sticks and glue, that "pass muster" in the circumstances they evolved in for the bulk of their evolution, but don't necessarily function so well outside those conditions. For example, sociality in our ancestral environment would have meant knowing, say, about 20 people quite well, and half a dozen really well. What happens when a creature designed for that type of environment is crammed cheek by jowl with millions of strangers in a modern conurbation? Maybe they withdraw into themselves, or whatever.

Lots of possibilities here, of course one would have to know the science and investigate to figure out what's really going on.


:up: thanks
TogetherTurtle June 22, 2018 at 02:32 #190048
Reply to TheMadFool Nope. I don't really know much about Buddhism in general. Maybe two paths have reached the same end? All I know is that we don't see the world exactly as it is. Everything comes together to create a facade. How we examine the world is not the only way to do so, nor is it the most effective. There are many more possible senses than the 5 we have, and they are very easy to trick as it is.
Eugenio Ullauri June 22, 2018 at 03:45 #190071
Short Answer: Software

Long Answer: Software and the materials they are made with.
TheMadFool June 22, 2018 at 04:24 #190077
Quoting TogetherTurtle
All I know is that we don't see the world exactly as it is.


Is this evidential or just a gut feeling?
TheMadFool June 22, 2018 at 04:48 #190082
Quoting TogetherTurtle
All I know is that we don't see the world exactly as it is.


Agreed. I too believe our senses can be deceived or that the picture of the world we create out of them isn't the actual state of affairs. It's like taking a photograph with a camera. We have an image in our hands but it isn't the actual object the image is of.

Quoting TogetherTurtle
Everything comes together to create a facade


As far as I'm concerned there's a limit to illusion. EVERYTHING can't be an illusion, especially our sense of self. In the basic definition of an illusion we need:
1. an observer A
2. a real object x
3. the image (illusion) of the object x, x1

I can accept 3 but what is undeniable is the existence of the observer A who experiences the illusion x1 of the real object x.

Are you saying the observer A itself is an illusion? In what sense?

In the Buddhist context, the self is an illusion because it lacks any permanent existence. The self, according to Buddhism, is a composite "material" and when decomposed into its parts ceases to exist.
TogetherTurtle June 22, 2018 at 17:31 #190257
Reply to TheMadFool Quoting TheMadFool
Is this evidential or just a gut feeling?

It is evidential to some extent. I apologize if I didn't make it clear before, but I don't believe that nothing exists. I'm more along the line of thinking that how we view existing objects is arbitrary.

Quoting TheMadFool
As far as I'm concerned there's a limit to illusion. EVERYTHING can't be an illusion, especially our sense of self.


I agree with this. When I said everything, I meant every way we experience the world. Your sense of hearing, for instance, can be tricked by focused, weak sound waves. That is what you are experiencing when you put on headphones. While no one else can hear your music or audiobook or other media, you hear it as if the performer were in the room with you. This, of course, is not the case, and other senses verify that. Therefore, it is very possible some things in the natural world go unnoticed because we can't sense them. What we sense is very selective, labeled arbitrarily, and subject to trickery.

I may in time take interest in the Buddhist view on this subject. For a religion they have a strangely materialistic view on the concept of a soul.

aporiap July 28, 2018 at 04:29 #200797
Reply to TheMadFool
Yes. That can be correctly classified as some level of self-awareness. This leads me to believe that most of what we do - walking, talking, thinking - can be replicated in machines (much like worms or insects). The most difficult part is, I guess, imparting sentience to a machine. How does the brain do that? Of course, that's assuming it's better to have consciousness than not. This is still controversial in my opinion. Self-awareness isn't a necessity for life and I'm not sure if the converse is true or not.

Hmm, I would think self-awareness comes part and parcel with some level of sentience. I think a robot that can sense certain stimuli - e.g. light, color, and their spatial distribution in a scene - and can use that information to inform goal-directed behavior must have some form of sentience. They must hold some representation of the information in order to manipulate it and use it for goal-based computations, and they must have some representation of their own goals. All of that (i.e. having a working memory of any sort) presupposes sentience.

Heiko July 28, 2018 at 12:17 #200863
Quoting aporiap
They must hold some representation of the information in order to manipulate it and use it for goal based computations and they must have some representation of their own goals.

The AIs whose construction is inspired by the human brain are merely a bunch of matrices chained together, resulting in a map from an input to an output: m(X) = Y. These get trained (in supervised learning, at least) by supplying a set of desired (X, Y) tuples and using some mathematical algorithm to tweak the matrices towards producing the right Y values for the Xes. Once the training sets are handled sufficiently well, chances are good it will produce plausible outputs for new Xes.
The point here is: those things just "work" - not meaning that this works well, but the whole idea of the concept is not to implement specific rules but just to train a "black box" that solves the problem.
Mathematically, such AIs separate the input space by planes, encircling regions for which certain results are to be produced.
These things do not exactly have a representation of their goals - they are that representation.
One cannot exactly forecast how such an AI develops without stopping alteration of the matrices at some point: the computation that would be needed to do this is basically said development of the AI itself.
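The supervised setup described above can be sketched in a few lines of Python. This is a minimal, illustrative example only - one "matrix" (a single row of weights), a toy training set, and a plain gradient-descent tweak; the names and numbers are invented for the sketch, not taken from any library:

```python
# Minimal sketch of training a linear map m(X) = W·X on desired (X, Y) tuples.

def predict(W, x):
    # one "layer": a weighted sum of the inputs
    return sum(w * xi for w, xi in zip(W, x))

def train(pairs, lr=0.05, epochs=500):
    W = [0.0, 0.0]  # an untrained 1x2 "matrix"
    for _ in range(epochs):
        for x, y in pairs:
            err = predict(W, x) - y  # how far off this (X, Y) tuple is
            # tweak the weights towards producing the right Y
            W = [w - lr * err * xi for w, xi in zip(W, x)]
    return W

# Training set of desired (X, Y) tuples; the hidden rule is y = 2*x0 + 3*x1
pairs = [([1, 0], 2), ([0, 1], 3), ([1, 1], 5), ([2, 1], 7)]
W = train(pairs)
print(round(predict(W, [3, 2]), 2))  # a plausible output for a new X (≈ 12.0)
```

Note that no rule "y = 2*x0 + 3*x1" appears anywhere in the code: the trained weights just are whatever black box the tuples pushed them towards, which is the point made above.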
aporiap July 28, 2018 at 21:41 #200979
Reply to Heiko
The AIs whose construction is inspired by the human brain are merely a bunch of matrices chained together resulting in a map from an input to an output. m(X) = Y. These get trained (in supervised learning at least) by supplying a set of desired (X,Y)-Tuples and using some math.
algorithm to tweak the matrices towards producing the right Y values for the Xes. Once the training-sets are handled sufficiently well chances are good it will produce plausible outputs for new Xes.

Isn't this true for only a subset of AIs? I'm unsure if this is how, for example, a self-navigating, walking Honda robot works, or the C. elegans worm model, etc. And even in these cases, there is still a self-monitoring mechanism at play -- the optimizing algorithm. While 'blind' and not conventionally assumed to involve 'self-awareness', I'm saying this counts -- it's a system which monitors itself in order to modify or inform its own output. Fundamentally, the brain is the same, just scaled up, in the sense that there are multiple self-monitoring, self-modifying blind mechanisms working in parallel.


[b]These things do not exactly have a representation of their goals - they are that representation.[/b]

They have algorithms which monitor their goals and their behavior directed toward their goals no? So then they cannot merely be the representation of their goals.
Heiko July 28, 2018 at 22:02 #200988
Quoting aporiap
Isn't this true for only a subset of AIs. I'm unsure if this is how, for example a self navigating, walking honda robot works, or the c. elegans worm model, etc.

Sure, there are other methods. But the ones that are derived from the functioning of the human brain, which generally means interconnected neurons passing on signals, are usually expressed that way.

Quoting aporiap
They have algorithms which monitor their goals and their behavior directed toward their goals no?

The whole program is written to fulfill a certain purpose. How should it monitor that?
aporiap July 28, 2018 at 23:10 #200997
Reply to Heiko
Sure there are other methods. But the ones that are derived from the functioning of the human brain, which generally means interconnected neurons passing on signals are usually expressed that way.

I still think neural networks can be described as self-monitoring programs: they modify their output in a goal-directed way in response to input. There must be learning rules operating by which the network takes into account its present state and determines how this state compares to a more optimal state that it's trying to achieve. I think that comparison and learning process is an example of self-monitoring and modification.

The whole program is written to fulfill a certain purpose. How should it monitor that?


I was wrong to say it monitors its own goals; rather, it monitors its own state with respect to its own goals. Still, there is such a thing as multi-task learning - and forms of AI that can do so can hold representations of goals.
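The "monitors its own state with respect to its goals" idea can be made concrete with a toy loop in Python. Everything here is an invented stand-in (the goal, the learner, the threshold), purely to illustrate the shape of the argument: the program repeatedly measures its own error against a target and modifies itself until the two are close enough.

```python
# A learner that monitors its own state relative to a goal: it tracks its
# error against the target and stops adjusting itself once the error is small.

def self_monitoring_fit(samples, goal_error=1e-3, lr=0.1, max_steps=10_000):
    w = 0.0  # the learner's current internal state
    for step in range(max_steps):
        # measure how far the current state is from the goal (mean squared error)
        error = sum((w * x - y) ** 2 for x, y in samples) / len(samples)
        if error < goal_error:
            return w, step  # the self-monitoring check: state vs. goal
        # modify its own state in response to the measurement
        grad = sum(2 * (w * x - y) * x for x, y in samples) / len(samples)
        w -= lr * grad
    return w, max_steps

samples = [(1, 2), (2, 4), (3, 6)]  # hidden rule: y = 2x
w, steps = self_monitoring_fit(samples)
print(round(w, 2))  # ≈ 2.0
```

Whether that comparison loop deserves to be called "self-awareness" is exactly the question under dispute in this thread; the sketch only shows that the monitoring itself is mechanically cheap.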
wellwisher July 29, 2018 at 12:23 #201159
The fundamental difference between computers and the brain is that neurons are designed differently from computer memory. Neurons, at rest, are at their highest potential. When a neuron fires, it lowers potential. Computer memory works the opposite way: at rest, computer memory is at lower potential. This is useful for long-term storage.

If computer memory was designed like neurons, it would not be stable in storage. It would be subject to spontaneous change, as the chemical potential attempts to lower. The brain has a way to deal with this, allowing spontaneous creative change using the laws of physics and chemistry. At the same time, it maintains high energy continuity.

For example, say we designed a future computer using high energy memory. We would need a backup version of the memory, using traditional low energy memory. We allow the high energy memory to be triggered, so it spontaneously lowers potential. This movement of potential rearranges the furniture, so to speak. We then compare the two memories, to filter out any useful change. We then rewrite the high potential memory back to the starting point, while adding useful changes.

In this scenario, the change in the high energy memory is not based on computer instructions or software, but on the physical pathways needed to lower chemical potential. This gives the memory liberty to find the best paths, which may not be part of any previous logic: creativity. We continue the cycling until the pathways reach a steady state that maximizes energy flow.

Next, we add a secondary high energy memory that will use the energy change profile of the primary as the trigger to ignite the spontaneous change in the secondary memory. Now we are getting closer to self-awareness. The brain does this through well-worn ancient genetic pathways in the primary, which trigger a wide range of self-feedback: feelings, sensations, emotions, etc. This occurs at the same time it triggers spontaneous change in the secondary.

The energy flow is based on free energy, which is composed of enthalpy and entropy. Free energy has a natural logic, based on the laws of physics, which are universal. This flow does not need manmade language, although manmade language impacts how the high energy memory of the secondary moves the potential around. This helps to create a disconnect with the secondary: consciousness. The primary cannot turn the secondary into a clone of itself, due to manmade language. One becomes self-aware of the separation while still feeling overlap.
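Read charitably, the trigger-compare-rewrite cycle described in this post is shaped like a simple hill-climbing loop: perturb a working copy, compare it against a stable backup, and keep only the useful changes. A loose caricature in Python - the scoring function and all names here are invented stand-ins for illustration, not anything the post specifies about real chemistry:

```python
import random

# Caricature of the two-memory cycle: a "high energy" working copy is
# randomly perturbed (standing in for spontaneous potential-lowering),
# compared against the stable backup, and useful changes are rewritten in.

def cycle(memory, score, rounds=200, rng=random.Random(0)):
    backup = list(memory)  # the stable, low-energy backup copy
    for _ in range(rounds):
        working = list(backup)
        i = rng.randrange(len(working))
        working[i] += rng.uniform(-1, 1)  # "rearranging the furniture"
        if score(working) > score(backup):  # filter out any useful change...
            backup = working  # ...and rewrite the backup to include it
    return backup

# Invented goal for the sketch: match a target pattern
target = [1.0, -2.0, 0.5]
score = lambda m: -sum((a - b) ** 2 for a, b in zip(m, target))
result = cycle([0.0, 0.0, 0.0], score)
```

None of this captures the post's claims about free energy or consciousness, of course; it only shows that the compare-and-filter cycle, taken by itself, is an ordinary search procedure.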