You are viewing the historical archive of The Philosophy Forum.
For current discussions, visit the live forum.

Why aliens will never learn to speak our language

Qmeri December 09, 2019 at 05:35 11225 views 47 comments
The main reason we have not been able to replicate human conversation with computers is that we use mirroring in human speech. This means that we trust that our phrases cause almost the same associations in the minds of the participants of the conversation. And then we just have to modify these associations a little to understand one another.

The problem is that our associations depend on almost everything that makes up a human mind. They are affected by the mood of the situation, what things look like, what the current events are and how they affect the particular group that is talking, our human needs and priorities, and other things that are very particular to human programming.

This means that our speech works only between systems that share almost the same human programming, so that the phrases cause almost the same associations. We can see this even between humans of different cultures. Even if the cultures speak the same language, it becomes hard for them to understand each other if the phrases and contexts cause different associations in those cultures.

Because of this:
A - we will never have a fluent conversation with aliens unless they are programmed almost exactly like us.
B - we will not program an AI that can speak a human language in the foreseeable future, because we don't have the empirical knowledge of how the human mind is programmed, and so we cannot replicate that programming in an AI and thus enable the AI to use mirroring.
C - if a single human changed his programming in a major way (for example by emphasizing logic in his thinking beyond normal) he would gradually lose his ability to fluently communicate with other people unless other people changed at the same rate.

Not that this means we can't communicate in any way in these situations. Logical languages like mathematics are still a way to communicate even without mirroring.

Comments (47)

Qmeri December 09, 2019 at 05:43 #360924
This also means that the Turing test is a bad test for general intelligence. It just tests whether or not something is programmed in a way that can closely replicate human programming.

It's probably an inefficient and unnecessarily complex way to achieve general intelligence in a transistor-based system to try to replicate the programming of a particular neuron-based system that we know has a huge number of unnecessary quirks and flaws. This is why we should remove human speech from the list of things we are trying to make our general-intelligence AIs able to do.
god must be atheist December 09, 2019 at 06:24 #360932
Quoting Qmeri
The problem is that our associations are dependent on almost everything that makes up a human mind. They are affected by the mood of the situation, how things look like, what the current events are and how they affect the particular group that is talking, our human needs and priorities and other things that are very particular to human programming.


Well put, but I don't see why all aliens must lack the ability for human-like mirroring. Some aliens may have had experiences and developments in their evolutionary past that are similar to human experiences and developments. This is what you need to show is impossible. I don't think this can be shown in an a priori manner.
Qmeri December 09, 2019 at 07:03 #360936
Reply to god must be atheist Quoting god must be atheist
Well put, but I don't see why all aliens must lack the ability for human-like mirroring. Some aliens may have had experiences and developments in their evolutionary past that are similar to human experiences and developments. This is what you need to show is impossible. I don't think this can be shown in an a priori manner.


I'm not actually saying that no aliens are similar enough to use mirroring with us - just that our coming into contact with those particular, very rare aliens would be so improbable that in practice it will never happen. Although, I guess we will never have to be in live contact in order for them to learn our language. Still, I think those aliens would be so rare that even the recordings we leave will never be discovered by them.

The text even specifies that "unless the aliens are programmed almost exactly like us".
aporiap December 09, 2019 at 07:35 #360938
Not every association is human-specific. A bee, a rat, a dog, and a human all have to navigate around the front door of a house to get in. They all represent that door as a barrier of some kind. If it were possible for them to converse, that door and its association as a barrier of a sort is completely conceivable as common to all of them.

The point is that many objects in the world, even with some difference in senses, are commonly perceivable, and many of the problems faced by different kinds of life overlap, so it makes sense that the vocabulary and language of those organisms, if existent, would also overlap enough to allow communication. I don't see why an alien that can sense and perceive us and our surroundings, and ascribe value to those different things, couldn't communicate with us in terms of those things.
Qmeri December 09, 2019 at 07:43 #360939
Reply to aporiap I'm actually talking about fluent conversation here - like what would pass a Turing test. But I do agree that, while it would always be slow and awkward, we could use pre-existing words and phrases to communicate about things common to us. A lot of time would be spent dealing with all the extra wrong associations and unmeant ways of approaching the common subjects, but some of our associations would be common and useful. For anything complex it would be much more useful to use something without mirroring.
TheMadFool December 09, 2019 at 07:52 #360940
Quoting Qmeri
The main reason we have not been able to replicate human conversation with computers is that we use mirroring in human speech. This means that we trust that our phrases cause almost the same associations in the minds of the participants of the conversation. And then we just have to modify these associations a little to understand one another.

The problem is that our associations depend on almost everything that makes up a human mind. They are affected by the mood of the situation, what things look like, what the current events are and how they affect the particular group that is talking, our human needs and priorities, and other things that are very particular to human programming.

This means that our speech works only between systems that share almost the same human programming, so that the phrases cause almost the same associations. We can see this even between humans of different cultures. Even if the cultures speak the same language, it becomes hard for them to understand each other if the phrases and contexts cause different associations in those cultures.

Because of this:
A - we will never have a fluent conversation with aliens unless they are programmed almost exactly like us.
B - we will not program an AI that can speak a human language in the foreseeable future, because we don't have the empirical knowledge of how the human mind is programmed, and so we cannot replicate that programming in an AI and thus enable the AI to use mirroring.
C - if a single human changed his programming in a major way (for example by emphasizing logic in his thinking beyond normal) he would gradually lose his ability to fluently communicate with other people unless other people changed at the same rate.

Not that this means we can't communicate in any way in these situations. Logical languages like mathematics are still a way to communicate even without mirroring.


Perhaps of some relevance is our ability to "understand" animals. I don't know how much we've progressed in the field of animal communication, but there are various clearly unambiguous expressions, e.g. a dog's growl, that we seem to have understood. Whether we can extrapolate animal-human communication to human-alien exchanges is an open question.

Personally, if the universe is really as uniform as we assume, then language would be either visual- or audio-based, which narrows the possibilities sufficiently to permit alien-human communication.

However, as you mentioned, we know for a fact that human languages are unintelligible to each other, and the distance between human languages is likely much, much less than that between human languages and alien languages.

If I understand what you mean by "mirroring", it plays an important part when the subject of discussion is privileged in some sense, i.e. there exists a certain association that isn't common knowledge and it's that particular link you want to convey. Under such circumstances communication can break down, but these are rare occasions; otherwise, how on earth are people able to make sense of each other? Civilization would collapse if this problem were just a tad more common.

Unfortunately, depending on your outlook, "important" discourses are highly susceptible to the "mirroring" problem. For instance, in difficult subjects we need to make the right associations, and that may be difficult, especially for novices and even for experts.

So, you're right that alien-human communication may be harder than imagined, but I'm going to bet my money on the "higher" intelligence of ET to see us through that roadblock.
Qmeri December 09, 2019 at 08:04 #360943
Reply to TheMadFool Quoting TheMadFool
If I understand what you mean by "mirroring", it plays an important part when the subject of discussion is privileged in some sense, i.e. there exists a certain association that isn't common knowledge and it's that particular link you want to convey. Under such circumstances communication can break down, but these are rare occasions; otherwise, how on earth are people able to make sense of each other? Civilization would collapse if this problem were just a tad more common.


The mirroring isn't just about associations which can be learned. It's also about the way things are just processed by the brain of your species. For example, if your brain processes visual information by prioritizing colours first and then using that information to find lines of contrast, the resulting associations in your system will differ from systems which use brightness to find lines of contrast. And when the millions of these systems that create a human mind end up creating our particular associations, the probability that an alien will have a system capable of learning similar enough associations for the mirroring we use in our language is almost zero.

This is why we can't just define the correct associations for a given word or phrase and expect an AI or another species to be able to use it. The underlying programming that ends up choosing those associations in any given context in humans is as complex as the human mind itself, and therefore we don't even know it ourselves. And therefore we can't teach it.
Qmeri December 09, 2019 at 08:17 #360944
Reply to TheMadFool Quoting TheMadFool
Perhaps of some relevance is our ability to "understand" animals. I don't know how much we've progressed in the field of animal communication, but there are various clearly unambiguous expressions, e.g. a dog's growl, that we seem to have understood. Whether we can extrapolate animal-human communication to human-alien exchanges is an open question.


A good point. But we have to understand that our evolutionary history with animals is not just similar - it is in large part the exact same history. And it is also a history in which we share a common environment, where evolution has simply created ways for different species to communicate things like "danger" or "no threat" or acceptance to each other.

Because of this, we can "understand" animals and communicate with them about certain simple things with this simple interspecies mirroring. (I'm not even sure this is mirroring, since the same associations don't seem to come from similar programming, but from having been learned from other sources.) Nothing as complex as our language could be used between things with such major differences in programming, though.
TheMadFool December 09, 2019 at 08:45 #360947
Reply to Qmeri If I understand correctly, then your "mirroring" argument depends on the multitude of ways information may be transmitted through any given medium of communication. I'm not qualified to comment on that, but if evolution is true then there must be some logic to how our senses, our input/output devices, evolved. We can look at the communication systems in humans, presumably the most intelligent lifeform, and examine how they evolved. A fair estimate would be that such systems evolved to maximize information-carrying capacity, e.g. color-discerning ability gives us access to more information than just light-shade contrast vision.

If that's the case, then evolution on other planets would also proceed in a similar enough way to make the communication systems of all life in the universe converge rather than diverge. This would mean that, contrary to your argument, the "mirroring" abilities of lifeforms in the universe may not be so radically different from each other as to render communication impossible.
sime December 09, 2019 at 09:00 #360950
The very definition of 'alien' is in terms of the respective entity's tendency or capacity to mirror and predict our stimulus-responses for its own survival. The Turing 'Test' is a misnomer, for the test constitutes a natural definition of intelligence. If we cannot interpret an entity's stimulus-responses as acting in accordance with the basic organising principles of human culture, then as far as we are concerned, the entity isn't worthy of consideration. So to a large extent, the ability of aliens to speak 'our language' is presupposed in our definitional criteria.
Qmeri December 09, 2019 at 09:29 #360954
Reply to TheMadFool Quoting TheMadFool
If I understand correctly, then your "mirroring" argument depends on the multitude of ways information may be transmitted through any given medium of communication. I'm not qualified to comment on that, but if evolution is true then there must be some logic to how our senses, our input/output devices, evolved. We can look at the communication systems in humans, presumably the most intelligent lifeform, and examine how they evolved. A fair estimate would be that such systems evolved to maximize information-carrying capacity, e.g. color-discerning ability gives us access to more information than just light-shade contrast vision.

If that's the case, then evolution on other planets would also proceed in a similar enough way to make the communication systems of all life in the universe converge rather than diverge. This would mean that, contrary to your argument, the "mirroring" abilities of lifeforms in the universe may not be so radically different from each other as to render communication impossible.


The problem isn't that evolution doesn't cause things to converge on large scales. The problem is that evolution never creates any kind of "ultimate" or "perfection". Different ways of processing information can work better in different environments. They can work differently if things defined very early in evolution, like the replication mechanisms of cells, are different. They can simply be non-optimal vestiges from earlier evolution. And many times different systems can all work just as well - making no difference for evolution, but still changing the particularities of how associations work in your species (assuming that the species even uses association-based communication).

This is combined with the fact that our language requires extremely similar associations to happen. When basic things like "shape" start to mean fundamentally different things simply because one system uses colour to define contrasting lines and another uses brightness (which isn't an inferior method in many cases), it doesn't require most things to be programmed differently. Even a fraction of a percent of difference in programming can reliably cause enormous changes in the end result. This is the reason why complex mirroring requires such precise similarity from the systems that use it.
ovdtogt December 09, 2019 at 09:32 #360955
Reply to Qmeri An interesting podcast you should listen to is one in which a scientist describes how the mind makes sense of reality. It's called the hallucinary mind.
Qmeri December 09, 2019 at 09:39 #360959
Reply to sime Quoting sime
The very definition of 'alien' is in terms of the respective entity's tendency or capacity to mirror and predict our stimulus-responses for its own survival. The Turing 'Test' is a misnomer, for the test constitutes our natural definition of intelligence. If we cannot interpret an entity's stimulus-responses as acting in accordance with the basic organising principles of human culture, then as far as we are concerned, the entity isn't worthy of consideration. So to a large extent, the ability of aliens to speak 'our language' is presupposed in our definitional criteria.


I very much disagree that our definition of general intelligence should be associated with the Turing test. That would be the same as defining "a car" to be only those things that are nearly exactly the same as a Volkswagen Beetle, since fluent human speech requires one to be capable of reproducing human programming almost exactly. Even a human with all of our quirks and flaws removed could not speak with us fluently - he would be confused by most of the associations we make with our language, only being capable of using it through definitions and logic for the most part, which is not mirroring. Are you saying that a human without our quirks and flaws is not intelligent?

If a system is capable of gathering information about its environment and making predictions based on it, and if it is capable of independently creating complex technologies and solutions based on its information, and just generally doing the things that we humans are capable of with our "intelligence", then it is intelligent whether or not it can speak a human language.
ovdtogt December 09, 2019 at 09:41 #360961
Reply to Qmeri If we can imagine an alien, it would by definition not be alien.
mcdoodle December 09, 2019 at 09:44 #360963
Reply to Qmeri I think the main reason we can't talk fluently with computers is that we talk, and interpret talk, with our bodies, not our 'minds'. That's how mind-reading and 'mirroring' happen, through mutual body-reading.

As for the future: never say never. I saw 'Arrival', so I know Amy Adams will know how to communicate with the aliens, if no-one else can.
ovdtogt December 09, 2019 at 09:57 #360969
Reply to mcdoodle How do telephone conversations work for you?
TheMadFool December 09, 2019 at 10:12 #360978
Quoting Qmeri
This is the reason why complex mirroring requires so precise similarity from the systems that use it.


There's enough elbow room in convergent evolution to make inter-species communication impossible, and in fact there have been no recorded cases of such events. Each species seems confined to its respective domain as far as language is concerned.

However, if there's anything in favor of communication still being possible, it's the shared environment. Arguably, hydrogen on earth would be identical to hydrogen anywhere else in the universe. In fact this assumption has been used for an attempt at alien communication - the golden record on the Voyager spacecraft.

ovdtogt December 09, 2019 at 10:16 #360979
A mathematical language will be understood throughout the universe outside of black holes.
sime December 09, 2019 at 10:20 #360980
Reply to Qmeri

Recall that in the Turing Test, a human evaluator has to decide, purely on the basis of reading or hearing a natural-language dialogue between two participants, which of the participants is a machine. If he cannot determine the identities of the participants, the machine is said to have passed the test. Understood narrowly, as referring to a particular experimental situation, yes, the Turing Test fails to capture the broader notion of intelligence. But understood more broadly, as an approach to the identification of intelligence, the Turing test identifies, or rather defines, intelligence pragmatically and directly in terms of the behavioural propensities that satisfy human intuition. The test therefore avoids metaphysical speculation as to what intelligence is or is not in an absolute sense independent of human intuition.

Qmeri December 09, 2019 at 10:22 #360981
Reply to TheMadFool Quoting TheMadFool
However, if there's anything in favor of communication still being possible is the shared environment. Arguably Hydrogen on earth would be identical to Hydrogen anywhere else in the universe. In fact this assumption has been used for an attempt at alien communication - the golden record on the voyager spacecrafts.


Yes, and that is exactly a form of communication that doesn't use mirroring - a logical language which is based on definitions. Definitions don't need mirroring, since they are defined the same regardless of what you associate with them. And that's what our communications with aliens and AIs will be like - making definitions and saying things simply through those definitions. It's much slower, and the things we don't know how to define by purely logical means become nearly impossible to talk about.
TheMadFool December 09, 2019 at 10:28 #360984
Quoting Qmeri
Yes, and that is exactly a form of communication that doesn't use mirroring - a logical language which is based on definitions. Definitions don't need mirroring, since they are defined the same regardless of what you associate with them. And that's what our communications with aliens and AIs will be like - making definitions and saying things simply through those definitions. It's much slower, and the things we don't know how to define by purely logical means become nearly impossible to talk about.


Just curious, what exactly do you mean by "mirroring"?

Qmeri December 09, 2019 at 10:42 #360986
Reply to sime Quoting sime
Recall that in the Turing Test, a human evaluator has to decide purely on the basis of reading or hearing a natural language dialogue between two participants, which of the participants is a machine. If he cannot determine the identities of the participants, the machine is said to have passed the test. Understood narrowly as referring to a particular experimental situation, yes the Turing Test fails to capture the broader essence of intelligence. But understood more broadly as an approach to the identification of intelligence, the Turing test identifies and defines intelligence pragmatically and directly in terms of behavioural propensities that satisfy human intuition. The test therefore avoids metaphysical speculation as to what intelligence is or is not in an absolute sense.


If the "natural language" is specifically defined not to use mirroring, I might agree with the broader definition of the Turing test. Mirroring would always give an advantage to the human participant, since the evaluator, being programmed in such a similar way, would understand his words better.

But no - even then the test simply doesn't work for anything but finding things that can reproduce the particular way humans are programmed. It is much harder to replicate the behavior of a thing that is on your level or lower than it is to just be on its level or higher. The test just can't be defined in any way where a human evaluator decides which participant is human. Mirroring just makes that too easy no matter how intelligent the other participant is - no matter what language is used, since every kind of expression causes associations in the human mind.

With this test, a system which could do everything a human can, except predict some particular associations humans get from specific phrases in specific contexts for reasons even they don't know, would not pass. Even if it solved every big problem we humans have not yet solved and explained the reasons for its own goals, it would not pass the Turing test, since the evaluator could identify it as the machine.
Qmeri December 09, 2019 at 10:55 #360991
Reply to TheMadFool Mirroring is anything where the way you are is used to predict the way something else is. For example, in our language we just assume that our associations have something to do with the thing someone else said, just because we have those associations. It doesn't work all the time, and we do modify our thoughts about what someone meant by what we know of him, but as a basis, our language simply uses mirroring to predict what others mean. Very fast - it doesn't need definitions, but it does require everyone to be programmed in a very similar way.
sime December 09, 2019 at 11:16 #361002
Reply to Qmeri

Yes, the Turing test is anthropomorphic, but why is that a problem in the absence of an 'objective' alternative?

Not even a logical language can be identified without mirroring. Recall Wittgenstein's example of an alien tribe stamping their feet and grunting in a way that is compatible with the rules of Chess. Only if we recognised their culture as being similar to ours might we assert they were playing Chess.
ovdtogt December 09, 2019 at 11:19 #361003
Reply to Qmeri Reply to TheMadFool

Quoting sime
Recall Wittgenstein's example


Fame has this mystical quality of turning shit into gold. I think this is what the alchemists were looking for all the time.
Often the difference between something being profound or crazy is the person saying it.

The difference between great and mediocre contemporary art is the artist who made it.

Qmeri December 09, 2019 at 11:26 #361005
Reply to sime I agree with you that we lack a good definition of general intelligence. But as my example demonstrates - a thing that is clearly as intelligent as us but can't predict all our associations - even our intuition doesn't agree with the Turing test about what is intelligent. We need to keep working to understand what intelligence is, and as I currently see it, the way the Turing test is used in this work, and in things like AI development, diverts us onto a path that is harmful. It is quite obvious that a transistor-based general intelligence doesn't need to be able to speak any language in a way indistinguishable from humans, and that that would be an inefficient and unnecessarily complex way to program general intelligence - yet people tend to see that as an important goal right now. Harmful, I say!
ovdtogt December 09, 2019 at 11:29 #361008
Quoting Qmeri
I agree with you that we lack a good definition for general intelligence


My clock is intelligent. It can tell me the time.
TheMadFool December 09, 2019 at 12:54 #361055
Quoting Qmeri
Mirroring is anything where the way you are is used to predict the way something else is. For example, in our language we just assume that our associations have something to do with the thing someone else said, just because we have those associations. It doesn't work all the time, and we do modify our thoughts about what someone meant by what we know of him, but as a basis, our language simply uses mirroring to predict what others mean. Very fast - it doesn't need definitions, but it does require everyone to be programmed in a very similar way.


"Prediction" seems the wrong concept to apply to language. I thought that was an astrologer's domain. Language is about information, isn't it, and while that may be useful for making predictions, language itself is solely about transmitting information, so your version of "mirroring" seems a bit off the mark. Perhaps you'll enlighten me.
Gregory December 09, 2019 at 13:04 #361060
Wittgenstein said he could never understand a lion. But could Hercules understand us?
TheMadFool December 09, 2019 at 13:11 #361062
Quoting ovdtogt
My clock is intelligent. It can tell me the time.


:rofl:
TheMadFool December 09, 2019 at 13:14 #361065
Quoting ovdtogt
Fame has this mystical quality of turning shit into gold. I think this is what the alchemists were looking for all the time.
Often the difference between something being profound or crazy is the person saying it.

The difference between great and mediocre contemporary art is the artist who made it.


Yes, and it makes us wonder if we're mistaking one for the other in every possible way, which I think can happen in only 2 ways - and what a coincidence that number 2 means shit.
Qmeri December 09, 2019 at 13:31 #361066
Reply to TheMadFool Quoting TheMadFool
"Prediction" seems the wrong concept to apply to language. I thought that was an astrologer's domain. Language is about information, isn't it, and while that may be useful for making predictions, language itself is solely about transmitting information, so your version of "mirroring" seems a bit off the mark. Perhaps you'll enlighten me.


When writing "predict", I actually thought of using the word "evaluate", but it simply felt a little off. I agree that I probably should have used "evaluate" or "judge" instead. What I meant was: "If you consider something to be some way because you are some way, you are using mirroring." In our communication we need a way to evaluate what someone means by their language, and we usually use a lot of mirroring to make our evaluations.
ovdtogt December 09, 2019 at 14:01 #361075
Quoting TheMadFool
Yes and makes us wonder if we're mistaking one for the other in every possible way which I think can happen in only 2 ways and what a coincidence that number 2 means shit.


It shows that the need for Gods persists in modern society.
ovdtogt December 09, 2019 at 14:10 #361076
Reply to Qmeri
I think your OP should read 'Why aliens will never learn to understand our language', for understanding precedes speaking.
Every newborn child is an alien.
sime December 09, 2019 at 14:15 #361079
Quoting Qmeri
I agree with you that we lack a good definition of general intelligence. But as my example demonstrates - a thing that is clearly as intelligent as us but can't predict all our associations - even our intuition doesn't agree with the Turing test about what is intelligent. We need to keep working to understand what intelligence is, and as I currently see it, the way the Turing test is used in this work, and in things like AI development, diverts us onto a path that is harmful. It is quite obvious that a transistor-based general intelligence doesn't need to be able to speak any language in a way indistinguishable from humans, and that that would be an inefficient and unnecessarily complex way to program general intelligence - yet people tend to see that as an important goal right now. Harmful, I say!


Whether or not a particular Turing test is appropriate in a given situation is largely a question concerning the breadth of the test. For example, if testing whether a computer 'really' understands Chess, should the test be very narrow and concern only its ability to produce good chess moves? Or should the test be very broad, to even include the ability of the computer to produce novel metaphors relating chess to the human condition?

Personally, I don't interpret the spirit of the Turing test as making or implying ontological commitments regarding how AI should be programmed or trained, or as to how intelligence should represent sensory information with language, or even as to what intelligence is or whether it is ultimately reducible to metrics. Neither do I understand the spirit of the Turing test as being prescriptive in telling humans how they ought to judge a participant's actions. Rather, I understand Alan Turing as very modestly pointing out the fact that humans tend to recognise intelligence in terms of situationally embedded stimulus-response dispositions.

In other words, the specifics of what goes on inside the 'brain' of a participant is considered to be relevant only to the functional extent that the brain's processes are a causal precondition for generating such situationally embedded behavioural repertoires; the meaning of language and intelligence being undetermined regarding the implementation of stimulus-response mappings.

Indeed, an important criterion of intelligence is the ability to generate unexpected stimulus-responses. Hence any formal and rigid definition of intelligence solely in terms of rules, whether internal in describing computational processes inside the brain, or situationally in terms of stimulus-response mappings, would be to a large extent an oxymoron.


ovdtogt December 09, 2019 at 14:24 #361085
Quoting sime
Indeed, an important criterion of intelligence is the ability to generate unexpected stimulus-responses.


The slug that crawls over my carpet and leaves a trail behind in the morning is intelligent because I haven't been able to find out where he hides himself. Or maybe I am just dumb.
TheMadFool December 09, 2019 at 14:46 #361094
Quoting ovdtogt
It shows that the need for Gods persists in modern society.


For a good reason or bad?
ovdtogt December 09, 2019 at 14:55 #361098
Quoting TheMadFool
For a good reason or bad?


Do we take aspirin for a good reason or bad? Both I think. Good that we have it, bad that we need it.
TheMadFool December 09, 2019 at 17:02 #361126
Quoting ovdtogt
Do we take aspirin for a good reason or bad? Both I think. Good that we have it, bad that we need it.


If given a choice would you adopt atheism because of the bad reasons or become a theist for the good reasons?

ovdtogt December 09, 2019 at 17:27 #361139
Reply to TheMadFool

If I could, I would prefer to believe in a benevolent God.
TheMadFool December 09, 2019 at 17:28 #361140
Quoting ovdtogt
If I could, I would prefer to believe in a benevolent God.


:up:
ovdtogt December 09, 2019 at 17:31 #361142
Quoting ovdtogt
If I could, I would prefer to believe in a benevolent God.


Wouldn't that be great? Like having super powerful parents. Anytime you need something you can go and binge off them and stay a child for the rest of your life.
mcdoodle December 10, 2019 at 16:06 #361526
Reply to ovdtogt One interesting thing if you watch people talking on the telephone is how they cannot help gesturing and communicating with their face and eyes to the interlocutor who isn't there. And the gesturing in a stifled form gets through, just as 'acting' in a radio play gets through to the listener; the hearer can recognise, for instance, the stilted delivery of someone being inexpressive because they're on a crowded train at the other end of the phone-line.
aporiap December 27, 2019 at 19:46 #366508
Quoting Qmeri

Reply to aporiap I'm actually talking about fluent conversation here, like what would pass a Turing test. But I do agree that, while it would always be slow and awkward, we could use pre-existing words and phrases to communicate about things common to us. A lot of time would be spent dealing with all the extra wrong associations and unintended ways of approaching the common subjects, but some of our associations would be common and useful. For anything complex it would be much more useful to use something without mirroring.

Why couldn't you have fluent conversation? I mean, as humans, we can appreciate how valuable a bone-toy is to a dog, how a nest is essential to the life of a bird. Surely, if we could converse, we could comment about those things even though they aren't associations held in common with us. We can see and understand associations that are unrelated to us. Why couldn't a hypothetical intelligent extraterrestrial capable of learning about us do the same, and why couldn't we do the same with them?
TheHedoMinimalist December 28, 2019 at 21:18 #366786
By the title of this thread, I thought the OP was going to be a rant about illegal immigrants :lol:
Relativist December 28, 2019 at 22:05 #366796
Quoting Qmeri
The main reason we have not been able to replicate human conversation with computers is because we use mirroring in human speech. This means that we trust that our phrases cause almost the same associations in the minds of the participants of the conversation

You're kinda hitting on the reason we have communications problems with other humans: different mental associations. I'd say that no two people have the exact same associations, not when even moderately complex concepts are involved. The more unlike the people are, the less effective the communication. I agree that with aliens, the differences would be stark. On the other hand, it seems that some communication would be possible - I'd expect there'd still be recognizable referents for objects and actions, and the relations between them. Discussion of art or politics would probably be hopeless.


Qmeri June 14, 2024 at 02:01 #910134
My only mistake in terms of predicting the AI was that the correlation/association-based-AI aka the monkeylike AI would be as comparably different from the average human as I am... I was correct about the AI and myself, but it was revealed that mankind is in practice just the lowest level monkeys possible as chatgpt has now proven... monkeys are indistinguishable from a monkey level AI... idiotic and I give up... fuck you monkeys!