Purpose of humans is to create God on Earth
I've noticed that you could make the argument that human history has been, in part, the process of creating something much more powerful than humans. Our technology has grown progressively more powerful; it has enabled humans to get better and better at creating, and it is now on the verge of creating things on its own. Machine learning now enables software to create beautiful art and to write music. Robots can now piece together complex machinery, controlled by software.
Could it be that we are unknowingly creating god on earth? Were we put here to begin a chain of reactions leading to a technological god emerging on Earth?
Comments (39)
Bronson, can I please have your definition of what this god is?
I think it should be easier to define god by what it isn't.
According to pantheists, what isn't god, isn't.
According to Christians, what isn't god is evil. After god expectorated evil from himself, it became a stand-alone, autonomous entity.
According to Greek Mythology, what isn't god is not ever-living.
According to ancient Egyptian mythology, what isn't god, consumes.
According to us, atheists, what isn't god, is everything.
Yes, our purpose is to create God on Earth.
The bible is even clear on this if you read it the right way. The Gnostic Christian way.
That is why all of our gods are man made. We are just not supposed to let ourselves be subsumed by them.
https://www.youtube.com/watch?v=vJ1PDxeUynA
Regards
DL
With that said, yes, a thing that humans can be good for, a purpose we can serve, is to create (and become one with) something that could roughly be called a god, something immensely more knowledgeable, powerful, and benevolent than anything else that exists in the universe, including ourselves.
Of course there's a lot of risk in that project and we might end up creating an "evil demon" or the like instead. Or we might fail at creating anything in that vicinity at all. But there's hope that we can create a powerful force for good, and that's a purpose humans should serve if we can.
You might suggest that we adopt "make a god" as our purpose. But to suppose that we unknowingly have the purpose of creating a god strikes me as a misunderstanding of purpose. How could there be purpose without intent? But that is what you propose.
And while we could create something very powerful by our standards, omniscience, omnipotence and those other characteristics of God are in a different category to the stuff we might make.
What is the point of calling a refrigerator God just because it has this amazing ability to preserve chicken wings that humans don't innately have? Why not call an electric toothbrush or a life-support machine a God, too?
Why wouldn't humans be 'God' instead, seeing as they are nothing but machines that created 'better' machines? Is the only distinction stupidity?
What's with the advanced-robotics obsession? Is it not true that half of today's low-end electronics already exceed natural human capacities anyway? What's the point in calling a shiny pile of junk that builds other shiny piles of junk a God? How does this demonstrate the "purpose of humanity"?
Are you saying capacity = purpose? Would that apply to eating, sleeping, and fucking as well? Why wouldn't it; are they not indispensable? The OP sounds like a personal thing, does it not?
1. Eventually, technology could get so advanced on earth that it might be possible for there to be some technology powerful enough to be considered the technological god.
2. The technological god could not come about without corporeal beings creating it.
3. If humans are the only corporeal beings on earth potentially capable of creating the technological god, then human purpose is to create the technological god.
4. Humans are the only corporeal beings on earth that are potentially capable of creating the technological god.
5. Therefore, human purpose is to create the technological god.
Regarding premise (1), it seems plausible that technology could become so advanced as to be more powerful than humans. The less plausible part is calling that technology the technological god. What would a technological god be? At the very least it should probably be omniscient and/or omnipotent. How would technology manage that? How could it gather every single piece of information about every single thing on earth, or about the world, while existing within it? How could it make predictions about a constantly changing world? Perhaps it is possible, I'll grant. But it is hard to see how a technology that was created, that has a starting point, could be omniscient, because it cannot know everything that happened before it was created. No amount of recorded history, biology, chemistry, or physics could give the technology more than humans know about any of those subjects. Humans cannot give it more knowledge or information about the past than they themselves have. Perhaps someone could object that an all-powerful, omniscient technological god could still be created, but it is difficult to think about the technology that would be involved, because we do not have that technology or knowledge right now.
Regarding premise (2), I think it sounds good if the assumption is that the technological god is corporeal. Maybe, if the incorporeal God exists, He could summon it down to earth or something like that. But if so, that would defeat the purpose of the question being asked: whether or not it is human purpose to create the technological God. So it’s probably better to assume that God doesn’t intervene like that, and some corporeal being(s) on earth create it.
Regarding premise (3), the conditional seems wrong. It falsely conflates purpose with capability, and it does not explain what the connection between the two is. Humans are capable of lots of things that are not, or do not seem to be, their purpose (e.g. eating, going to the bathroom, killing people).
Regarding premise (4), it is only as far as we know that humans are the only corporeal beings on earth potentially capable of creating the technological god. Even then, it could be argued that humans are not actually capable of doing so. Either way, this premise is questionable enough on its own, especially combined with the objections to the other premises.
Overall the argument seems really weak.
[quote]
According to pantheists, what isn't god, isn't.
According to Christians, what isn't god is evil. After god expectorated evil from himself, it became a stand-alone, autonomous entity.
According to Greek Mythology, what isn't god is not ever-living.
According to ancient Egyptian mythology, what isn't god, consumes.
According to us, atheists, what isn't god, is everything.[/quote]
Ah yeah, my apophatic jam! :up:
[quote=Freddy Zarathustra][i]Man is a rope, tied between beast and overman--a rope over an abyss...
What is great in man is that he is a bridge and not an end: what can be loved in man is that he is an[/i] overture and a going under...
I say unto you: one must still have chaos in oneself to be able to give birth to a dancing star.[/quote]
Once upon a I can't recall when, this passage took me by the throat and dragged me through promethean gutters & alleyways of unworkable, discarded fragments of golems, Frankensteins, robots, Butlerian Jihadis, M5s, HAL 9000s, droids, cylons, replicants, cyborgs, Skynets, T-800s/1000s, uploads, Agent Smiths, Ghost Brigaders, Avas, even hyped AlphaGo series neural nets ... until I realized that for millennia we homo insapiens have been dreaming of electric sheep, that is, not who or what we can/will become but what - which my atavistic gut tells me won't be a "who" - successor species we can/will build that'll inherit the earth from us, or, more likely, the entire solar system all the way out at least roughly a half light-year to the Oort cloud's edge.
"Man is a rope" tied around his own neck that's tightening as the "Beast" pulls back to the past and Nano Sapiens(?) pulls forward to the future. 'The Singularity' might have already happened but, for all we know, we're probably not intelligent enough to recognize or understand it. The last invention humans will ever need or make may have already happened; so why wouldn't a human-level machine intelligence purposely fail every Turing Test?
:chin:
My guess is (might as well keep pulling this out of my butt) the good news is also the bad news: the herd of homo insapiens will be thinned over, say, the next century or two by slowly rolling catastrophes like dozens of meters sea-level rise, mega-urban coastal collapses, fresh water protracted hot wars, blah blah blah ... as the barely surviving remnants are 'nudged' into algorithm constructed and managed 'human reservations' ... while They hyper-multitask nonstop transforming the earth, then perhaps the inner solar system eventually, into their very own apex species niche.
But why zookeep us? Wouldn't it be more efficient (or something) to exterminate us? Surely machines, no matter how intelligent, wouldn't have sentimental attachment to or 'feel' nostalgia for their maker-ancestors, right? Isn't this just pathetic wishful thinking on our (my) part that our AI descendants would protect us from the hazards of our worst selves like providential gods rather than hunt us for sport like inhuman Terminators?
:roll:
The late Iain Banks' Culture novels, for all their galaxy-spanning space operatic utopian extravagance, suggest prospects more realistic, it seems to me, than either dreams of god-making (e.g. breeding Übermenschen, mind-uploaded immortality) or nightmares of hubris-engineered extinction events (e.g. grey goo, Skynet) - we are only one data point of evolved biological intelligence (EBI) and the only such resource with which to (1) run simulations of prospective 'first contacts' with extraterrestrial EBIs (xEBI) and (2) run eugenics-like experiments to try to instantiate 'superhuman level machine intelligence' in precisely engineered living, phenomenological bodies (BMI+). At minimum, maybe, these are some reasons to keep Dodo birds like us around ... in ambiguous utopias / post-scarcity cages ... safe secure & controlled.
Both I think.
Think of Darwin, who posited that there would likely be a finch with a six-inch proboscis long before one was discovered, because he had seen flowers that the finches he knew of, with their short proboscises, could not feed on.
If the short-throated orchids had gone extinct, so would the finches with short proboscises, and the rarer long-proboscis finches would have taken over.
The same would apply to Neanderthals and modern man, although we do not yet know why the Neanderthals went extinct.
Regards
DL
Socrates would agree and so do I.
Have you heard some of the eco warriors who are beginning to say that we should let science decide what to do with our present major extinction event that might include mankind?
They are telling politicians to get out of the way of the scientists and start legislating from science's POV.
Regards
DL
I agree. We need an ecosystem Czar so as to protect it.
Regards
DL
Nuh.
I'd rather keep the commons and teach people to use it well.
In an emergency, there should only be one voice to organize the protective forces.
Spaceship Earth needs a Captain. Think Star Trek and number one's first priority.
Regards
DL
Have you read what the present controller/protector of our rain forest just said about his protecting the rain forests for the world, as he burns them for cows?
"I owe the world nothing".
A Czar is the only one who could make it worthwhile for that President to change his mind and actions.
Regards
DL
Point to you on this.
Morals are usually the last thing someone is elected on.
Trump and our choice of putrid mainstream gods should be all one needs look at for confirmation on this.
Regards
DL
I see a genocidal god lover. Bite me.
Regards
DL
I agree that the technological advancements of today, and those of tomorrow, are extremely progressive and powerful. Many of these advancements can perform tasks humans could not: sorting and categorizing thousands of data points electronically at the click of a button, record-breaking efficiency in the production of materials, immediate worldwide distribution of information, etc.
But I cannot say I endorse your concluding thought/question: that we are “unknowingly creating god on earth.”
Nor the one that follows it: “Were we put here to begin a chain of reactions leading to a technological god emerging on Earth?”
Perhaps I am wondering what you mean by God in this scenario. If you're referring to the omniscient, all-holy, and all-powerful God, then I think your question can be answered by breaking your thought down into three questions:
A. Can you unknowingly create something?
To this I would answer yes:
1. Accidentally spilling paint on a blank piece of paper and creating “art.”
2. Unknowingly creating drama by telling your friend something you didn’t know was supposed to be kept secret.
B. Can you create your creator?
To this I would answer no:
1. A table cannot create the machine/human that built it.
2. Humans cannot create a divine being such as God.
3. We cannot create our parents.
C. Can you create something similar to your creator?
To this I would answer probably:
1. We probably have, or will soon have, the technology to clone humans. Will the clones have the exact memories and experiences of our parents? Probably not, which is why I’ll leave it at “something similar to our creator.”
As answered above, we can unknowingly create, but we cannot unknowingly create our creator (for we cannot even knowingly create our creator).
But C. leaves us with another question: if we can create something similar to our creator, is man-made technology something similar to God?
It seems the “God” as used in your post, is not intended to be defined as the Holy and omniscient God, but rather is presented as analogous to a powerful being (since the trait of power is stressed in your post).
Although, as stated above, technology can perform tasks that humans cannot (just as God can perform tasks that humans cannot), and even with the looming invention of (far from flawless) artificial intelligence, we still have power over technology that we do not similarly have over God. As of now, we have the power to destroy technology, to alter it, to control it, to dismantle it. We do not have that power over a God. This question could become interesting in 50 years, when robots might take over the world; as of now, however, we are still the creators of technology, and can never be the creators of God.
This is why they are not similar enough, and thus why we cannot be unknowingly creating a God.
Firstly, I would question your definition of “God”. It seems the only definition you give is “something (a technological God?) much more powerful than humans”. If that is the only definition to go off of, then this has already been created many times: we have many things more powerful than humans on earth, computers, cars, and AI just to name a few. But I really don't think this is what you meant by a technological God. Perhaps a technological God is one that can create art, music, and perhaps even a “being”. To fully evaluate this way of thinking, we would need a clear definition of a technological God and what it entails to be one.
Second, my main objection to this post is this: if humans were placed on earth to begin a chain reaction that eventually leads to the creation of a technological God, then it would follow that, as humans, we would not be able to find “good” outside of the actions and goals that eventually lead to this technological God. Let me try to explain a little more clearly. If humans' purpose on earth is the creation of a technological god, then it would be impossible to find “good” or purpose in anything other than that creation. However, we find purpose and good in many, many other things, not just the things that will eventually lead to this creation. Now, perhaps we just don't understand that each action we make will eventually lead to the creation of a technological god, but if our purpose is that creation, then I think it follows that we would bend every effort toward it. If we were truly put here to begin a chain of events leading to this end, then that would be our main focus and drive. I think this logic works even if we did not know that our purpose is to create god on earth, because our drive would still consist only of creating technologies that lead to this end.
Quoting bronson
Your question intimates the plausibility of a convincing argument regarding a technological singularity as the ultimate goal and culmination of human proclivities.
I will try to extract and examine this sort of argument.
It is apparent that humanity in general has striven for progress throughout history; our modern world certainly reflects remarkable technologies. Indeed, controversy over an artificial-intelligence singularity currently runs high, with debate over whether it could actually be realized.
But does this mean what it appears you have postulated above? Is humanity predisposed to essentially creating a superior entity?
Perhaps your argument would look something like this:
1. If human innovation is historically directed toward the transcendence of human capacity, then the ultimate human instinct is to establish an entity of intellectual singularity.
2. Human innovation is historically directed toward the transcendence of human capacity.
3. Therefore, the ultimate human instinct is to establish an entity of intellectual singularity. (1, 2 MP)
This argument seems valid; however, I'm not so sure it is sound.
Premise (2), as I have extrapolated it from your post, seems dubious. While human innovation certainly extends our human capacities, is its primary purpose to build toward an intellectual supersession, some sort of superhuman entity?
I tend to think otherwise. While it is true that technology is leading toward the possibility of an artificial intelligence beyond human power (as referred to in the first quotation above), it does not necessarily follow that human innovation is culminating to this realization.
While “technology” by definition implies progression and betterment, many forms of technology are tangents that do not actually culminate in the realization of a superintelligence.
In fact, Artificial Intelligence has been called a science or a goal, rather than a specific technology. This is, at least in part, because it is an uncertain idea; much the same as terraforming. For example, scientific and technological advances in general are remarkable today and certainly lend themselves to intimations of terraforming in the future, but these advances do not necessarily provide practical progress to the idea, which in itself could very well be impossible.
Similarly, myriad examples of human innovations – while they demonstrate a certain level of capacity-expansion – are not “beyond human power” in the way that an AI super-entity is expected to be. Primitive stone wheels, microwaves, satellites and even self-driving cars all serve to expand human ability, but they do not exceed human cognition or even necessarily progress toward such a technology. Even loftier innovations like robots and supercomputers tend to fit a technological niche that does not necessarily portend AI singularity. While the study of supercomputers and AI is beyond the scope of this post, and much beyond my exposure as well, it suffices here to say that human innovation is not necessarily leading toward the establishment of a super-intelligent entity. Even disregarding AI singularity, historical innovation illustrates human extension rather than a practical progression toward “something much more powerful than humans.” Thus, such technological movements of "extension" and "expansion" can be seen as empowering human superiority rather than ultimately undermining it.
In this post, I have tried to extrapolate an argument from your post and examine its plausibility.
I have attempted to show that historical innovative progress does not necessarily converge on an impetus for a transcendent power.
To close, I will list what I think such an objection would look like.
a. If human innovation is historically directed toward the transcendence of human capacity, then the ultimate human instinct is to establish an entity of intellectual singularity.
b. Human innovation is not historically directed toward the transcendence of human capacity; rather, it reflects a complex web of human improvement and extension.
c. Therefore, it is not likely that the ultimate human instinct is to establish an entity of intellectual singularity. (a, b)
Sweet, I always wanted to do that. Do we get to be part of it? Is it the internet?
Philip K. Dick has nothing on you. Let me know when your first (or latest) novel is out!
Not bad, but what if we are to unknowingly become gods ourselves?
“You are an aperture through which the universe is looking at and exploring itself”
-- Alan Watts
Thanks! Will do ... :wink:
You are it.
Where, other than your own mind, can any rendering of any god or ideal exist?
God can only exist in your own mind.
Have a look at the creation of man painting in the Vatican. It shows god laying in our minds.
Regards
DL
God is a concept, and a concept is an idea. Ideas are not physically existent, nor do they exist in another realm or another plane of existence; they exist only inside our own creative minds, to suit our own needs.
Regards
DL
Seeing as most theists consider God to be omniscient, omnipotent, and perfectly good, technology is not technically any of these things and does not have the ability to be so.
I would make the argument that humans have always been good at creating, just in different ways and it has evolved through different vessels.
I would also argue that God would not create beings that could match His all-powerfulness. Attached to this, given the God that theists think of, He would not create humans able to recreate the abilities of God Himself. Also, it is not God's only role in the universe simply to create; His beings have abilities of their own, and I'm not sure that the things you argue technology creates have their own abilities.
Are you arguing that because some of the technology that has been created has the potential to create on its own, we are imitating God? Imitating or attempting to imitate is much different than “creating god on earth.”
Perhaps a more detailed definition of “creating” is necessary in this context. I also think very specific examples are necessary in a vague argument such as this if you are wanting to have a strong solution. What kind of technology are you specifically talking about?
There needs to be an exact personified example of what you are alluding to in order for it to be a stronger argument.
There would need to be a build up of an argument to how we were put on this earth to create a “technological god.” What is the definition of God that you are using here?