The Knowledge Explosion
This article will argue that the "more is better" relationship with knowledge, which is the foundation of science and our modern civilization, is simplistic, outdated, and increasingly dangerous.
Let's start with a quick analogy which can provide a glimpse of where we're headed.
Our Evolving Relationship With Food
For most of our history humans have lived near the edge of starvation much of the time. In this scarcity context a "more is better" relationship with food was entirely reasonable. In our time food is plentiful and readily available in much of the world, and where that's true more people die of obesity-related diseases than die of starvation.
The point here is that a "more is better" relationship with food which was entirely rational for a very long time in an era of food scarcity became outdated and dangerous when transported to a different era characterized by a food explosion. We lucky moderns are required to replace the simplistic "more is better" food paradigm from the earlier era with a more intelligent and sophisticated relationship which can involve somewhat complicated cost/benefit calculations.
And if we decline to adapt, he said while glancing down into his lap at those twenty pounds that didn't use to be there, well, the price tag may very well be an unwelcome trip to the emergency room or morgue.
Our Evolving Relationship With Knowledge
This is where we are in our relationship with knowledge as well. The simplistic "more is better" relationship with knowledge which served us so well for so long must now adapt to meet the challenge of the new environment which its success has created. To understand why, let's remind ourselves of some basic facts about the knowledge explosion currently underway.
First, the modern knowledge explosion obviously brings many benefits, far more than can be listed here, more than our ancestors could have even dreamed of. And although mistakes, missteps, and even epic calamities such as technology-powered global wars do occur, so far we've always managed to clean up the mess, fix the error, learn the lessons, and continue with progress. So what's the problem?
To understand the threat posed by operating from an outdated relationship with knowledge we need to examine the issue of scale. It is the vast scale of the powers emerging from the knowledge explosion that makes the historic [progress => mistakes => more progress] process that we are used to obsolete.
Here's What's Great About Nuclear Weapons
Luckily for the purposes of this article at least, nuclear weapons provide a very easily understood example of how powers of vast scale change the threat landscape by erasing the room for error. As you know, the nuclear stockpiles of the great powers will have to be managed successfully every single day forever, for as long as those weapons exist.
The key thing to note here is that as far as the future of humanity goes, successfully managing such vast power most of the time is no longer sufficient. Doing a pretty good job no longer works. Making a mistake and then fixing it is no longer an option.
In the nuclear era the room for error we've always counted on in the past is erased, and one bad day is all it takes to end the possibility for further progress. This is what defines the revolutionary new situation we now find ourselves in, a situation which demands perfection from us.
And Now The Bad News
It would be a mistake to read this article as an act of nuclear weapons activism. I've referenced nuclear weapons here only because they are an easily accessed illustration of the far larger problem which is their source: our outdated relationship with knowledge.
If nuclear weapons were to all be abducted by aliens, the underlying "more is better" knowledge development process which created the nuclear threat would continue to create more vast powers with the potential for crashing civilization.
Each emerging power of such vast scale will have to be successfully managed every single day forever because a single mistake with a single such power a single time is sufficient to crash the system and prevent the opportunity for renewal.
And Now The Really Bad News
A key fact of the knowledge explosion is that it feeds back upon itself creating an ever accelerating unfolding of new knowledge, and thus new powers. So not only will emerging powers be larger than what we could produce in the past, and not only will there be more such vast powers than currently, but they will arrive on the scene at an ever faster pace.
Ever more, ever larger powers, delivered at an ever faster pace.
Each of these accelerating factors (scale, number, and speed) needs to be graphed against the glacial pace of human maturity development.
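A toy sketch may make the mismatch concrete: if capability compounds on itself while maturity grows roughly linearly, the gap between them widens without bound. The growth rates below are invented purely for illustration; nothing here is a measured quantity.

```python
# Hypothetical illustration: compounding capability vs. linear maturity.
# Both starting values and both rates are invented for the sketch.
capability = 1.0   # arbitrary units of technological power
maturity = 1.0     # arbitrary units of collective judgment

for generation in range(10):
    capability *= 1.5   # knowledge feeds back on itself: compound growth
    maturity += 0.1     # human maturity creeps along roughly linearly

ratio = capability / maturity
print(f"capability={capability:.1f}, maturity={maturity:.1f}, ratio={ratio:.1f}")
```

Whatever the true rates, any compounding process eventually outruns any linear one; the sketch only makes that arithmetic visible.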
And They Said It Couldn't Get Worse
Yep, you guessed it, the most challenging factor is us.
There is actually nothing about thousands of years of human history which suggests that we are capable of the consistently perfect management which powers of vast scale require.
We've been able to survive repeated episodes of murderous insanity and other such mistakes in the past only because the powers available to us were limited. For example, we threw conventional explosives at each other with wild abandon in WWII, and were saved from total destruction only because conventional explosives simply aren't powerful enough to crash civilization.
A simplistic "more is better" relationship with knowledge contains within itself the assumption that human beings will be able to successfully manage any amount of power which emerges from that process.
Simple common sense reveals this assumption to be a wishful thinking fantasy. To those who have children this should be obvious. We sensibly limit the powers available to kids out of the realistic understanding that their ability to manage power is limited.
But then we assume that when children become adults they somehow magically acquire the ability to successfully manage any amount of power that the knowledge explosion may deliver. The irrationality of this assumption is proven beyond doubt by the thousands of hair-trigger hydrogen bombs we adults have aimed down our own throats, a stark reality we rarely find interesting enough to comment upon.
A Problem Of Respect
A great irony of our time is that while we typically compete with each other to see who is the greatest supporter of science, we don't actually respect the awesome power of knowledge. What we respect instead is ourselves, our ability to develop knowledge.
What we seem not to grasp in our self-flattering immaturity is that knowledge is not our personal property but rather a force of nature which must be respected, managed, controlled, limited, just like water and electricity, and perhaps even more so.
This necessity spells the end of the simplistic "more is better" relationship with knowledge which has defined our past. The extraordinary success of the knowledge explosion has created a revolutionary new environment which we must adapt to. And as is true for all species in all environments, a failure to adapt typically results in a predictable outcome.
Comments
I think this is a good, provocative read, though it might benefit from some refining. Regarding the questions you raised on The Philosophers' Cocoon, whether the thesis is true and whether it is being discussed: in the sense in which (1) the thesis is true (or plausible, at any rate), it is being discussed; and in the sense in which (2) it is not being discussed, it is not true.
As far as (1), your thesis is about knowledge but your discussion seems to be about technology in particular. Limited to the progress of technology, it seems plausible to say that there is a legitimate concern about whether technology is advancing at too fast a pace for humans to survive. I think this is a well-worn issue in philosophy and outside of it. I'm no expert, but in contemporary discussions one likely finds this sort of concern being debated in applied/medical ethics and in STS fields. The earliest example I can think of being raised in the history of philosophy is Plato's critique of writing (a technology) in 'Phaedrus'.
As for (2) if you don't intend to limit your point to technology, or applied knowledge, then I don't think your claim is being discussed (though I might be wrong). But it doesn't seem plausible to me. Consider whether you think that learning more about mathematics is dangerous. Or, learning more about the oceans. Or how non-human animals experience the world. Or the universe. Or physics. There are any number of fields where discovery and the progression of knowledge is valuable that seem to me to undermine your concern.
If you don't intend to limit your point to technology, you might want to refine it along the following lines. Any knowledge that can have practical applications is dangerous, and to think that more is better is wrong.
First, yes, of course the piece above can be improved. I see it as a kind of first draft which I am submitting to the group mind for feedback.
However, to argue against what I just said, it seems to me the issue itself is more important than the form of presentation. For example, if I noticed your house was on fire, the important thing would be that I said something, not how exactly I said it.
Quoting Doug
Apologies, I don't quite understand you here. If your time permits could you try again?
Yes, I agree the concern I'm expressing has been addressed in regards to particular technologies, for example, genetic engineering.
What I'm not seeing (perhaps because I don't know where to look) is a broader discussion of our relationship with knowledge itself. It seems to me the underlying problem is that we're failing to adapt our "more is better" relationship with knowledge to meet the new environment created by the success of that paradigm. As I see it, we're assuming without much questioning that what has always worked well in the past will continue to work for us in the future, and I don't believe that to be true.
The entire system is only as strong as the weakest link, and human maturity is a very sketchy business indeed. As I noted above, the vast scale of the powers being developed would seem to require greatly enhanced judgment and maturity from us, and it doesn't seem that we can evolve as fast as knowledge and technology can.
Quoting Doug
I do have another line of discussion regarding our relationship with knowledge that is probably best discussed in a religion-flavored conversation, and I'm not introducing that here so as not to muddy the waters. The first post opens a big enough can of worms for one thread.
Quoting Doug
I would argue the following. It's indisputable that the knowledge explosion is delivering too many benefits to begin to list. I agree with this entirely. However, none of that matters if we crash civilization, because then all those benefits will be swept away.
And there is a very real possibility that such a crash will happen, given that the machinery for that unhappy day is already in place, ready to go at the push of a button. Or the next time somebody screws up. In my opinion, the appropriate context for this discussion would be that state of mind we would bring if someone had a gun to our head, because that is literally true.
Finally, and I apologize for this, but I've just come from spending every day for months on a prominent group philosophy blog that publishes daily, where in two years nuclear weapons have been mentioned only once, only briefly, and only after much hounding from me. It's upon that experience and many other similar ones that I'm questioning whether this subject is being adequately addressed by intellectual elites.
Enough from here. Thanks again for engaging and your further comments are most appreciated should your time permit.
Doug's sentence above seems a good summary of the objections many or most people would have to the opening post, so let's focus on this a bit.
Everyone understands that the knowledge explosion has brought many benefits, and some problems too. We view the history, decide that the benefits outweigh the problems, and thus conclude the knowledge explosion should continue as it has over the last 500 years. This is a culture-wide assumption that is generally taken to be an obvious given, thus the assumption typically doesn't receive much attention.
This cost/benefit analysis was entirely reasonable and rational in the past, prior to the emergence of vast powers with the ability to crash the system. In the new reality we've been living in since, say, the 1950s, the old cost/benefit analysis falls apart; it is made obsolete. In today's world it doesn't matter if the benefits outweigh the costs 1000 to 1 if the cost is the crashing of modern civilization, because such a crash would erase all the benefits.
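The claim can be sketched numerically. In the toy model below, per-period benefits dwarf per-period costs, yet a small per-period chance of a terminal crash that erases everything means most long runs end with nothing. Every number here is invented for illustration; the point is only the structure of the argument.

```python
# Hypothetical sketch: ordinary cost/benefit reasoning vs. a terminal crash.
# Benefits outweigh costs 1000 to 1 per period, but a crash zeroes it all.
import random

random.seed(0)

def run(periods, benefit_per_period=1000, cost_per_period=1, p_crash=0.01):
    total = 0
    for _ in range(periods):
        if random.random() < p_crash:
            return 0  # a single crash erases all accumulated benefits
        total += benefit_per_period - cost_per_period
    return total

outcomes = [run(500) for _ in range(10_000)]
survived = sum(1 for o in outcomes if o > 0)
print(f"runs ending with anything to show: {survived / len(outcomes):.1%}")
```

With a 1% crash chance per period, fewer than one run in a hundred keeps its winnings over 500 periods, no matter how lopsided the per-period ratio is.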
In the past, the cost of the knowledge explosion was that various problems would arise that would then have to be understood, fixed and cleaned up. And then progress would continue. This was a reasonable formula because progress always did continue in spite of various problems which arose.
Today, the potential cost of the knowledge explosion includes the end of modern civilization, the end of the knowledge explosion, the end of progress, and the end of an ability to fix our mistakes.
The argument being presented here is that we are attempting to operate from ancient "more is better" assumptions that were long entirely rational, in a new environment that is radically different. Our philosophy is not keeping up with our technology.
The argument is that this failure of our philosophy to adapt to new conditions is the central issue facing modern culture, and that generally speaking intellectual elites are failing to give this situation adequate focus, seeing it instead as one of a thousand issues that might be examined.
I'm pleased to engage with you on this. I didn't intend my initial post to be an objection to your view (and certainly not to the presentation of it here, which I think is written quite nicely). Rather, I was sharing one academic philosopher's take on whether the issue is being discussed in philosophy scholarship as well as what I took to be the plausibility of the view.
To clarify my point: I think we can distinguish between knowledge and technology. (Within technology, I also think we can distinguish between technological knowledge and the application of technological knowledge. For instance, we may discover how to make a nuclear bomb, but to apply this knowledge--and actually make one--is something different. Yet, I would not deny that if we have the technological knowledge, someone will apply it, or try to. I think this is at least part of your insight. And I think we see with, for instance, AI, this is the case.) Knowledge is a broader category than technology, which seems to be a species of knowledge. It seems to me that your view is strongest when applied to technology. But that there are other species of knowledge that don't seem so obviously problematic in the way you suggest. So, it would be interesting to see if you could extend your view to other species of knowledge. For instance, mathematical knowledge. It doesn't seem to me that learning the next number in pi is problematic in the way that nuclear technology is. But without proving that all species of knowledge endanger us or without limiting the type(s) of knowledge you have in mind, your argument is not as convincing as it could be.
Moreover, given the dangers we face in the world today, it seems that knowledge is our best way out of some of them. For instance, while nuclear technology is troubling, perhaps the more serious problem humanity faces is climate change. In many respects, it seems the best and only way to save ourselves is with knowledge. People who deny the human contribution to climate change need to come to know that we are playing a big role in the problem before they will be ready to change. Alternatively, presumably those who do deny the human role should at least think it is important to figure out what is causing it. Moreover, we need to learn how we can stop it-- what are the most effective ways to slow it? Some people also think that technology will help us by e.g., coming up with a technique to remove excess carbon.
If this is right, then even if some kinds of knowledge are dangerous, others might really help us out on the global scale. So, we need to determine the likelihood of nuclear war and climate catastrophe. (But doing this requires knowledge.)
Another form of knowledge that would help would be increased moral knowledge. Along with this, if we had more and better knowledge about how to educate people morally, then we'd be in better shape. Again, this might be the only thing that can obviate the dangers certain technologies pose. One might deny that moral knowledge is possible or that we can learn better ways to make people morally better, but these are arguments that need to be made.
At any rate, I think your view is worthwhile and would be a welcome addition to philosophical discussions. In case you don't know about it https://philpapers.org/ is a great place to see if there are any other thinkers who are raising similar concerns.
I'm sure some people would doubt this, and to really prove to them you are right, it would be helpful to cite some studies that demonstrate how adults continue to have problems managing the amount of power they have. You may consider reading Dan Ariely's book Predictably Irrational. It looks at how human psychology often encourages irrational decisions in ways we don't often recognize. Some of his insights could be relevant to the argument you are making.
One final point: you use arguments by analogy several times in this post. This isn't bad, but it can be dangerous as there is a fallacy called "false analogy" that is easy to fall into if you're not careful. You might consider reading this post to learn about the fallacy and this article to learn how to avoid it.
I agree with the objections raised by other posters. The quoted part is a generalisation that you don't really provide an argument for. That some knowledge enables powerful technologies which hold risks that we may not be able to manage doesn't mean that all or even most do. Therefore I also don't see how it follows that we would need to change our overall attitude to knowledge.
It would seem to be enough that we try to identify which knowledge has the potential for this kind of 'powers', and change our attitude to those. Of course you could then argue that it might not be possible to identify them in advance, etc., but still, this case has to be made I think for your overall argument to work. Or maybe you'd just say that changing our attitude to some knowledge is already changing our overall simplistic attitude... OK, fine, I think I would agree with that.
I do think, however, that there is an even more fundamental problem here than the mere realisation that more knowledge is not always better. Research is funded by countries, and countries are vying for control and economic gain. I think at least some of the people involved know there are risks, but choose to ignore them because they can't count on other countries not going ahead with it.
Take for instance AI. China, the USA, and also the EU, although lagging behind, all invest enormous amounts of money in AI-research. They know there are potential risks, but they also know that the ones leading the race will have an enormous economic advantage over the rest. The point being here, that it's not their attitude towards knowledge that is driving their research policies.
Quoting Doug
Doug, I hope you will feel free to comment in any direction your reasoning takes you. I welcome objections and challenges of all kinds. I'm very enthusiastic, but not delicate. I'm posting to receive assistance in uncovering weaknesses in these ideas. And I would very much welcome an introduction to any intellectual elites or others who are addressing these topics.
Quoting Doug
Tell me if this helps.
I'm not against knowledge, but am instead arguing for the development of a more mature understanding of our relationship with knowledge and power. I'm really arguing for more knowledge in a particular direction.
The food analogy I referred to in my opening post might help. Obviously food is not bad, but essential. But a "more is better" relationship with food no longer works in an era of food abundance. And so we have to make more sophisticated decisions about what to eat, how much to eat, when to eat etc. Note that this inevitably involves saying no to some food we might like to consume.
If we apply the food analogy to knowledge, we might define the challenge as:
1) how do we understand what knowledge to say no to, and...
2) how do we create a consensus on that decision?
This is a difficult business indeed, because while we divide knowledge into tidy categories within our minds, in the real world everything is connected to everything else. For example, while mathematical knowledge seems harmless when considered by itself, it is in part mathematical knowledge which makes nuclear weapons possible.
Thanks for the link to philpapers. I'm just in the process of discovering it and it does seem the kind of resource I'm asking for. Thanks again for your contributions here.
Is the premise here of a 'knowledge explosion' valid? I think not. The equation of knowledge and food is also unsound, as food is physical and has physical consequence whereas knowledge is non-physical.
To assert there has been an increase in knowledge over time would presume that old knowledge is preserved whilst new knowledge is added. This does not make sense. Technology reduces the need for knowledge, and arguably knowledge decreases with technology. 200 years ago the average man had the knowledge to construct his own dwelling, had to travel without maps, had to cure his ills with herbs, grow his own food, and quite often make his own music and entertainment, etc.
Today the average man has most of these tasks accomplished for him without his having to think about them.
Technology and the loss of cultural knowledge through globalisation, might easily argue for an intellectual contraction rather than a knowledge explosion.
Quoting Nathan
First, don't buy into the ideas; kick the tires as hard as you can. Members would actually be doing me a favor if they could liberate me from this line of thought, as it's become an all-consuming obsession, which isn't such a great plan if the ideas are incurably flawed.
Next, I hear you about empirical data and studies, but as you know, I'm not the most qualified person to produce that. This issue is way too big for any one person, so I'm hoping that by engaging philosophers, scientists, and other intellectual elites we can bring much greater firepower to bear on the issue than any of us could ever provide on our own. So given that you are an academic yourself, I would bounce this ball back into your court and hope that you might use your connections to engage some of your highly educated peers on this issue, either here, on your blog, or wherever they are willing to engage.
I'll read the links you shared regarding analogy problems, thanks. I do struggle trying to find the best way to express these ideas.
Quoting ChatteringMonkey
To keep things tidy, you were referring to this...
Quoting Jake
I could likely use help in making the argument clearer. Here's another try.
Let's consider the relationship between conventional explosives and nuclear bombs.
NORMAL SCALE: With conventional explosives we can make big mistakes (e.g. WWII) and then clean up the mess, learn from the mistake, and continue with progress. [progress => mistakes => more progress]
VAST SCALE: With nuclear weapons the "clean up the mess", "learn from the mistake" and "continue with progress" parts of the process are removed, at least in the case of war between the major powers. With powers of vast scale the formula is: [perfect management OR death].
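The [perfect management OR death] formula can be given a rough quantitative shape: even a tiny per-day probability of catastrophic failure compounds over time, so "pretty good" management guarantees eventual failure over a long enough horizon. The per-day probability below is purely hypothetical.

```python
# Hypothetical: probability of getting through N days with no catastrophic
# failure, given a small, constant per-day failure probability p.
def survival_probability(p_daily_failure, days):
    return (1 - p_daily_failure) ** days

# With a one-in-a-million chance of catastrophe per day...
p = 1e-6
for years in (10, 100, 1000):
    days = years * 365
    print(f"{years:>5} years: P(no catastrophe) = {survival_probability(p, days):.4f}")
```

However small p is, the product shrinks toward zero as the horizon lengthens; only p = 0 (perfect management) escapes that arithmetic.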
Quoting ChatteringMonkey
Yes, this is a huge problem, agreed. As humanity is currently organized, in competitive groupings so often led by power-hungry psychopaths, what I'm describing is beyond challenging.
I do doubt that this status quo can be substantially edited through the processes of reason. However, there is another possibility: pain. For example, consider Europe. Even though Europe is the home of Western rationality, Europeans still conducted ceaseless insane war upon each other for centuries. But then the pain of warfare became too great in WWII, and now they are united in peace like never before.
It seems inevitable to me that sooner or later somebody is going to set off a nuke in a big city somewhere in the world. That will be a revolutionary historic event which will bring with it the possibility of revolutionary historic change.
Quoting ChatteringMonkey
Another good point. Yes, it's their relationship with power, which is what drives our relationship with knowledge. We usually don't pursue knowledge just for itself, but for the power it contains. I like this way of looking at it, as you're helping us dig deeper into the phenomenon. It might be useful to rephrase the question as our "more is better" relationship with power.
There are no absolutes in nature. Outside of nature ITSELF there are no perfections, and as we humans exist within nature, imperfect management of all managed systems is the norm. Failure in the management of nuclear weapons occurs constantly, thankfully so far without vast consequence. The most recent failure in nuclear weapons management was the election of Donald Trump and his tweets and comments about the size of his nuclear 'button'. The election of a moron to a position of authority over the US nuclear arsenal is an example of imperfect management.
Management systems within nature require such imperfections if they are to evolve with nature in her totality.
Your notion of death is equally problematic; the death of the entire human species is not a definite outcome of nuclear war. Human existence would not have been possible without the extinction of the dinosaurs.
Human 'knowledge' does not become more of a threat because it is expanding; that is an oxymoronic suggestion. If human knowledge were truly expanding, then by definition the threat to human existence (caused by human beings) would be decreasing rather than increasing.
The increasing threats posed to humanity, by humanity itself, should be evidence enough to prove that knowledge is not currently expanding; it is contracting. Knowledge is changing, but it is not increasing or expanding; it is instead diminishing with our dependence upon technology. This dependence removes us from nature and makes us less knowledgeable of nature, and less aware of our effect upon the natural systems that sustain us. Our contracting knowledge in respect of nature renders ecological collapse an inevitability.
Technology and capitalism shield mankind from knowledge of the ecological and humanitarian effects of the transaction and/or the consumptive act. They accomplish this by destroying knowledge, or engineering its contraction, through an encouraged and engineered dependence upon technology.
Capitalism is entirely dependent upon a possible 'knowledge expansion' of the few (the capitalist owners of technology), and the contraction of knowledge within the majority of humans who make up 'the market'. However, once again, the knowledge expansion of capitalist technocrats depends on their willingness to sacrifice philosophical knowledge (the knowledge of consequence, for example).
If the human subject is to be wooed into the consumptive act, particularly if he or she is to be manipulated into buying a product that is unnecessary, he or she must be rendered LESS knowledgeable. Either he must become unable to make the product himself, or he must lose the knowledge that allowed him to survive without the product, and he or she must lose the moral knowledge of the consequences of the consumptive act. Capitalism and the market are entirely dependent upon knowledge contraction.
I'll list my objections to the argument in a more organised manner:
1. I don't think the analogy with food works. With food, our relationship to it is relevant, because when we eat too much it affects our health. With knowledge, however, our relationship to it doesn't really matter, because we, as personal actors, don't produce the knowledge that gives rise to the kind of risks we are talking about. It's only state-funded research that does that. So what matters is the way in which research policies are determined, and I would argue that our relationship to knowledge only marginally influences that at best.
2. I don't think you have justified the generalisation from one or a few examples to all of knowledge. I don't disagree with the examples you gave, but as of yet I don't see reasons to conclude that this is necessarily the case for all knowledge, or even most knowledge. This argument needs to be made, unless you are settling for the less general claim that only some knowledge holds dangers that we should take into account.
3. If you are settling for the less general claim, then I don't think this is that controversial. Most funding agencies and research organisations already have advisory and ethical boards that are supposed to look into these issues. So the idea that science also entails risks is already incorporated into the current process of funding and doing research. What still might be a problem, though, is that ultimately these considerations may not be given enough weight by those deciding the research policies, because they deem other things more important, i.e. power and economics. But then the problem is not one of an outdated view on knowledge, but rather a problem of valuation (i.e. they value power and the economy so much that they are willing to ignore the risks).
Thank you ChatteringMonkey, good plan. I've been meaning to break the thesis up into a series of concise assertions so that readers can more easily tell us where they feel problems exist.
Quoting ChatteringMonkey
Ok, but in democracies at least, we are the state; it's our money and our votes which drive the system. Each of us individually has little impact, but as a group we decide these questions. If any major changes were to be deployed by governments they would require buy-in from the public. Even in dictatorships, the government's room for maneuver is still limited to some degree by what it can get the population to accept.
Quoting ChatteringMonkey
I would surely agree that some forms of new knowledge are more dangerous than others. I'm not arguing we stop learning or go back to the 8th century, so the discriminating calculations you seem to be suggesting are appropriate.
I am arguing against the notion that we should push forward on all fronts as fast as we can, i.e. the "more is better" relationship with knowledge.
The situation is admittedly very complicated because of the way one seemingly harmless technology can empower other more dangerous tools. Computers might be an example of this?
Quoting ChatteringMonkey
Yes, agreed, but... Can you cite cases where the science community as a whole has agreed to not learn something? There may be some, we could talk about that.
For example, genetic engineering is rapidly becoming cheaper and easier. What process is going to stop your next door neighbor from someday creating new life forms in his garage? Sure, they will pass laws, but when have laws worked to the degree necessary in cases like this? If legit responsible scientists learn how to do XYZ it's only a matter of time until bad actors, or just stupid careless people, acquire that same knowledge.
You are raising good challenges, thanks for that, keep them coming if you can.
Yes, agreed, not a likely outcome either, imho. My concerns refer to the collapse of modern civilization. Some would surely survive a nuclear war, for example, but would probably wish they hadn't.
I don't really agree with this, at least in part. In theory democracy is supposed to work this way, but in practice that doesn't seem to be the way it plays out. Generally, I would say, people don't really have well-thought-out ideas about most issues, including what our policies about research should be. I mean, it's a bit of a complex topic to do justice to here in this thread, but I think it's more the other way around: politicians and policymakers decide, and then convince the public to adopt their views.
Quoting Jake
In my country there is a policy that prevents funding research with direct military applications. When a more right-wing government wanted to abolish this policy, the science community collectively (or at least a large part of it) opposed the proposal. But I don't think it's very likely they will ever agree to voluntarily not learn something (unless maybe it's something obviously evil). There is always a struggle to get funds in the science community, and they will take what they can get, pretty much.
If you want to prevent certain research, I think you've got to implement restrictions at the level of funding. And one of the more general restrictions typically is that an application to fund a certain research project also has to pass an ethical board.
Yes, indeed, and that includes me as well. Thus, this thread. The goal here is to generate as much discussion on the topic in as many places as possible from all points of view. I don't propose that this will solve the problem, but it's better than nothing.
Quoting ChatteringMonkey
That's surely a reasonable perspective, though I do have personal experience of being part of a small group of about 50 average, everyday citizens who changed a major state law, so that can happen. Another example: the current resident of the U.S. White House. The overwhelming majority of political elites on all sides didn't want him to get the job, but the little people decided otherwise (with a minority of the vote).
All that said, your basic point seems sound. Most of the time most of us don't decide things via reason, but by reference to authority. So for example, even if every word I've typed in this thread were to be proven exactly 100% true :-) that wouldn't accomplish much as I have no authority, nor any talent for acquiring it. Point being, pretty close to nobody is listening to me.
That's a key reason why I hope this thread can attract intellectual elites of various types. Not only does this subject need their intelligence, advanced education, and talent for communicating, it needs the authority they have accumulated.
So my plea to all readers would be: if you have any connections among intellectual elites of any flavor, please consider inviting them into this thread, or into similar conversations on your own websites. Any place you can find or generate such conversations is good.
Do you want to prevent certain research? A question to one and all...
Quoting ChatteringMonkey
That's interesting. If you feel you can share the name of your country, please do.
Generally no, I don't think so, mostly for practical reasons. But I would accept exceptions if there are really good arguments to do so.
The principal reason is that I don't think the knowledge itself is inherently dangerous; it's the technological applications that can be. And in practice there is usually a serious gap between knowledge being acquired and technology being developed. It's difficult to say that a certain piece of knowledge will eventually be used to develop (dangerous) technologies. Given that level of uncertainty, it seems difficult to justify the wide restrictions to knowledge that would be needed. Note that I'm only talking about theoretical knowledge here; I have fewer problems with restrictions on developing technologies with obvious, enormous risks.
But maybe the biggest problem I have with trying to prevent research is that I don't think it will work. There is no world government. Even if only one country acquires the knowledge, the cat is already out of the bag, and the likelihood of all countries reaching an agreement to prevent certain research seems very, very small. It's a kind of prisoner's dilemma: the best thing maybe would be for all to refrain from a certain line of research (nobody loses), but since you can't count on that, it's better to also do the research (otherwise you lose twice, by not having the benefit of the research while the dangers of the research are there anyway).
Ok, but if the knowledge exists and offers some ability to manipulate our environment, isn't somebody going to turn that knowledge into a technological application? Is there really a dividing line between knowledge and technology in the real world?
Consider atomic research. If I understand correctly, that field began simply as curiosity about how nature works. As the understandings of the atom developed, somebody down the line realized that if the atom could be split that would release enormous energy. And then at some later point The Manhattan Project figured out how to do that, and nuclear weapons and nuclear energy were born.
I just finished watching a documentary on Netflix called The Nuclear Option. It discusses new reactor designs which may be much safer.
https://www.netflix.com/title/80991265
Let's imagine that this works, and that nuclear energy becomes affordable and safe. Assuming this we could then ask...
Was learning how to split the atom worth it?
POSITIVE: Clean safe energy, a major contribution to fighting climate change.
NEGATIVE: The global economy prospers, making more humans, consuming more finite resources, destroying more habitat, accelerating species extinction, etc.
NEGATIVE: All benefits of science can be erased at the push of a button.
Complicated, eh?
Quoting ChatteringMonkey
But will *not* preventing some research work? Don't we have to ask this too?
Let's recall the Peter Principle, which suggests that people will tend to be promoted up the chain until they finally reach a job they can't do. Isn't civilization in about that position? If we can't or won't limit knowledge, doesn't that mean that we will keep receiving more and more power until we finally get a power that we can't manage?
Hasn't that already happened?
If I walked around with a loaded gun in my mouth all day everyday would you say that I am successfully managing my firearm just because it hasn't gone off yet? Aren't nuclear weapons a loaded gun in the mouth of modern civilization?
I propose that my concerns are not futuristic speculation, but a pretty accurate description of the current reality.
This is a great discussion, thank you, and please imagine me bowing deeply in your direction.
I do think there is a gap, certainly at the moment of acquisition of the theoretical knowledge. Governments are constantly trying to find ways to close that gap (to justify the money spent on research), because theoretical knowledge doesn't directly translate into economic value. For that you need some technology that can be marketed.
It costs lots of money to develop technologies, and that cost generally only increases the more advanced the technology is. The initial development of the A-bomb, for example, cost billions of dollars.
Of course, once that initial development is done, the cost of reproducing the already-developed technology can be reduced, but I'd think it would still be quite the barrier for the average Joe. To build an A-bomb, for instance, I'd guess you need infrastructure that almost nobody can finance on his own.
Would atomic research have been worth it if A-bombs destroy the world? Obviously no, but that is hindsight, with perfect information. At the moment of the atomic research we didn't have that information.
These are certainly reasonable questions, and I agree that there are some serious issues, but I don't think we have all that much control over the direction we are heading. The only way is forward, it seems to me. Technologies will possibly bring new risks, but possibly also new solutions and ways to manage those risks.
And I mean, I certainly don't pretend to have all the answers here, so I agree that more attention for this would be a good thing.
I also enjoyed the discussion, thank you sir :-).
This seems a pretty good summary of the group consensus, generally speaking. It's this group consensus that I'm attempting to inspect and challenge.
Do we have control over the direction we are heading?
A reasonable argument can be made that knowledge is a force of nature that will take us wherever it will. That's a very real possibility that I recognize, but it doesn't seem to be in human nature to be defeatist and just wait around for the end to come. We've very confidently tackled and successfully managed many forces of nature, so why not this too?
I would agree that it does seem quite unlikely that we will calmly reason our way to a solution, but we need to also factor in our response to calamity and pain. The group consensus you've articulated exists today because we still think we can get away with giving ourselves more and more and more power without limit. Historic events may challenge that assumption in a profound manner.
Quoting ChatteringMonkey
Well ok, but um, blindly repeating outdated assumptions is not really forward movement, but rather a clinging to the past.
Quoting ChatteringMonkey
And this will work most of the time, but with powers of vast scale, that's no longer good enough.
Again, this isn't futuristic speculation, it's fully true right now. As we speak, we have to successfully manage nuclear weapons every single day, and just one bad day is all it takes to bring the whole system crashing down. As we build ever more technologies of ever larger scale at an ever faster pace, this reality will become ever more true. That's what I'm asking readers to face, the path we are currently on is unsustainable, it's a formula for disaster.
Like you, I don't pretend to have all the answers. The only "answer" I can suggest is that we face these inconvenient questions without blinking, and raise their profile to the degree possible.
IF WRONG: If the thesis of this thread can be defeated, it should be, because we don't want to go around alarming people for no reason.
IF RIGHT: If the thesis of this thread can not be defeated, if it is generally found to be true, it really deserves our full attention. It's not going to accomplish anything to build many amazing new tools if they're all just going to be swept away in a coming crash....
Which could literally happen at any moment.
Maybe that's defeatist, but I do think there are good reasons for believing this. Look at what has happened with the climate change issue. We have known for a long time now that there is a serious problem of man-made climate change; a large majority of scientists agree with this. And still we don't manage to reach agreements on policies that sufficiently address the issue.
Now, the thesis in your opening post, while it may have its merits, deals only with possibilities, not certainties. How should one expect governments to react to this, considering their reaction to climate change?
On a more positive note, I do think there's a good chance that this will get more attention and will be addressed eventually. I just don't think the time is now, given the urgency of some of the other issues that need to be dealt with.
Here's a link to a philosopher who deals exclusively with existential risk; it might be of interest to you and inform the discussion some more:
https://nickbostrom.com/existential/risks.html
I can agree with this. My best guess is that little to nothing will be done about it until some epic calamity forces us to face the issue. Or maybe we will never face it, and just race blindly over the cliff.
That said, there is something you and I can do about it, and we are doing it together right now. We aren't in a position to be personally decisive on such a huge historic issue, but we are in a position to expand conversations like this. What is the point of philosophy if it helps us see such threats, and then we do nothing about them?
I'd suggest there could be two purposes for this thread.
1) Help people decide whether they think the thesis is generally correct, or not.
2) For those who vote yes, help organize a constructive response. This could be something as simple as inviting more folks into this thread, for example.
Quoting ChatteringMonkey
To bat the ball back over the net, I would argue that thousands of hydrogen bombs poised to erase everything accomplished over the last 500 years is not a possibility, but a well documented real world fact beyond dispute. Again, what I'm describing is not futuristic speculation, but a current reality.
I've had this conversation many times, and it always arrives at about the same place. Intelligent people such as yourself will examine and test the thesis, and often conclude it has some merit. But then they are faced with the fact that intellectual elites and other experts are typically talking about almost everything else in the world except this. This collision between one's own reasoning and the world around us can be disconcerting, disorienting.
I'm basically asking readers to face that we are in a bus careening down a steep mountain road, and there is no bus driver. This is not a vision folks are eager to accept, for very understandable human reasons.
To me, even if we can't do anything about this, it's a fascinating philosophical experience. Who should you believe? Your own reason, or a culture wide group consensus and the experts?
Thank you for the link to Bostrom. I did try to engage with him some time ago but didn't succeed. Perhaps this would be a good time to try again, good idea, thanks.
I'm not claiming I know a lot about the Amish, or that the Amish are all the same, or that we should all become Amish, but...
They do seem an example of a group of folks who have thought about such things in their own way and bowed out of the knowledge explosion to some degree. They've chosen to keep some of what the knowledge explosion has produced, while declining other aspects. That is, they aren't prisoners of the simplistic "more is better" relationship with knowledge, but have crafted a more nuanced relationship.
At the least the Amish seem a real world example that such choices are available, and don't automatically lead to catastrophe.
https://www.ie.edu/cgc/news-events/events/cfp-resistance-to-innovation/
The description of that project concludes....
If I understand correctly (I may not), the project assumes that resistance to uncontrolled technological progress is automatically invalid, and thus that scholars should seek "the best mechanisms to overcome it". Do you see that assumption too? Or am I experiencing my own bias here?
Here's an irony I find endlessly interesting.
The "more is better" relationship with knowledge is the status quo and has been for at least 500 years, right?
Those defending the existing status quo typically label themselves as progressive advocates for change, while those arguing the status quo should be replaced with something new are typically labeled as "clinging to the past".
As an example, see this line in the CFP...
So if you don't wish to blindly follow science as it marches confidently towards civilization collapse, you are like the ignorant mobs who attacked the first printing presses.
It seems to me that those arguing for the status quo are clinging to the past, and those arguing that the simplistic "more is better" status quo should be replaced with a more intelligent and sophisticated relationship with knowledge and power are the advocates for change.
We human beings are simply not emotionally and cognitively configured to manage the consequences of having powerful knowledge over the long run. We can recognize the need for long-term management (running into scores of years, into centuries, and then many thousands of years), but we are very poor at even conceiving how to put very long-term plans into effect. Religious organizations have managed to look after saints' bones and sacred texts for a little over 2000 years. That's the best we have done.
Let me take a different example: Alexander Fleming discovered penicillin in 1928, but 15 years would pass before a strain and a method were found to manufacture it in large quantities. In his Nobel Prize speech, Fleming explained that bacterial resistance to penicillin was unavoidable, but that it could be delayed by using enough penicillin for long enough.
The discovery of penicillin, streptomycin, aureomycin, and a few dozen more antibiotics was a tremendous thing--extremely valuable knowledge. Seventy-five years later we are finding that some pathogenic bacteria (causing TB, gonorrhea, and all sorts of other infections) are emerging which are pretty much resistant to all of the antibiotics.
Perhaps this would have happened anyway, sooner or later, but certain people ignored Fleming's warning. Among them: companies producing antibiotics to feed to beef, pork, and chickens to get them to grow faster; doctors who prescribed antibiotics for viral diseases (like colds and influenza) for which they were irrelevant; patients who received the proper dose and instructions for bacterial infections but only took part of the Rx, stopping as soon as symptoms went away; and countries which sell antibiotics over the counter, leading to widespread under-dosing or inappropriate use.
Our management of antibiotic knowledge has gone the way of most other powerful knowledge.
Consequently, we have wasted the long-term value of antibiotics. We are stuck with thousands of nuclear weapons and nuclear waste dumps. Virtually indestructible plastics are messing up life in the oceans, we are immersed in a soup of hormone-like chemicals; we flush tons of pharmaceuticals into rivers every day (in urine, feces, and to get rid of pills), and on and on and on.
Your antibiotic example would seem to illustrate the issue pretty clearly. It's one of a million cases of what we might call a "self-correction loop", which goes something like this:
1) Invent something
2) Things go great for a while.
3) Abuse the something.
4) Enter a calamity.
5) Learn the lessons.
6) Try again, often with better results.
This pattern has been repeated countless times, both in our personal lives and at the larger social level. So when we look to the future the group consensus says, "Sure, we'll have problems as we learn, but we'll fix the problems like we always have." What makes this assumption compelling is that it's long been true, and continues to be true for most situations today.
What's not being sufficiently taken into account is that the self-correction loop only works with powers of limited scale. What nuclear weapons teach us is that with powers of vast scale, a single failure may crash the system, thus preventing the opportunity for learning and correction.
Quoting Bitter Crank
I agree of course. What complicates this, though, and makes it harder for folks to grasp, is that we will succeed in managing many large powers. So for instance people see that we haven't had a big nuclear war, and they take that as evidence that we can manage vast powers.
What they're not taking into account is that as time passes, the number of vast powers increases, and the scale of those powers grows, so the odds are increasingly against us.
As an example, it's one thing to manage nuclear weapons for 70 years, and another to manage them successfully every single day forever. It's one thing to manage one vast power, and another to manage 23 vast powers.
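The arithmetic behind this point can be sketched. If we suppose, purely for illustration, that each vast power carries some small, independent annual probability of catastrophic failure, then the chance of getting through unscathed shrinks exponentially with both the number of powers and the number of years. The specific numbers below are invented for illustration only; they are not estimates of any real risk.

```python
# Illustrative sketch only: the failure probabilities here are invented,
# and the independence assumption is a simplification, not a risk model.

def survival_probability(p: float, n_powers: int, years: int) -> float:
    """Chance of zero catastrophic failures over the whole period,
    assuming each power independently fails with probability p per year."""
    return (1.0 - p) ** (n_powers * years)

# One vast power, a tiny annual risk, managed for 70 years:
print(survival_probability(0.001, 1, 70))    # still close to 1

# Many vast powers, the same tiny annual risk, managed "forever":
print(survival_probability(0.001, 23, 500))  # collapses toward 0
```

The point of the sketch is that even a risk small enough to be invisible in any one year becomes nearly certain to bite when multiplied across many powers and an open-ended stretch of time.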
Military planners underestimated the damage that nuclear weapons would do. Daniel Ellsberg [The Doomsday Machine: Confessions of a Nuclear War Planner] documents how these planners had calculated damage on the basis of megatons of explosive power, but had not taken into account the resulting firestorms that the hot blasts would cause. The damage becomes far greater and, in addition to radioactive fallout, there would be enough dust and soot blasted into the upper atmosphere to trigger not global warming, but global cooling. It isn't that we would enter an ice age, but 7-10 degrees of cooling would be intensely disruptive and the consequences would last for several decades. Crop failures would be very severe.
The problem of excreted medical compounds was probably unforeseen and unavoidable. But the chemical industry, like most industries, has externalized the costs and harms of production. The by-products of manufacturing the chlorofluorocarbons that made things like more efficient air conditioning, Teflon, and fire retardants possible were, for the most part, dumped into the environment. These various chemicals deplete ozone and have several negative effects in humans, such as immune system suppression.
Capitalism (in any form) pretty much requires continual expansion. This has led to maximum expansion of extractive and manufacturing industries across the board, along with continual expansion of consumption -- and then consequent waste products (sewage, garbage, spoiled environments, etc.).
Yes, the odds are against us. "Paying the piper" is way overdue, and the natural bill collectors are out and active.
Quoting Bitter Crank
Here's the trailer to a video called Countdown To Zero which documents all kinds of screw ups and threats involved with nuclear weapons. (The 90 minute full documentary is available on youtube for $3.)
https://www.youtube.com/watch?v=vWJN9cZcT64
What I'm trying to point to is that as the knowledge explosion proceeds, there will be more existential threat situations of this nature. It won't matter if we successfully manage most of these threats most of the time, because a single failure, a single time, with technologies of vast scale is sufficient to crash the system. That's what we should be learning from nuclear weapons.
So if we stick with the "more is better" relationship with knowledge paradigm we are headed for a crash, which makes much or most scientific research today largely pointless.
What's fascinating is that intellectual elites don't get this, and yet the concept is no more complicated than how we restrict powers available to children out of the recognition that their ability to manage power is limited.
1) When it comes to children everyone immediately gets that their ability is limited and thus the power they have should be as well.
2) When it comes to adults we completely ignore our own limitations and instead insist, "we need as much power as possible, more is better!"
The smartest, best-educated people in our culture don't get this. I've spent at least a decade now trying to find scientists or philosophers who do get it. There are none. There are plenty who will claim they do, but when you actually look at their work, you find they haven't written about the ideas being shared in this thread.
I've just come from spending months on a prominent group blog of many academic philosophers with PhDs. None of them are interested in any of this, and scientists are even worse.
Quoting Jake
Why do even the most intelligent and best educated people find this so hard to grasp?
We have successfully avoided the worst consequences of our natures because the reciprocal revolutions of science and industry are (on long time scales) very recent, in fact still underway. The discovery of how to unlock the power stored in the atomic nucleus is not yet a century old, and that's just one problematic discovery. The problems generated by discoveries and deployed technologies in chemistry are perhaps as problematic as nuclear knowledge. "Better living through chemistry?"*** Bah, humbug!
21st-century people are just lucky that the scientific revolution didn't happen 2000 years ago. If it had happened during the Roman Empire (everything else being equal), we would now be dealing with fully unfolded global warming, an extremely disrupted planetary ecology, and rather poor prospects. The mass die-off of human populations would be past, and we survivors would not know why. Et cetera.
IN OTHER WORDS... The human situation is tragic. We are very flawed heroes and our flaws have been, are, and will be the cause of our downfall.
An aside: One theory about why we haven't encountered intelligent beings from other star systems: Evolving beings develop intelligence and technology, and then (over a period of time) exhaust the resources of their world. By the time they reach the threshold of space exploration, they no longer have the social stability and resources to pursue it.
***The phrase "Better Living Through Chemistry" is a variant of a DuPont advertising slogan, "Better Things for Better Living...Through Chemistry." DuPont adopted it in 1935 and it was their slogan until 1982, when the "Through Chemistry" part was dropped. Since 1999, their slogan has been "The miracles of science".
You just keep ignoring these points. It's only partly true because, a) as I said, our 'relationship to knowledge' is at best only marginally driving 'the knowledge explosion' (it's more a story of economics and governments...), and b) it's not generally the case for all knowledge.
And from a policy point of view the idea that we should 'change our relationship to knowledge' is useless, because what is one supposed to do with such a general claim? Most people who are somewhat knowledgeable about the subject don't simply believe more knowledge is always better anyway...
If you really want to influence the world in some way here, you need to identify individual research that is potentially dangerous, explain why, etc... and then make concrete and realistic proposals for how to deal with it. And if you started that exercise, you'd probably find that a number of people are already doing that.
Innovation is the goal, because it increases productivity and sales... because it is necessary for growth. The primary reason governments are investing in innovation and research is that they believe it benefits the economy.
But there are indications that innovations are harder and harder to achieve... because, as the low-hanging fruit is plucked, more specialised knowledge is necessary and infrastructure gets more expensive. To maintain the same level of innovation, exponentially more investment is needed. Once governments figure out their return on investment is decreasing, they may well cut back on it of their own accord.
So here's a strategy that might help achieve your goal. Find evidence that the rate of innovation is decreasing, and convince the public and governments that it's not worth the money anymore. That'll get their attention a lot faster than vague predictions of doom.
You've got to work with the world a little to achieve something...
What is abstract about the fact that you won't give your children machine guns to play with? You see their ability is limited, so you restrict the powers available to them. This is all my thesis is, a recognition that our ability is limited, and thus the power we have must be too. Simple common sense, that's all.
Quoting ChatteringMonkey
I keep ignoring these points because they aren't relevant to the thesis. If you prefer to replace "relationship with knowledge" with "relationship with power" I don't object at all, as already stated, but that switch makes no difference, the problem remains.
Quoting ChatteringMonkey
Either disprove the thesis and then discard it, or accept that it is generally true, and then advance the inquiry on that basis.
What happens instead is, people try to disprove the thesis, and when they fail they wander off to some other discussion.
Quoting ChatteringMonkey
Yes, many people are already doing that, agreed. And while that is good, it's not enough. Such a process assumes that we can navigate our way to having more and more power at faster and faster rates, and that simply isn't true.
As an example, it seems to me that nuclear weapons are the most pressing threat. Let's imagine I had the perfect solution to that. That's great, but the knowledge explosion keeps rolling along, producing new and larger threats at an ever faster rate. Thus, if I limit my effort to this or that technology, I'm not really accomplishing anything other than delaying the inevitable crash. And of course, nobody has actually yet succeeded in getting rid of nukes, stopping or slowing genetic research, saying no to AI, or any of the things you seem to be suggesting.
At some point we're going to have to recognize that while we can manage X amount of power, we can't handle Y, and thus we have to say no to learning Y. The cultural consensus that you are expressing doesn't yet grasp that...
1) There are limits to human ability, thus...
2) There have to be limits to human power.
3) Simple!
Just exactly the same reasoning we routinely apply to children. It's the simplest thing, until someone suggests we apply this common sense to adults as well, and then everyone wants to make it as complicated as possible.
The "knowledge explosion keeps rolling on" is just a metaphor, not an actual thing happening, but a vague abstraction of different processes.
The set of all knowledge is a higher abstraction containing different subsets of specific knowledge.
Contrary to popular belief, going higher in abstraction doesn't necessarily inform you more; you lose information about the world in the process of 'abstraction'.
That's why ideology and politics spoils philosophical sophistication, because they give very general answers to complex questions... removing the motivation to ask further.
You are basically doing the same thing.
You restrict children from using machine guns, but not from using water pistols... the point being that you need to look at individual cases to determine what exactly is dangerous and what is not.
Do we agree on this, or not?
I generally agree with this Crank. We are flawed, but not entirely flawed. There is an element of sanity and reason there too. So while our downfall is likely inevitable at some point, we aren't required to race towards it blindly at full speed.
I would return a bit of focus to the Amish. They have opted out of the knowledge explosion in a manner all of us would likely consider to be extreme, and yet there they still are, not experiencing any dramatic calamity. While this is not a perfect example, it does at least suggest that the idea of slowing down is not out of the question.
Why is that principle not considered too vague or simple etc when applied to humans under the age of 18?
I agree there are many details to be considered, but before we rush off into that, let's see if we agree on where we're trying to go. Is the shared goal of this conversation to look for ways to limit the powers available to human beings?
My goal in this thread is improving upon the argument, and maybe helping you to be more effective along the way.
There are already a lot of ways people's powers are restricted, btw. I don't think it would be a good idea to start thinking about ways to restrict the power of humans with a kind of top-down approach right here. I think, if that's what you want to do, you need to engage with a particular existing tradition, and identify particular ways in which the restrictions are not sufficient.
His fictional town of Union Grove, New York is, along with the rest of the world, experiencing technological collapse. But life in this dystopian world is surprisingly wholesome, if somewhat precarious. The standard of technology has been set back to roughly 1890, about where the Amish prefer to be: no oil refineries, no cars, no planes, electricity IF you can figure out how to generate it, no antibiotics (the factories can't operate), no plastics, horse and human power only, etc. Not many factories, either; thus the "world made by hand".
Another post-apocalyptic book I like is Earth Abides, written in 1949. Techno-collapse in this novel is caused by a worldwide epidemic which kills 99,999 people out of every 100,000--not many left, and those widely scattered about. Bands of people, here and there, use such tools as they can find to grow food and meet their bare needs. The 'hero' grows old in the novel, and at the end his grandchildren have adapted to a vastly simpler life.
I like the Amish people I have met. They are, of course, quite religious and of necessity rather conservative, but they aren't naive country bumpkins. They've found sustainability, even if through the route of their old-world religious roots. They are maintaining technology we should pay attention to, because most of us are three generations from knowing how to work with a horse or oxen, raise food without RoundUp and artificial fertilizer, preserve food (canning, drying), or make cloth. (I don't know whether the Amish spin wool, flax, and cotton or tan leather, but those are skills that would be critical without polyester, rayon, and nylon.)
I'll rephrase then...
1) Should it be a goal of society to look for ways to limit the powers available to human beings?
2) Or, should we accept the group consensus which assumes we should learn as much as possible, thus giving ourselves as much power as possible?
I agree with your "thinking well" goal and am trying to facilitate that by focusing the question. In the end we are going to try to limit the knowledge and powers available to us, or we're not. Which do you prefer?
I do agree that my phrase "our relationship with knowledge" will be perceived as too abstract by many readers. I think that's a good point.
I'm just one person. My abilities are obviously limited too. The solution I suggest is that lots of people write on this topic from their own perspectives, using their own preferred language etc. That's my goal, to help stir up such conversations.
My best friend online is actually a former Amish who left that community as a teen. He's now a web coding expert about to publish his 1,000th ezine issue on those topics. He's a great guy, and we regularly joke about what the rear ends of horses look like as you're plowing the field etc. :smile:
No see, it's not black or white.
We are limiting the powers available to human beings. The president of the US is the only one who can push the button for a nuclear strike. They did that precisely because, after WWII, they knew the A-bomb was not something to be used lightly, and they didn't trust the military with it (whether the president is a better guarantee is another question).
The question is not whether powers should be limited, but rather whether the current limitations to powers are sufficient. And if not, in what ways? I think they are insufficient in a number of ways, but that would move us way beyond this thread.
What I see is an entire culture taking the "more is better" assumption to be an obvious given, even though that assumption can be undermined with simple common sense.
To me, this illustrates that generally speaking human beings don't typically reference reason, but authority. The main authority being the group consensus. If everyone is saying the emperor is wearing clothes, everyone assumes that he must be wearing clothes. High school kids, grandma at the bake sale, Nobel Prize winning intellectual elites, it seems to make no difference. I guess this is what Crank is trying to tell us.
You're dodging and weaving, my friend. In your defense, you're in very good company.
In one respect, the group consensus is for a limitation of knowledge and power. The Nuclear Nonproliferation Treaty has sought to stop the expansion of nuclear bomb technology beyond the 5 recognized nuclear states: the USSR, USA, UK, France, and China. The effort has not been highly successful. Israel and South Africa presumably developed atomic weapons in collaboration. India and Pakistan both developed nuclear weapons, and more recently, so did North Korea. Iran was well on the way (apparently) to The Bomb.
Limiting the spread of nuclear weapons technology is difficult. No state that said it would develop nuclear weapons has so far failed to do so. (Atomic weapons knowledge spreads; it isn't reinvented. It was discovered only once, during the Manhattan Project. From the US and the UK it passed on to the USSR, France, and China; to Israel and South Africa; then to India and Pakistan, North Korea, and Iran. There are few "secrets" left, but the technological methodology is daunting.)
The manufacture of chemical and biological weapons has also been banned, but the ban can be evaded. No inconveniently noticeable big explosion is required to test chemical and biological weapons.
Quoting Jake
It isn't clear to me how "we" would limit "us" from learning whatever "somebody among us" decides to learn, be it benign or malignant. I can decide what I will not learn, but I don't know of a way to prevent you from learning what you wish to learn. (Well, I know a way, but it happens to be highly illegal and severely punished.)
We are what we are: inquisitive, intelligent, excitement-seeking, short-sighted, selfish (piggish, a good share of the time), poorly self-regulating beings. Worse, we do not have a high-resolution big-picture view of all human affairs. We cannot see everything that is going on, and we are not able to interpret a good share of it. And even when we know that some people are engaging in skullduggery of the worst kind, there is sometimes nothing we can do about it.
Somewhere, right now, somebody is openly engaging in legal research which will likely have quite negative consequences. They are pushing the envelope, maybe too far. What are "we" going to do about it?
That was once a comforting assumption; it's not quite so comforting at the present moment. In any case, presidents long ago began delegating to naval command the power to launch defensively in the event of communication failure or severe time constraints. Fortunately, Pacific Fleet Command, and officers far down the ladder, were prudent, cautious, and careful. They might have launched on warning, but did not. See Daniel Ellsberg's The Doomsday Machine: Confessions of a Nuclear War Planner.
Good point. Yes, pretty much everyone seems to agree there should be fewer nuclear weapons, probably because that's a very dramatic threat which is easy to understand. A box that goes BOOM!, it can be explained to a child.
But as you say, we're not having a lot of luck with getting rid of nukes. To me, this illustrates the relative weakness of intellectual understandings. We all have a pretty detailed intellectual understanding of nukes, but that's not helping us much. What's missing is a sufficient emotional relationship with the threat. We have the data, but we're failing to experience the data as being real.
The best I can offer in terms of disarmament suggestions is that we focus more closely on the near miss mistakes which have in some cases brought us inadvertently close to the edge of war.
As an example, in one case somebody mistakenly loaded a training tape into the NORAD computer, and for a few precious minutes the U.S. government thought it was witnessing an incoming Soviet first strike. In another example, Norway and the U.S. warned the Russians that a research rocket would be launched off the coast of Norway, but somebody on the Russian side forgot to pass the info up the chain of command. Russian generals thought they were observing a U.S. first strike and raced the nuclear football to the usually drunk Yeltsin, telling him he had to launch Russian missiles. Happily, Yeltsin ignored their advice and waited for confirmation.
The Mutually Assured Destruction doctrine doesn't protect us from such screw-ups, so the only way for us to be safe is to get rid of the nukes. When I become Secretary of State I will explain to the Russians that while we have no intention of attacking them on purpose, I can't rule out that we might do so by mistake.
Quoting Bitter Crank
I don't have a solution to this either, other than to hope some dramatic event will radically shift the group consensus making new opportunities possible. The question would seem to be, do we want modern civilization to continue, or not? Yes, or no?
If we answer yes then we would seem to have little choice but to tackle the challenge.
Quoting Bitter Crank
We can prepare the ground. We can spread conversations like this as far as we can, so that when some researcher causes havoc it won't be the first time the group consensus has thought about these subjects.
Even though we in this forum thread are the most brilliant philosophers of all time :smile: we shouldn't assume that because we can't think of a solution to this off the top of our heads that automatically equals there not being a solution.
on purpose, I can't rule out that we might do so by mistake.
You doubt the maturity and judgment of President Dumpster?? You doubt the moral integrity of Vladimir Putin??? Where in the world do you get these wacko ideas Dr. Crank??
Seriously, isn't it amazing? Two of the world's biggest assholes hold the fate of humanity in their grimy little hands. And Dumpster was legally elected, and Putin has wide support among the Russian people.
We are insane beyond words....
It's not something we would say about something like the weather... isn't it insane or amazing that it's raining!
Why do we treat things differently when humans are involved? Because we attribute agency to them, the ability to freely make rational and moral decisions.
I think that view is at best partly true. In fact, that view is often part of the hubris.
When you look at the history of the two countries, and begin to understand the mechanics a bit more, it isn't quite as insane.
A 'strong man' like Putin was needed to hold Russia together after the fall of the USSR. And Trump, well, he's the result of large parts of the population being ignored and not represented politically.
I can largely agree with this. In the spirit of thinking well, again the question would seem to be, do we want to hang on to modern civilization, or not, yes or no? If we answer yes, then having nukes and handing them over to people like Putin and Trump seems fairly defined as insane, in relation to the goal of maintaining the benefits of modern civilization.
Quoting ChatteringMonkey
I prefer to describe Putin as the world's leading gangster, head of a criminal enterprise which is busy soaking the Russian people for every last dollar which can be skimmed off the top and exported to secret bank accounts out of the country.
But you make a good point. Russia has been invaded from the West a number of times, and the last invasion was a horror show beyond our imagination. For example, Russia lost 40 times as many lives as America lost in WWII, and the western part of the country was burned to the ground. And so the Russians rationally choose to be ruled by a very smart gangster over risking political chaos which might invite another invasion. All that said, none of this is going to matter once somebody screws up and the missiles start flying. And what are the chances that nobody in Russia, or here, will ever screw up?
As I see it, Trump is a symptom of what we're discussing in this thread. Modernity is moving too fast, some folks are being left behind, and others are worried they will be next. And so some of us turn to a hyper-confident leader who promises he can take us back to the past when we felt we knew what was going on.
Knowledge explosion => Globalization => Rapid Change => Fear => Trump
The only way out of the prisoner's dilemma would seem to be a supranational legal framework in which all parties are obliged to disarm simultaneously. Some efforts have been made to create such a thing--the League of Nations, and after WWII the United Nations--but I think these have been failed, or at least flawed, attempts. A lot of the time they have been used by nations only to serve their own interests.
There were some design flaws: veto rights, not enough resources and real power behind it to enforce decisions, etc. Maybe one could do something about that, and then that could be a way forward, but in the end it's always still people who have to do it. Still, that has to be the way to deal with some of these problems.
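The disarmament deadlock described above can be sketched as a toy prisoner's dilemma. The payoff numbers below are illustrative assumptions, chosen only to satisfy the dilemma's ordering (being the sole armed power pays best, being the sole disarmed power pays worst); they are not drawn from any real analysis:

```python
# Toy payoff matrix for the disarmament dilemma.
# Key: (my_choice, their_choice) -> my payoff.
# Illustrative values only; the ordering is what matters.
PAYOFF = {
    ("disarm", "disarm"): 3,   # mutual disarmament: safe and cheap
    ("disarm", "arm"):    0,   # unilateral disarmament: vulnerable
    ("arm",    "disarm"): 5,   # sole armed power: dominant
    ("arm",    "arm"):    1,   # arms race: costly standoff
}

def best_response(their_choice):
    """Return the choice that maximizes my payoff, given the other side's choice."""
    return max(("disarm", "arm"), key=lambda mine: PAYOFF[(mine, their_choice)])

# Arming is the best response whatever the other side does, so both
# sides arm, even though mutual disarmament would pay both sides more.
print(best_response("disarm"))  # arm
print(best_response("arm"))     # arm
```

Because "arm" dominates for each party acting alone, the better mutual outcome is only reachable if something outside the game (such as the supranational framework suggested above) binds both players at once.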
Something like what you describe is in the works. Here's a couple of links to provide a quick overview.
https://en.wikipedia.org/wiki/Treaty_on_the_Prohibition_of_Nuclear_Weapons
http://www.icanw.org/status-of-the-treaty-on-the-prohibition-of-nuclear-weapons/
It looks like everybody agrees to the treaty, except those states that have nuclear weapons. :smile: Still, it's movement in the right direction.
As you know, the policy of deterrence is called MAD, mutual assured destruction. What's not factored into this equation is the possibility of unintended launches. Here's an article which describes some near misses.
https://www.newyorker.com/news/news-desk/world-war-three-by-mistake
Nuclear weapons do have the benefit of teaching us a lot about ourselves. The truth is that we don't really care that much about all the work so many generations did to create modern civilization, nor do we care that much about whether future generations will get to enjoy what's been built. We have the goodies right now, and that's pretty much all that matters to us. Sobering, but instructive.
Quoting The New Yorker
MAD assumes that all parties will have technical control of their weapons at all times, an assumption undermined by a series of incidents where such control was lost.
MAD assumes that the players involved are sane, intelligent actors who will make rational calculations. While this is typically true, it's also sometimes not true. For example, it was nuts for Hitler to invade the Soviet Union, as his generals tried to tell him, but Hitler was a high-stakes gambler addicted to the next roll of the dice.
Quoting Jake
NEW ARGUMENT:
Let's consider gun ownership. There is of course a great deal of debate here in the U.S. about what exactly the limits of gun ownership should be. But as far as I know, pretty much no one is arguing weapon ownership should be unlimited. You know, the NRA is not arguing that attack helicopters, surface to air missiles, and nuclear weapons should be legally available at the local Army Navy store.
So within the realm of weaponry we have wide agreement that our access to weapons should be somehow limited, and we are arguing only over the details as to what degree that access should be limited.
This group consensus regarding weapons is compatible with the thesis of this thread, which is that the power available to human beings should be somehow limited. I don't claim to know exactly how it should be limited, or to what degree, I'm arguing only that a "more is better" relationship with knowledge, and thus power, is simplistic, outdated and dangerous.
And, I ask members again to reflect upon the fact that modern science is pretty much defined by a simplistic "more is better" relationship with knowledge, and thus power. The mantra seems to be, if we can learn something, we should learn it.
This "more is better" relationship with knowledge and power really makes no sense, just as selling nuclear weapons at the Army Navy store would make no sense.
Thus, I am claiming that the intellectual, scientific and political elites of our civilization are selling us a theory which makes no sense. The selling of such assumptions is understandable, but not logical. Their intentions are generally good, but their reasoning is not.