Applying Skeptical Scrutiny To Our Relationship With Knowledge

Here’s some good and bad news regarding a possible solution, imho of course.
BAD NEWS: After discussing this for years I’ve come to the conclusion that there’s little chance we will reason our way out of this problem. The “more is better” relationship with knowledge is too deeply rooted in the human condition and human experience.
GOOD NEWS: The “good news” is that revolutionary environments like the knowledge explosion provide not only revolutionary risks, but revolutionary opportunities as well. I’ve put “good news” in quotes because…
It’s only a matter of time until we wake up one day and some city somewhere in the world has been erased by a nuke. This is obviously bad news, except that such an event has the potential to radically shift the cultural group consensus.
Consider the response to 9/11. We responded to 3,000 dead with 2 wars costing trillions of dollars. We dramatically inflated the security apparatus. (PLEASE let’s not debate all this yet again.) A dramatic response happened, that’s all I’m saying.
If we multiply 9/11 by a hundred or a thousand, it’s hard to know what the impact would be on global consciousness. But it seems fair to reason it would be substantial. At the least, the media would talk of nothing else around the clock for years.
The best hope we may have is that some existential power like nuclear weapons goes out of control in a limited manner. When masses of people can see the vast damage with their own eyes, this topic may morph from being an abstract intellectual analysis that's rarely discussed into a pressing real-world concern which is impossible to ignore.
The best we can probably accomplish now is to begin the conversation so that we won’t be starting from scratch on the day after.

Your whole theory relies on assuming levels of stupidity that don't exist.
Which of the following statements is false: a) Science is driving an ever accelerating knowledge explosion. b) This explosion will give us greater and greater powers, at a faster and faster rate. c) Some of these powers will be capable of collapsing modern civilization. d) Sooner or later we’ll lose control of at least one of these large powers. e) When that happens the accomplishments of science will be largely erased.
Nicholas Maxwell seems to share your concerns and suggests we should focus on “more" wisdom
John, this was an excellent reference for this thread. I've joined his listserv, and am trying to contact Maxwell to invite him into this thread. His mail is bouncing back at the moment, but I'll look for another contact method. Perhaps I can contact him via the listserv; we'll see. For those who missed the link, here it is again: http://www.ucl.ac.uk/from-knowledge-to-wisdom
Your whole theory relies on assuming levels of stupidity that don't exist.
Which of the following statements is false: a) Science is driving an ever accelerating knowledge explosion. b) This explosion will give us greater and greater powers, at a faster and faster rate. c) Some of these powers will be capable of collapsing modern civilization. d) Sooner or later we’ll lose control of at least one of these large powers. e) When that happens the accomplishments of science will be largely erased. I didn't argue with those, I argued with the two points you made after this, and you didn't respond. The above are true, but that doesn't lead to the conclusion that we should stop learning or stop sharing knowledge or whatever it is you are saying.
Nicholas Maxwell seems to share your concerns and suggests we should focus on “more" wisdom
John, this was an excellent reference for this thread. I've joined his listserv, and am trying to contact Maxwell to invite him into this thread. His mail is bouncing back at the moment, but I'll look for another contact method. Perhaps I can contact him via the listserv; we'll see. For those who missed the link, here it is again: http://www.ucl.ac.uk/from-knowledge-to-wisdom
Where in here does it say what you are saying? This is about being wise about science, not about limiting knowledge.
I didn't argue with those, I argued with the two points you made after this, and you didn't respond.
I'm deliberately not replying to posts from anybody that have the potential to start a chain reaction of distracting personality conflicts, the plague of all forums. That's all I'm going to say about it.
The above are true, but that doesn't lead to the conclusion that we should stop learning or stop sharing knowledge or whatever it is you are saying.
You seem intent on rejecting "whatever it is I am saying". If you're not sure what I'm saying, why not ask me to clarify specific points? That's why I keep posting the list of assertions, to try to help members identify exactly where it is they disagree with this thesis, if they do. If you agree with the listed assertions, what conclusions do you draw from them?
Where in here does it say what you are saying?
Where do I claim he says what I'm saying?
Your whole theory relies on assuming levels of stupidity that don't exist.
The human race has thousands of nuclear missiles aimed down its own throat, enough to destroy everything which has been constructed over hundreds or even thousands of years. These missiles are on hair-trigger, launch-on-warning status, so that decision makers have only a few minutes to evaluate a situation before hitting the launch button. In addition, this imminent ultimate catastrophe predicament is rarely discussed. For example, it was barely touched on in the recent presidential campaign. Anyone who does discuss it for more than a few minutes is typically branded as hysterical. In summary, the human race is like a man who walks around with a loaded gun in his mouth all day, every day, for years, and who doesn't find that situation worth mentioning. If anyone brings it to his attention, he shrugs and changes the subject to sports, the latest celebrity breakup, or any of the many other topics on this forum. So, you're right. We aren't stupid. We're insane.
So let's identify those thinkers who have thought it through to the logical conclusion. I don't claim they don't exist, and it seems likely that they do. But who are they? Other than Ted Kaczynski.
Wow - it shocks me that you have to ask that. Before saying something like that, I would most certainly have googled phrases like "the value of science" and "the danger of science". Henri Poincare even wrote a book entitled "The Value of Science" https://ebooks.adelaide.edu.au/p/poincare/henri/value-of-science/chapter10.html#pg321 and Richard Feynman also wrote an article with the same title http://www.wegerscience.com/documents/thevalueofscience_article.pdf. Feynman took part in the Manhattan Project, so the issue was very much on his mind. Stephen Hawking has received much recent attention for his concern about the dangers of new developments in science and technology. Michael Servetus and Giordano Bruno were 16th-century scientists who chose to die for their beliefs in truth rather than accept the religious demands of the time. Galileo also chose to fight for truth and suffered because of it. I'd guess they thought very seriously about the value of knowledge. In earlier times, Plato raised the issue in the Socratic dialogue Meno, as did many other philosophers. I'm not sure they would all be thrilled to be placed in the same category as a psychopath. Feynman credited Buddhism with the observation "To every man is given the key to the gates of heaven; the same key opens the gates of hell," which sums up the dilemma you ponder and suggests it has been an issue since antiquity.
That's the status quo argument, which I claim is outdated.
Outdated, or has it just defied improvement? Given the catalog of thinkers from antiquity to the present day who have addressed the issue and never reached a consensus, outdated seems rather inaccurate - especially given the enthusiasm you express about Maxwell. Sorry you're having trouble contacting him. He will be 80 in a few weeks and may be slowing down. He could have cancelled his UCL email. Is that the one you tried?
Listing all the challenges to this thesis does not solve the problem.
It helps to know what you face. Do you think dreaming up all sorts of doomsday scenarios helps - other than providing a focus for the additional knowledge we need to develop to overcome them? Hawking raises potential problems but notes "We are not going to stop making progress, or reverse it, so we must recognise the dangers and control them," and recommends steps such as working to colonize other planets.
d) Sooner or later we’ll lose control of at least one of these large powers. e) When that happens the accomplishments of science will be largely erased.
I don't think either of these is necessarily the case. You could also add, between d) and e), and with certainty: sooner or later we'll be faced with an existential threat that we cannot currently defeat. How should we deal with that certainty?
Where in here does it say what you are saying?
Where do I claim he says what I'm saying? Why else would you cite it? You seem to be avoiding developing a theme.
Wow - it shocks me that you have to ask that.
I'm shocked that you are shocked. :-) When did I claim to be an academic? The fact that I've published my thesis on a tiny net forum should be a clue that I don't teach at Harvard. :-) Some of your links have proven useful, so thank you for that. However, it's not going to be possible for me to read every book or article that generally references science in some way. If you've read these books, can you help us narrow the focus to those writings which address the thesis of this thread directly? Which writers support or challenge the thesis that our "more is better" relationship with knowledge will lead to the collapse of civilization?
Feynman credited Buddhism with the observation "To every man is given the key to the gates of heaven; the same key opens the gates of hell," which sums up the dilemma you ponder and suggests it has been an issue since antiquity.
Yes, everybody always says things like this. It seems important to readers to prove that my thesis is not original, which is fine with me; I don't care whether it's original or not. I have no career or reputation to promote or defend. But again: which specific writers support or challenge the specific thesis that our "more is better" relationship with knowledge will lead to the collapse of civilization? If you or anybody else should know of any such writers, please quote some of the relevant parts of their work here so we may discuss it. I'm not an academic, but for 50 years I've been an avid watcher of shows like Charlie Rose and Netflix documentaries, and I hear pretty much every show NPR produces. I spend hours a day educating myself. I have yet to hear or see a single thought leader address this proposed threat directly. Just like on this forum, a million other topics are considered more important. Again, my point is that if this thesis is correct (debatable), then none of these million other topics really matter at all, as all of them will be swept away in a coming collapse. For example, if we are to have a major nuclear war sometime in this century, what is the point of any of the science research being done today? What is the point of all the other topics being addressed here on the forum and throughout our culture?
Outdated, or has it just defied improvement? Given the catalog of thinkers from antiquity to the present day who have addressed the issue and never reached a consensus, outdated seems rather inaccurate -
How could thinkers prior to 1945 have addressed the subject of existential scale powers, which is essential to the thesis presented in this thread? The vast majority of the writers you are referring to were commenting on the old era, characterized by conventional powers. It isn't possible to crash global civilization with conventional powers, so such discussion doesn't address the topic of this thread.
especially given the enthusiasm you express about Maxwell. Sorry you're having trouble contacting him. He will be 80 in a few weeks and may be slowing down. He could have cancelled his UCL email. Is that the one you tried?
I wrote to the email address listed on his site. It looks like that gets forwarded to another address (demon.co.uk), which then bounces. I'm guessing he may have created that site some time ago and lost interest due to lack of participation? Or, as you say, perhaps he's just retired. My plan for now is to wait and see if anything comes via his listserv, and if not, perhaps I'll post something there.
Do you think dreaming up all sorts of doomsday scenarios helps - other than providing a focus for the additional knowledge we need to develop to overcome them?
Well, it's a proven, well-documented fact that a global nuclear war could happen at any moment without warning. If that's not worth discussing, I'm not sure what is.
Hawking raises potential problems but notes "We are not going to stop making progress, or reverse it, so we must recognise the dangers and control them," and recommends steps such as working to colonize other planets.
Right, this is what I mean by "outdated". I've already agreed that we can likely control most existential scale powers most of the time, but... PLEASE NOTE: that doesn't matter at all unless we can control EVERY existential scale power ALL of the time. Hawking is a genius, but not on this topic. He's still thinking by the rules of the past, the era of conventional powers. He hasn't grasped that the era of existential scale powers has different rules. In the conventional power era we had room for error; we could make mistakes and learn from them. The existential power era erases the room for error, erases the ability to make mistakes, erases the opportunity to learn, adjust, and rebuild. One bad day = game over. Migrating to other planets simply transfers this problem to other planets. We set up a colony on Mars. Ten years later somebody brings some nukes to Mars. Nothing accomplished; a repeat of the same problem in a different landscape.
d) Sooner or later we’ll lose control of at least one of these large powers. e) When that happens the accomplishments of science will be largely erased.
I don't think either of these is necessarily the case.
Ok, why? How will we successfully manage every existential scale power every day forever? Human history is defined by a consistent pattern: everything goes along pretty well for a while, and then every so often we go totally batshit crazy, with all-out wars using every available weapon. This pattern has been repeated over and over again since the invention of agriculture. How can we have an evidence-based examination of these issues if we ignore such a longstanding pattern?

I hope this helps. We might be looking for thought leaders who make arguments like the following:

  1. The thought leader might argue that there’s no point in doing any further research because whatever we learn will be swept away in a coming collapse, or…
  2. They might argue that there won’t be a coming collapse because we will successfully manage existential scale powers by method XYZ, or…
  3. They might argue that we don’t yet know if we can manage existential scale powers, but we could find out by redirecting all efforts to the challenge of removing the threats posed by nuclear weapons and climate change, or…
  4. They might argue that all research not directed at nuclear weapons and climate change should be stopped until we prove that we can solve the existential scale problems we’ve already created, or…
  5. They might acknowledge the threat, but argue we should continue to push forward on all knowledge fronts and hope for the best.
    And so on…
    The key element the thought leader should address is the threat posed by existential scale technologies, i.e., any power with the ability to crash modern civilization.
  • If the thought leader was writing before 1945, but predicted the emergence of existential scale powers and then commented upon them, they would be a candidate for our list.
  • If the thought leader was writing at any time, and doesn’t specifically reference the concept of existential scale powers (they don’t have to use that term) we should set them aside.
  • We’re looking for thought leaders who grasp the difference between the conventional scale power realm (we can survive mistakes) and the existential scale power realm (mistakes are fatal). If a thought leader addresses this specific subject, any perspective they might have would seem to be a useful contribution.
    Hope that helps focus our conversation a bit.
You seem to be avoiding developing a theme.
See post #51. Perhaps that helps?
You seem to be avoiding developing a theme.
See post #51. Perhaps that helps? No, it doesn't. You are complaining that no one is specifically addressing your general problem. When anyone gets specific, you say they are doing it wrong, but when I specifically stated there are threats other than nuclear war, you went on a rant about that. You say you are listening to people address the problems of the modern world and that you don't hear people addressing this problem. How would you suggest we address a problem of knowledge, except with more knowledge? We can't un-know how to make bombs. We can learn about each other, learn to co-exist, understand dangers, improve standards of living to take away the need for war, and lots more.
How would you suggest we address a problem of knowledge, except with more knowledge?
By developing more knowledge about how to better manage knowledge. By "better managing" I mean something more sophisticated than a simplistic "more is better" formula. Already explained above many times. Please 1) read the thread and 2) dial back the emotion, or 3) join another thread which you find to be more credible. Thanks.
it's not going to be possible for me to read every book or article that generally references science in some way. If you've read these books, can you help us narrow the focus to those writings which address the thesis of this thread directly? Which writers support or challenge the thesis that our "more is better" relationship with knowledge will lead to the collapse of civilization? ........ I'm not an academic, but for 50 years I've been an avid watcher of shows like Charlie Rose and Netflix documentaries, and I hear pretty much every show NPR produces. I spend hours a day educating myself. I have yet to hear or see a single thought leader address this proposed threat directly.
No time to participate in academic inquiry, but plenty to sit passively in front of the gogglebox and fill your head with opinions and even propaganda from individuals most wouldn't rank as the world's greatest thinkers. Your stance on knowledge probably reinforces this: if knowledge is bad, you don't want to use it, and you attempt to demonstrate that you can address your issue even though you avoid knowledge. Your approach seems to be to pick a sound bite, "more is better", suggest that it hasn't been mentioned specifically in connection with a specific field of inquiry, and then use that omission to discredit any attempt to suggest that it is implied in the works and even actions of many intellectuals addressing that issue. Gimme a break!

John,
Thank you for characterizing my thoughts on the subject, but please allow me to remind you that characterizing an argument, and addressing that argument, are not at all the same thing. And characterizing any argument is obviously not by itself a debunking of that argument.
Let’s get back to the actual thesis.

  1. If the thesis of this thread is correct, we should be trying to involve others in the discussion with the goal of coming up with constructive ways to address the threat.
  2. If the thesis is not correct, somebody should try to explain why. I don’t hear such an explanation yet.
    I don’t know of a way to successfully manage every existential scale power that will emerge. Do you John? This isn’t a debating challenge, but a sincere question. If you, or anybody else reading, knows of a way to do this please share it.
    If we don’t know of a way to successfully manage every existential scale power that will emerge, could we please just admit that so the conversation can proceed? It might proceed in the direction of looking for others who claim they have a solution to this problem.
    I’m not claiming there is no solution. I’m claiming I don’t know of one, and don’t know of anyone else who does either. If anyone here can fix that, please do!

I think you are right to point out the disproportionate way that knowledge and technology have outstripped human development, and the problem of us having to adapt to a very different world from even a few hundred years ago. We are still essentially the same animal, biologically, as we were in the Ice Age, so there exists a basic ‘disconnect’ between the conditions we evolved in and the modern world. We still possess destructive emotions that can get out of control at times, and those in authority have a huge potential to cause harm to many others on a large scale. So ‘more’ in this context could very well be a negative thing, but it can also be a positive one. What can be done about it seems problematical to me, since everyone will have their own ideas about it and not every nation enjoys the freedom of speech to discuss it. But I think the genie is now out of the bottle and I don’t really see how it can be put back. It’s all very complex and no one answer would suffice, usually because of the conflict between vested interests, so maybe it will take a natural catastrophe such as a giant meteor hit or a deadly pandemic or some other catastrophic event to make humankind come to its senses and adopt a more co-operative way of life.

How would you suggest we address a problem of knowledge, except with more knowledge?
By developing more knowledge about how to better manage knowledge. By "better managing" I mean something more sophisticated than a simplistic "more is better" formula. Already explained above many times. Please 1) read the thread and 2) dial back the emotion, or 3) join another thread which you find to be more credible. Thanks. How about dialing back the tone policing? You haven't done anything but suggest priorities. Nothing wrong with that, especially since I like the priorities you are choosing. But why not just get on with it, instead of making some other untenable suggestion, like stopping all other efforts to gain knowledge elsewhere and concentrating on your priorities? What about dealing with starving children? Or death from measles? Why can't we do those things AND work on global warming?

Hi webplodder, welcome to the conversation.

I think you are right to point out the disproportionate way that knowledge and technology have outstripped human development and the problem of us having to adapt to a very different world from even a few hundred years ago.
Yes, that's it, we're attempting to apply a world view that worked well in the past to an emerging age with a profoundly new factor, existential scale powers.
So 'more' in this context could very well be a negative thing, but it can also be a positive one.
Agreed. The knowledge explosion offers great opportunities too, and not just great dangers. The problem being referenced in this thread is that we are now in a position where a single failure can erase the opportunities. That was never true pre-1950s, but it is now.
What can be done about it seems problematical to me since everyone will have their own ideas about it and not every nation enjoys the freedom of speech to discuss it.
Agreed again.
But I think the genie is now out of the bottle and I don't really see how it can be put back.
To use an example, it seems the knowledge of how to make nuclear weapons cannot now be erased from human consciousness. But it would be possible to get rid of nuclear weapons; Reagan and Gorbachev seriously considered that back in the '80s. And we are making some progress towards addressing climate change. So while it may not be possible to solve the entire knowledge explosion threat, it does seem possible to identify and address the most immediate and dangerous challenges. To me, it seems nuclear weapons should be at the top of that list.
It's all very complex and no one answer would suffice, usually because of the conflict between vested interests, so maybe it will take a natural catastrophe such as a giant meteor hit or a deadly pandemic or some other catastrophic event to make humankind come to its senses and adopt a more co-operative way of life.
Yes, I think that's it. I doubt anyone is going to take this seriously until we can see the damage from an out-of-control existential scale power with our own eyes. A terrorist nuke attack on a single city may be what turns the tide. I know this sounds sick, but we might almost hope for that, because it might be what is necessary to avoid the far more damaging nuclear war between major powers. It's kind of like how we typically need a toothache to get us to the dentist. Pain has often been a more effective teacher than reason.