Yes, I think fast progress is dangerous, because human beings are limited in how quickly they can adapt to change. Fast progress is also dangerous because it further empowers violent men to crash the entire system.
The future of this civilization will be decided by our relationship with knowledge. Just as animals have to adapt to a changing environment or die, our relationship with knowledge has to adapt to a changing environment too.
Currently we're operating from a "more is better" relationship with knowledge, a philosophy left over from the 19th century. That philosophy made sense in the long era of knowledge scarcity, an era we no longer live in. Today we live in a time when knowledge is exploding in every direction at an accelerating rate, a revolutionary new era very unlike the era of knowledge scarcity that came before. New conditions require new thinking.
The idea that we should be generating more and more knowledge so as to obtain more and more power represents an immature understanding of the human condition. Such an assumption is bad engineering, as it doesn't take into account the limited nature of human ability.
Do we want to have media that contributes to a better future? Do we want to fuel content grounded in reason, logic, and common sense?
Focus on nuclear weapons.
There is no other factor within human control that can so quickly and so decisively end our hopes for a better future. The vast majority of other subjects being discussed in "constructive journalism" are largely a dangerous distraction from that which will decide our future.
Happily, we seem to be emerging from climate change denial, and now pretty much the entire population is alert to this danger, and receptive to plans to address this challenge. Unhappily, nuclear weapons denial disease remains rampant, pervasive, and durable, even at the very highest levels of our society.
This claim will now be disputed in the following comments. The debate may be interesting for a few days, but then it will become boring, and we'll drop right back into nuclear weapons denial, sweeping it back under the rug so as to return our focus to sexier topics like AI. And this is the mechanism by which the brighter future you dream of will be destroyed.
Given the pervasive nature of nuclear weapons denial, every mention of these weapons in any media is an act of constructive activism. It's not necessary to agree with any particular point of view. Just say the words "nuclear weapons" wherever you can, and you're making a constructive contribution.
The population of Florida is now seven times larger than it was when I was born in the early 1950s. A thousand people move here every day. Florida is still a place of incredible beauty...
...but in 50 years it will likely look a lot like New Jersey. I'm happy to report that I will be dead then, and won't have to witness the destruction of one of the most wonderful places on Earth.
Well, nobody claimed that poor countries are safe and secure. The claim is that high technology countries are not safe and secure, and that speeding up the knowledge explosion will make them ever less safe and secure.
Trying to understand the dynamics of progress is great. If we are assuming without questioning that speeding up the knowledge explosion should obviously be our goal, then we have not yet understood the dynamics of progress.
What we are witnessing is an engineering failure of historic proportions. That is, we are failing to take into account all relevant factors in our design of this technological society. We love the story that we are brilliant, so we cling to that, willfully ignoring that we are instead a very immature culture bordering on insane. What other word should we use to describe someone who has a loaded gun in their mouth and is bored by the gun?
"For a number of reasons, there is no broad-based intellectual movement focused on understanding the dynamics of progress, or targeting the deeper goal of speeding it up."
Can you please explain why the goal should be to speed up the knowledge explosion?
We already have thousands of massive hydrogen bombs aimed down our own throats, an ever-present existential threat that we typically consider too boring to bother discussing, perhaps because we haven't the slightest clue how to rid ourselves of these weapons. And so we're ignoring that threat while we race to develop AI and genetic engineering as fast as possible, potential new existential-scale technologies which we also have no idea how to make safe.
Is this evidence of a species that is mature enough to benefit from ever more, ever greater powers, delivered at an ever greater pace, without limit?
Yes, culture can improve, and has. Is our morality more effective? That's a tricky one.
Consider that we have thousands of massive hydrogen bombs aimed down our own throats, and we generally find this ever-present existential threat too boring to bother discussing. It seems we have a ways to go yet in achieving effective morality.
I think we're basically agreeing that culture can both improve and deteriorate. The history of modern Germany perhaps offers one example of that. High culture, to primitive barbarism, and then back to high culture.
Hi Roger, I agree with your comments. Yes, there has been important progress within the content of thought. But because that kind of morality is just ideas, it's not permanent or durable. It can change quickly based on particular local circumstances. It is of course nonetheless an important project to keep working on.
Here's an example which may add to what we're exploring.
To my knowledge, every ideology ever invented has inevitably subdivided into competing internal factions. The universality of this experience suggests the source of the division is something that all ideologies have in common. This cannot be their content, for the content of ideologies varies widely. What all ideologies have in common is what they're all made of: thought.
And so we see many very different ideologies all follow a similar path of internal division, due to the nature of the medium in which all the ideologies exist.
I liked this quote from the Edge article...
"With more powerful technologies such as nuclear weapons, synthetic biology and future strong artificial intelligence, however, learning from mistakes is not a desirable strategy: we want to develop our wisdom in advance so that we can get things right the first time, because that might be the only time we’ll have."
It reassures me to find others writing on this subject, and making this point specifically. The issue of scale changes the progress equation in fundamental ways, erasing the room for error we've always counted on in the past.
We are required to defeat ALL existential threats, every one, because a single failure a single time with a single threat may be sufficient to bring the entire system crashing down, making all other successes irrelevant. When we see existential threats in this holistic manner, it becomes clear that dealing with particular threats one by one is a loser's game, and our focus should instead be on the process generating all the technological threats: the knowledge explosion. I would define that shift of focus to be an act of wisdom.
As an example, if you get puddles all around your house every time it rains, the solution is not to focus on managing the pots you use to catch this or that drip. The wise solution is to go to the source of the problem: get up on the roof and fix the leaks.