Donald Hobson




Small signs you live in a complacent society

Have you considered printing off a few sheets of paper, getting some glue, and just adding a few signs yourself? ;-)

Why slow progress is more dangerous than fast progress

Some technologies, like seatbelts, are almost purely good. Some, like nukes, are almost purely bad. And some, like cars, we might want to hold off on using widely until we develop seatbelts and traffic lights. It depends on the technology.

Philosophy in Space

Elon Musk is very good at making himself the center of as many conversations about technology as possible.

He should not be taken as a reliable source of information.


Living on Mars with tech not too far beyond current tech is like living in Antarctica today. It's possible, but it isn't clear why you would want to. A few researchers on a base, not much else.

Think ISS but with red dust out the windows. 

At some point, which might be soon or not so soon, tech will be advanced enough that getting to Mars becomes easy. But at that point, putting traditional biological humans on Mars might be stupid compared to, say, self-replicating robots containing computers running uploaded human minds in the asteroid belt.

A Mars base is cool sci-fi. But it might turn into the largest white elephant in history. It doesn't serve any obvious practical purpose in increasing human wellbeing or industrial capability.

Sure, at some point you are disassembling all the planets to build a Dyson sphere. But before that, a Mars landing doesn't actually have to represent any real progress.

Will Technology Keep Progressing? (A Happier World video)

I don't buy the "aside from the internet, nothing much" claim. Firstly, computer and internet tech have been fairly revolutionary across substantial chunks of industry and our daily lives. The claim reflects "a smartphone is only one device, so it doesn't count as much progress" thinking, which ignores the great pile of abacuses, slide rules, globes, calculators, alarm clocks, puzzle toys, landline phones, cameras, cassette tapes, and so on that the smartphone replaced and improved on.

Secondly, there are loads of assorted techs that were invented recently: solar PV, LEDs, mRNA vaccines, electric (self-driving?) cars.

And finally, a substantial part of progress is the mass of tiny changes that make things cheaper and better. If you don't count things like 3D printers and drones that haven't really gotten good yet, then of course you will see fewer recent inventions. The first fridges were expensive and not that good either.

Against Altruism

If longtermists existed back when blacks were widely regarded as morally inferior to whites, would the moral calculus of the longtermists have included the prosperity of future blacks or not? It seems like it couldn't possibly have included that. More generally, longtermism can't take into account progress in moral knowledge, nor what future generations will choose to value. Longtermists impose their values onto future generations.

It is true that we can't predict future moral knowledge. However:

  1. An intervention by someone from that time period that helps modern whites and doesn't harm modern blacks would still be seen as better than doing nothing from the point of view of most people (excluding the woke fringe). Most random interventions selected to help future white people are unlikely to cause significant net harm to blacks.
  2. If their intervention is ensuring that we are wealthy and knowledgeable, and hence more able to do whatever it is we value, then that intervention would take into account progress in moral knowledge.
  3. In reality, you have to choose to do something. When making decisions that affect future generations, either you impose your current values, or you try to give future generations as much flexible power as possible to allow for moral progress, or you basically pretend they don't exist.

This is an interesting new combination of standard mistakes.

Another issue is that if altruistic morality is taken to its logical conclusion, then everyone would be trying to solve everyone else's problems. How could that possibly be more effective than everyone trying to solve their own problems?

Altruistic morality in the total utilitarian sense would recognize that solving everyone's problems is equally valuable, including our own. In the current world, practically no humans are going to put themselves lower than everyone else, and most of the best opportunities for altruism involve helping others. But in the hypothetical utopia, people would solve their own problems, there being no more pressing problems left to solve.

If we are here to help others, what on Earth are the others here for?

Well, imagine the ideal end goal, if we develop some magic tech: everyone living in some sort of utopia. At that point, most of the altruists would say that there is no one in the world who really needs helping, and just enjoy the utopia. But until then, they help.

What we actually need to be is selfish, not altruistic. We need to make as rapid progress as possible so that the people of the future themselves will be at a starting point where they can make even more rapid progress. 

An altruist argument for selfishness. You are arguing that selfishness is good because it benefits future people.

If you were actually selfish, you would be arguing that selfishness is good because it makes you happy, and screw those future people, who cares about them.

I also don't know where you got the idea that selfishness equals maximum progress.

Suppose I am a genius fusion researcher. (I'm not.) I build fusion reactors so future people will have abundant clean energy. If I were selfish, I would play video games all day.

Altruism is subordinating one's own preferences to those of others. It’s a zero-sum game. It's not win-win.

In the ideal utilitarian hypothetical utopia, who exactly is losing? If hypothetically everyone had the exact same goal, the wellbeing of humanity as a whole, valuing their own wellbeing at exactly the same level as everyone else's, that would be a zero-difference game, the exact opposite of a zero-sum game.