Here are some ways that I think differently about effective-altruism-relevant topics compared to a year or two ago.

I am erring on the side of writing this faster and including more of my conclusions, at the cost of not very clearly explaining why I’ve shifted positions. Instead of saying “I have updated towards thinking X”, I’m going to mostly just say “X” throughout this post.

Posts that explain what you think without justification are dangerous, because they make it really easy to be passive-aggressive and criticize other people baselessly without having to produce actual arguments. So in this post, I've refrained from criticizing the conduct of any organizations or individuals. The closest I come is criticizing GWWC, but that criticism is mostly directed at the EA community as a whole rather than the organization or anyone at it.


In general, I feel I’ve moved more towards the school of thought that thinks we should keep EA weird and small. I’m much more inclined to think that EA should focus on being a really good place for a relatively small group of unusual people to try to be extremely impactful. I think our norms should be optimized for that type of person, at the expense of being as welcoming to the people who basically want to live normal lives and donate ten percent of their income.

I’m concerned about having a norm that you should donate. I am somewhat more opposed to attitudes of frugality and self-sacrifice. I wish EA was closer to master morality than slave morality. I wish that we encouraged an abundance mindset—more like “You should go out and find the thing you can do that is awesome, and strive for really big wins and not worry about the little stuff in the meantime”.

I have a much more negative attitude towards the GWWC pledge than I used to. I think that signing it was a mistake for most young people who signed it. (I think it was less of a mistake for me than for most EAs, because I signed it when I was just about to start a job.) I feel that encouraging the GWWC pledge discourages people from doing direct work or spending risky stretches without a job, both of which are probably more valuable than 10% of a recent grad's salary. I suspect that more than half of the people I know who signed the GWWC pledge while under 25 now believe that they made that decision badly.

The GWWC pledge also cuts against the whole abundance mindset/master morality thing. I think that we should be telling our future selves that we believe in them and we want them to flourish and do great things, rather than that we don’t trust them and we think we need to tie their hands.

I wish people would spend less time talking about small-scale personal consumption ethics like veganism. I think that these topics encourage a type of thinking which is very different from the type of thinking that is most important for EA. I worry that when new EAs run into these arguments, they conclude that the types of arguments used there are the most important types of arguments, and so don't learn the importance of getting a deep understanding of topics relevant to EA.

I am much less concerned about astronomical future suffering than I used to be.

I now think animal rights is less important to the long-term value of the universe than I used to think.

I have a higher opinion of the intelligence and judgment of the staff of CFAR and MIRI than I used to.

I feel very concerned by the relative lack of high-quality discussion and analysis of EA topics. I feel like everyone who isn't working at an EA org is at a massive disadvantage when it comes to understanding important EA topics, and only a few people are motivated enough to develop a really broad and deep understanding of EA without either working at an EA org or being really close to someone who does.

I am worried that the good ideas and smart people in EA are harder to find than they used to be. I fear that smart people who run across EA might have worse first impressions of EA than they used to. This is mostly a problem with our online presence.

I am less enthusiastic about hedonic utilitarianism than I used to be. I don't regret naming my DAF the Hedonium Shockwave Fund, but it's much more of a joke now. I'm more inclined to care about complex value, and I think I might start thinking that the death of humans is terminally bad.