For about five years, I was swept up into a cult-like ideology called “Rationalism”, via the writings of cult leader Eliezer Yudkowsky on his website Less Wrong. I’m ready to talk about it.
Despite my best efforts I overheard some conservatives on a show talking about… how “Gaia Theory and environmentalism” are “suppressing ambition” and somehow this destroys manliness and leads to the end of Civilization. And I just—
I really wish people would make an effort to understand the concepts they disagree with, with even a tiny bit of good faith.
What could be more ambitious (or manly) than taking responsibility for the climate? I don’t even know if we have what it takes tbh.
@futurebird
'Terraforming earth' would get a lot more people interested, but it would also be like shit for flies to the #ExistentialRisk / #Longtermism / #LessWrong / #TESCREAL crowd. You'd almost immediately see it co-opted to advocate for building a bunch of fusion-fueled domed cities populated by white people designing a brave new future filled with computronium-maximizing nanoassemblers & simulated humans living simulations of lives optimized for maximum neo-Utilitarian 'happiness.'
#LessWrong-ism leads to thinking GPTs get more creative when they're run 'hot' (at a high sampling temperature). #WisdomAccelerationism (a play on #EffectiveAccelerationism, or maybe just #Accelerationism - taking either one seriously enough to fix it by substituting 'wisdom' is a little concerning, especially given the apparent belief that GPTs are 'creative.')
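For context on the 'hot' pun: GPT-style samplers have a temperature parameter that divides the model's logits before softmax, so a higher temperature flattens the output distribution and makes sampling more random (which is often read as 'more creative'). A minimal sketch of that mechanism, with made-up logits (the function name is my own, not any library's API):

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Divide logits by temperature, then softmax.
    Higher temperature -> flatter, more random distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # hypothetical next-token scores
cool = softmax_with_temperature(logits, temperature=0.5)
hot = softmax_with_temperature(logits, temperature=2.0)
# 'Hot' sampling spreads probability mass more evenly across tokens,
# so the top token is less dominant than in 'cool' sampling.
assert max(hot) < max(cool)
```

Whether a flatter distribution counts as 'creativity' is exactly the belief being mocked here: the math only makes the output less predictable, not wiser.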