2025-01-13
reactionary
Alex Kaschuta interviews Hunter Ash, which coincidentally touches on two things I’ve been thinking about recently: the utilitarian implications of evolution, and Lee Smolin’s cosmological natural selection. I think Hunter takes the opposite position to Lionel Page too far, with the claim that evolutionary trends are by default good and should be helped along. That being said, it’s pretty clear why he does this: similar to how Eneasz Brodski had an idea of building a rationalist religion, it’s striking how well-suited cosmological natural selection is to the creation of an e/acc religion. If what is evolved is natural, and what is natural is good, you now have a moral duty to accelerate technological progress and create as many black holes as possible, as soon as possible. But I think it’s more nuanced than that: it’s rather that evolution has a tendency to create mutually beneficial systems, where each party pursuing its natural interest leads to synergistic effects. For example, if we get to the point of creating artificial black holes, we don’t need to think too hard about what the optimal parameters are: whatever works best for our use cases, whether that’s optimization for low activation energy, high efficiency, long duration, high initial energy production, or anything else, is all we need to care about. If we are the mitochondria of the universe, we can pursue our own personal utilities even if they might be suboptimal globally speaking, trusting the universe to manage its own needs.
This is why I personally see this as an umbrella for all rationalist-adjacent groups; it’s actually very compatible with EA: firstly, because the mere fact that altruism exists implies that it is useful for the evolutionary goals of the universe in some way; secondly, because it should increase how seriously you take AI doomer scenarios and the importance of alignment, since it’s very easy to see why an AI would need to create large numbers of black holes, while it’s less clear why humans might need so much energy (maybe interstellar travel, or bad Malthusian scenarios). In this way, e/acc and EA could be the spear and shield: one pushing universe creation forward, and the other ensuring that the universes being created are still actually worth creating.
Podcast by Galaxy Digital on the eventual tokenization of everything. It’s interesting because this was more or less the blockchain bull case in 2021: the idea that there would be a sort of Negroponte switch in which the backend of the financial system completely transitioned over, with something like FTX mediating for end-users. Obviously that is not the way. But given the possible ascendancy of the tech-right and the shift in the power center from NY to SF, I think Balaji-style dissatisfaction with the financial system, and alternative visions for it, could become increasingly relevant. The scenario that seems most likely is something like the splitting of traditional banking into narrow banking and private equity (for savings and investment respectively), and using stablecoins built on the former to gradually migrate the latter.
There’s an interesting tendency for “what are the implications” over-analysis to lead to counter-articles stating that such analysis isn’t necessary. I wonder if Finspan will spawn any such phenomena.

