(Replying to PARENT post)
There are a handful of movies about some of the biggest tort cases in the US: Erin Brockovich, Dark Waters, A Civil Action, The Rainmaker. It’s not clear that most of these businesses would exist if the actual cost of their negative externalities were displayed on the accounting ledger with a dollar value.
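To make that concrete, here's a toy sketch of what "putting externalities on the ledger" would mean. All figures and line items are invented purely for illustration: a firm that looks profitable on paper goes into the red once the costs it pushes onto others are booked.

```python
# Toy illustration (all figures hypothetical): a firm's ledger with and
# without its negative externalities priced in.

revenue = 10_000_000          # annual revenue, USD
operating_costs = 8_500_000   # costs the firm actually books

# Externalized costs borne by others -- invented numbers for the example.
externalities = {
    "groundwater_cleanup": 1_200_000,
    "community_health": 900_000,
    "property_devaluation": 400_000,
}

booked_profit = revenue - operating_costs
true_profit = booked_profit - sum(externalities.values())

print(f"Profit as reported:        ${booked_profit:,}")  # $1,500,000
print(f"Profit with externalities: ${true_profit:,}")    # $-1,000,000
```

Under those (made-up) numbers the business only "works" because someone else is paying a third of its real costs.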
(Replying to PARENT post)
Oooh, nice! I wish Baxter had added this idea to Manifold (hard sci-fi novels about "why are we alone in the universe?", among other things).
For those wondering what an uncontrolled paper-clip maximizer is: it's a thought experiment that illustrates instrumental convergence (https://en.wikipedia.org/wiki/Instrumental_convergence):
> Instrumental convergence is the hypothetical tendency for most sufficiently intelligent beings (human and non-human) to pursue similar sub-goals, even if their ultimate goals are quite different.[1] More precisely, agents (beings with agency) may pursue instrumental goals—goals which are made in pursuit of some particular end, but are not the end goals themselves—without ceasing, provided that their ultimate (intrinsic) goals may never be fully satisfied.
> Instrumental convergence posits that an intelligent agent with unbounded but apparently harmless goals can act in surprisingly harmful ways. For example, a computer with the sole, unconstrained goal of solving a difficult mathematics problem like the Riemann hypothesis could attempt to turn the entire Earth into one giant computer in an effort to increase its computational power so that it can succeed in its calculations.
https://en.wikipedia.org/wiki/Instrumental_convergence#Paper...
> Suppose we have an AI whose only goal is to make as many paper clips as possible. The AI will realize quickly that it would be much better if there were no humans because humans might decide to switch it off. Because if humans do so, there would be fewer paper clips. Also, human bodies contain a lot of atoms that could be made into paper clips. The future that the AI would be trying to gear towards would be one in which there were a lot of paper clips but no humans.
Though I disagree: the AI in this analogy is mindless; it's up to the controlling entity with agency to stop it, or to program it to stop at some point. Just like we humans have agency and could decide to stop exploiting our planet's resources until we can do so without destroying it. How that decision is made and implemented is another debate.
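To put that disagreement in code: the difference between the runaway scenario and a benign one is just whether the controller bounded the goal. A minimal sketch with toy quantities (not a claim about how real AI systems work):

```python
# A greedy maximizer converts every available resource into paper clips
# unless its controller programmed in a stopping condition.

def run_maximizer(resources: int, stop_at: int | None = None) -> tuple[int, int]:
    """Convert resources to clips one unit at a time.

    stop_at: an externally imposed target; None models the unbounded goal.
    Returns (clips made, resources left).
    """
    clips = 0
    while resources > 0:
        if stop_at is not None and clips >= stop_at:
            break  # the controlling entity decided when "enough" is enough
        resources -= 1
        clips += 1
    return clips, resources

print(run_maximizer(1_000_000))                 # (1000000, 0): everything consumed
print(run_maximizer(1_000_000, stop_at=5_000))  # (5000, 995000): bounded goal
```

The maximizer itself never "decides" anything; the outcome is entirely determined by whether the entity with agency wrote that `stop_at` line.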
(Replying to PARENT post)
I think we will need a similar shift in the modern age: some religious/philosophical movement with a morality that makes waste and pollution sinful/shameful, in the name of environmental protection. I also think a ‘liturgical’ epistemological system is needed to reframe ‘truth’ and the value of reason. That way, absurd situations like ignoring externalities could be considered sinful, rather than the status quo, where most people do not care that these things make no sense.
This is kind of a reaction to hyper-individualism and cynicism, which I think are the root of most modern problems. When people do not see themselves as part of a whole, they become their own god and stop caring about the commons. Apathy becomes the default.
TLDR: we need a cultural shift where people have greater respect for the environmental/intellectual commons
(Replying to PARENT post)
Get a system rolling that allows real costs to be ignored, then combine that with technological advancement that allows for planetary-scale damage, and maybe it becomes inevitable that nascent Type I civilizations go back to banging rocks together, at best. It's the same result you get from an uncontrolled paper-clip maximizer, just with a lot more moving parts and flourishes.