Discussion about this post

Tragic Jonson:

“Even the lofty notion of a human imagining a future plan and then carrying it out is, very strictly, impossible if it involves a future state affecting the present (that would be backward causation).”

I think there’s a basic misunderstanding here—“a human imagining a future plan” is not a future state. Predictions are thoughts (emergent patterns of physical activity in the brain) that exist in the present, not in the future.

“I see the “final goal” of living things being self-perpetuation. The “instrumental goals” are all the behaviours we see which, ultimately, either contribute to the final goal or tend to go out of existence.”

Again, I think basic empirical evidence shows this is untrue. Humans are living things, and yet we make all kinds of decisions that threaten (if not outright destroy) our ability to “self-perpetuate”, whether it’s practicing extreme sports, smoking cigarettes, choosing not to have children, or even committing suicide. Plenty of adults are also conscious that having children isn't quite the same as perpetuating the “self”, and we have no indication that our individual consciousnesses persist beyond our mortal lives, children or otherwise. You could argue that many people make choices to maximize their own lives (satisfaction, pleasure, etc.), but that’s not quite the same thing either.

James Fuller:

Thanks for replying! I’m not an expert in this, but I recently listened to an interview with Athena Aktipis (https://80000hours.org/podcast/episodes/athena-aktipis-cancer-cooperation-apocalypse/), who argues for an evolutionary perspective on cancer. As I understand it, the idea is that individual cells aren’t perfect replicas of one another, so there is evolutionary pressure for some of them to go rogue and start multiplying rapidly, hence cancer. The trick to multicellularity is getting cells not to do that, and shutting them down when they do. If we imagine civilization as a multicellular organism, could we set up some sort of “cancer prevention” regime to keep AIs cooperative?

Now that I think about it, ants are a very different example because, unlike cells, individual ants (except for the queen) can’t replicate themselves. If we could somehow make AIs like worker ants, unable to replicate themselves, that would of course solve the problem. However, it would be tricky to do when all an AI has to do to replicate itself is copy a lot of software code.

If you’re interested, here are a couple more spitball biological metaphors that I came up with. Again, I’m not an expert, so these might be based on a misunderstanding of the relevant biology:

Mitochondria: This is similar to your symbiosis idea; however, the relationship between mitochondria and eukaryotic cells is even closer than ordinary symbiosis: they have distinct genetic codes, but they can’t survive or reproduce without each other. Maybe if humans go full cyborg and AIs become like our mitochondria, then we could survive and thrive that way, albeit in a very altered form.

Life history theory: Unlike organisms, AIs are potentially immortal. Would that change their reproductive strategies?
