3 Comments
Mar 21 · edited Mar 21 · Liked by Jamie Freestone

> AGI will, by definition, not need us to further its interests. It can take care of itself.

Well put. You may also be interested in this technical essay, which gets into how AGI's physical needs conflict with our human needs, and how control of AGI is fundamentally limited.

https://www.lesswrong.com/posts/xp6n2MG5vQkPpFEBH/the-control-problem-unsolved-or-unsolvable#How_to_define__stays_safe__

Jun 27, 2023 · Liked by Jamie Freestone

This is one of the few articles about the future of AI I've read that I actually "align" with, if not the only one. You've raised the exact points I've been discussing with everyone in my small circle of friends.

Personally, I fear a Jerry Springer type of AI will become our new "leader". Our generational and political warfare tactics are all the signs I need to understand where our future is headed, sadly.

author

Thanks for reading! Glad to hear I'm not the only one. A Jerry Springer-style AI is a terrifying prospect, but I can see it happening. We don't seem to have very effective methods for combating human populism, let alone one powered by AI...
