Discussion about this post

Remmelt:

> AGI will, by definition, not need us to further its interests. It can take care of itself.

Well put. You may also be interested in this technical essay, which gets into how AGI's physical needs conflict with our human needs, and how control of AGI is fundamentally limited.

https://www.lesswrong.com/posts/xp6n2MG5vQkPpFEBH/the-control-problem-unsolved-or-unsolvable#How_to_define__stays_safe__

Glen Anderson:

One of the few articles about the future of AI I've read that I actually "align" with, if not the only one. You've raised the exact points I've been discussing with everyone in my small circle of friends.

Personally, I fear a Jerry Springer type of AI will become our new "leader". Our generational and political warfare tactics are all the signs I need to understand where our future is headed, sadly.

