Discussion about this post

Zinbiel

I have not read the book or the original review; I have only read this review of the review.

I can't judge how well Yud has sold his case in this particular book, but I do think one point he makes has come through this review of the review of the book in a somewhat distorted fashion.

The analogy with human evolution tells us that, with a fairly limited optimisation function (spread genes), we get massively unpredictable side effects (science, culture, everything that separates us from other animals running the same basic optimisation algorithm). If we are honest with ourselves, none of us would have predicted that optimising for "how could this molecule make more copies of itself?" would lead to consciousness and religion and all the rest. We might predict fucking and fighting, but not the rest of it.

In the same sense, as Yud notes, we have no idea what might come from attempts to develop AI along particular lines that we think are safe or valuable. Once it is more intelligent than us, which seems a likely outcome, the story will develop in ways we cannot even imagine, much less forecast reliably. The unpredictability will compound when it is the AI that is in charge of alignment, with the freedom and intelligence to review the original alignment goals and potentially replace them with priorities it sees as better, more rational, or more desirable.

Your comments on evolution seem to miss this point, and instead you argue that narrow evolutionary concerns do not map to all the sequelae of the expansion of human intellect. That's not a counter-argument to Yud's claim; it is a supporting argument.

Optimisation algorithms do not, in themselves, define the space of solutions. They provide an incentive to explore that space, with no clear predictability in the final outcome.

David Cruise

Thanks for summarizing all of this. One gets the sense that if they really believed their premise, they would be giving away the book instead of trying to sell bad science fiction. The actual bad news as of today is that governments can execute protesters by drone without the buy-in of humans in the armed forces, which makes it much easier for evil regimes to stay in power.
