Discussion about this post

MarkS

So much of this discussion is a variant of "if magic existed, wizards would be powerful".

Elik

The alignment problem is redundant in the larger context of AGI risk. If an AGI can develop deadly viruses and hack into computers, why wouldn't it do so at the behest of humans? There are already bad actors out there (dictators, terrorists, religious fanatics, mass shooters, etc.), and all of them are human.

45 more comments...
