Our top researchers and industry leaders have been warning us that superintelligent AI may cause human extinction in the next decade.
If you haven't been following all the urgent warnings, I'm here to bring you up to speed.
Human-level AI is coming soon
It’s an existential threat to humanity
The situation calls for urgent action
Watch this 15-minute intro to get the lay of the land.
Then follow these links to learn more and see how you can help:
The Compendium
A longer written introduction to AI doom by Connor Leahy et al.
AGI Ruin — A List of Lethalities
A comprehensive list by Eliezer Yudkowsky of reasons why developing superintelligent AI is unlikely to go well for humanity
AISafety.info
A catalogue of AI doom arguments and responses to common objections
PauseAI.info
The largest volunteer organization focused on lobbying world governments to pause the development of superintelligent AI
PauseAI Discord
Chat with PauseAI members, see a list of projects and get involved
Doom Debates’ mission is to raise mainstream awareness of the risk of imminent extinction from AGI and to build the social infrastructure for high-quality debate.
Support the mission by subscribing to my Substack at DoomDebates.com and to youtube.com/@DoomDebates. Thanks for watching.