15-Minute Intro to AI Doom

It's time to worry about the end of the world.
Our top researchers and industry leaders have been warning us that superintelligent AI may cause human extinction in the next decade.

If you haven't been following all the urgent warnings, I'm here to bring you up to speed.

  • Human-level AI is coming soon

  • It’s an existential threat to humanity

  • The situation calls for urgent action

Watch this 15-minute intro to get the lay of the land.

Then follow these links to learn more and see how you can help:

  • The Compendium
    A longer written introduction to AI doom by Connor Leahy et al.

  • AGI Ruin — A list of lethalities

    A comprehensive list by Eliezer Yudkowsky of reasons why developing superintelligent AI is unlikely to go well for humanity

  • AISafety.info
    A catalogue of AI doom arguments and responses to objections

  • PauseAI.info
    The largest volunteer org focused on lobbying world governments to pause development of superintelligent AI

  • PauseAI Discord
    Chat with PauseAI members, see a list of projects and get involved


Doom Debates’ mission is to raise mainstream awareness of imminent extinction from AGI and build the social infrastructure for high-quality debate.

Support the mission by subscribing to my Substack at DoomDebates.com and to youtube.com/@DoomDebates. Thanks for watching.

Doom Debates
Urgent disagreements that must be resolved before the world ends, hosted by Liron Shapira.