All Humans Are Very Stupid
Ironically, it's the stupidest people who have the best intuitive sense of how stupid all humans are.
When you see a human genius at work, you're not seeing them soar through concept-space like a condor. 🕊️
You're actually seeing them flap around concept-space like a chicken. 🐓
None of us is equipped with a proper set of cognitive "wings". Yet we think we're equipped to dominate the cognitive sky, because we've only known the sky to be empty except for a few of us low-flying chickens. Chickens who imagine themselves as condors.
You're an ape species where some of the members (again, I point out: NOT EVEN ALL MEMBERS) got a bonus special ability to do a laborious version of broad-domain reasoning. It's a kludge that barely works, like flying on chicken wings, or playing the piano with your feet.
Ironically, the stupidest humans are the ones already wise about this. They correctly understand, intuitively, what it'll be like soon when humanity is NOT ABLE TO KEEP UP with even moderately smarter intelligences.
It’s been alarming to see how many smart technologists are too arrogant and unimaginative to fathom how outclassed they're about to be in every mental faculty.
I urge you to put yourself in the shoes of the stupid people in your life, instead of only comparing yourself to your smart peers. Then you can get an empathetic sense of how terrifying it'll be when every human genius is suddenly in a position of being stupider than that.
The above was a clip from my recent podcast with Theo Jaffee:
I recommend the full podcast for an overview of my thoughts about why AI’s capabilities will soon be much higher than humanity’s.
There's a version of the story of the rooster and the eagle that begins with a breathless and panegyric-adjacent description of the rooster, from the perspective of the other barnyard animals (or maybe just the hens). I've forgotten who wrote it, and I can't find it, but I recall how it describes his flight: "up, up, up, until it seemed he would pierce the very vault of heaven, coming at last to rest upon the very pinnacle of the barn!" Certainly it's a good metaphor, and if Aesop could have been apprised of AGI he'd probably have thought it appropriately used here.
As for the rest (and the podcast), I am a layman in these matters but flatter myself that I understand Yudkowsky's arguments enough to agree with all of them. Where all of you lose me, though, is with the assumption that we are in a real sense drawing near to creating intelligence. Humans really aren't that bright, and already our brains are absurdly complex in ways we can't model and don't understand. It's obvious that we're skulking along at the bottom of the concept-space we call 'intelligence', but I don't see any evidence that creating something truly smarter than us lies within our grasp -- especially not within the timelines we have while running on all our present systems*.
* I mean this in whatever way one might choose: in the Hansonian sense that we're not producing enough smart people right now to keep up with needed innovations, or because of hard resource limits (topsoil, crude oil, easily-extracted metals), social collapse, 'the woke mind virus', good-times-create-weak-men, whatever. I'm in general a 'doomer' in the sense that I don't think we'll ever get to AGI**, but if we could, it would kill us all absolutely for sure (p = 99.5%).
** Please tell your kids I said so. If in 2080 they haven't been killed in bread riots or race riots or plagues or civil war, maybe they'll remember that once upon a time smart, serious-minded men thought we might be killed by too much progress.