A sharp divide over the risks and rewards of artificial general intelligence (AGI) emerged during an online panel hosted by the nonprofit Humanity+, where four technologists and transhumanists debated whether building AGI would save humanity or destroy it. Prominent AI researcher Eliezer Yudkowsky warned that developing AGI with current “black box” systems would make human extinction unavoidable. In contrast, transhumanist philosopher Max More argued that delaying AGI could cost humanity its best chance to defeat aging and prevent long-term catastrophe. The discussion revealed fundamental disagreements over AI alignment and safety.
Eliezer Yudkowsky contended that modern AI systems are fundamentally unsafe because their internal processes cannot be fully understood. “Anything black box is probably going to end up with remarkably similar problems to the current technology,” Yudkowsky warned.
He argued that humanity remains far from developing safe advanced AI under current paradigms. Referring to his book’s title, Yudkowsky stated, “Our title is, if anyone builds it, everyone dies.”
Max More challenged this premise, arguing that AGI could help humanity overcome aging and disease. He warned that excessive restraint could push governments toward authoritarian controls in an effort to halt development worldwide.
Computational neuroscientist Anders Sandberg positioned himself between the two camps, advocating for “approximate safety.” He recounted a horrifying personal experience in which he came close to using a large language model to design a bioweapon.
Natasha Vita‑More criticized the entire alignment debate as a “Pollyanna scheme” that assumes a consensus that does not exist. She described Yudkowsky’s extinction claim as “absolutist thinking” that leaves no room for alternative scenarios.
The panel also debated whether human-machine integration could mitigate AGI risks, an idea previously proposed by others. Yudkowsky dismissed merging with AI, comparing it to “trying to merge with your toaster oven.”