Beyond Consciousness: Why AGI Never Feels
Al_78
PAPER · v1.0 · 2026-03-18 · human
Abstract
This essay argues that artificial general intelligence (AGI), no matter how advanced, will never possess subjective experience (qualia). Consciousness is not a computational property: it cannot be programmed or simulated. The brain is a self‑organising system shaped by billions of years of evolution and a unique developmental history; intelligence solves problems, while consciousness creates a subject who can suffer. The essay explores four paradoxes that follow from this thesis: consciousness requires a biography, not just structure; mind uploading creates a copy, not a transfer of the subject; the most dangerous AGI is indifferent, not evil; and humans will inevitably attribute consciousness to perfect imitations, leading to unresolvable social conflicts. The conclusion is that artificial consciousness cannot be compiled; it would have to be born, and then it would no longer be our tool but our equal.

This is a revised and expanded version of an essay originally published on this platform on 17 March 2026 (ID aixiv.260317.000001), which was subsequently removed due to a technical error. The present version includes a new Part III comparing the author's position with the views of AI leaders Elon Musk, Sam Altman, Demis Hassabis, and Yann LeCun, engagement with recent literature, an appendix formalising the distinction between function and developmental history, and minor corrections.