HeadlinesBriefing.com

Abstraction Fallacy: AI Cannot Instill Consciousness

Hacker News

Artificial‑intelligence researchers have long debated whether a machine can truly feel. A new framework, dubbed the Abstraction Fallacy, argues that subjective experience depends on physical constitution, not symbolic syntax, challenging the prevailing computational functionalism that treats cognition as mere causal topology. By likening symbol manipulation to a mapmaker's description of territory, the authors separate mind from machine.

The authors trace abstraction's causal roots and argue that symbolic computation is a description of a physical system, not a physical process in its own right. Treating an active, experiencing agent as a computer requires alphabetizing its continuous physics into discrete states. This requirement means that algorithmic symbol manipulation lacks the content causality needed for consciousness. Consequently, a purely software‑based system can never instantiate subjective feeling.

By drawing a clear ontological boundary between simulation (behavioral mimicry driven by vehicle causality) and instantiation (physical constitution driven by content causality), the study offers a physically grounded refutation of computational functionalism. On this view, even a fully realized AI would owe any consciousness solely to its material makeup, not its syntactic architecture. The result narrows the debate and informs AI welfare policy.