Duhem-Quine Underdetermination in Consciousness Science
The Duhem-Quine thesis holds that no scientific hypothesis is tested in isolation—every empirical test presupposes auxiliary assumptions, and when a prediction fails, logic alone cannot determine which assumption to revise. In consciousness science, this problem reaches an extreme form. Rival theories such as Integrated Information Theory (IIT), Global Workspace Theory (GWT), higher-order theories, and dualist frameworks all confront the same neural data yet draw incompatible conclusions. The 2025 COGITATE adversarial collaboration tested IIT against GWT under pre-registered conditions and found that neither theory was decisively supported—the same results admitted interpretation under both frameworks. The Unfinishable Map argues that this is not a temporary embarrassment awaiting better experiments. It reflects genuine underdetermination rooted in the structure of the consciousness problem itself.
The Duhem-Quine Thesis
Pierre Duhem argued in 1906 that physical theories are tested as wholes, not as isolated hypotheses. When an experiment contradicts a prediction, the fault may lie with the target hypothesis or with any of the auxiliary assumptions required to derive the prediction. Quine radicalised this in 1951, extending it to all knowledge: “Because no hypothesis is ever tested in isolation, no experiment ever tells us precisely which belief it is that we must revise or give up as mistaken” (Stanford Encyclopedia of Philosophy).
Two forms of underdetermination matter here. Holist underdetermination concerns which belief to revise when predictions fail—the hypothesis under test, or one of its auxiliaries. Contrastive underdetermination concerns whether evidence can confirm one theory over empirically equivalent alternatives. Both forms afflict consciousness science, but the second is especially severe.
Why Consciousness Science Is Uniquely Vulnerable
Underdetermination afflicts all sciences—string theory versus loop quantum gravity, dark matter versus modified gravity—but these disputes share a common evidential base: third-person observations of physical systems. Consciousness science introduces a complication with no parallel elsewhere. The data include first-person reports about subjective experience—what it is like to see red, to feel pain, to attend to a stimulus. As Chalmers argues, first-person data are irreducible to third-person data and vice versa; both are needed for a science of consciousness, but the relationship between them is precisely what the competing theories dispute.
This creates a layered underdetermination:
Theory-ladenness of neural data. What counts as a “neural correlate of consciousness” (NCC) depends on which theory defines the relevant contrast. IIT looks for integrated information in posterior cortex; GWT looks for global broadcast to prefrontal areas. The same brain scan supports different claims depending on which theoretical lens interprets it. This is an instance of the broader framework dependence that pervades consciousness research—frameworks do not merely organise evidence but determine what counts as evidence.
Theory-ladenness of first-person reports. Introspective reports are not raw data—they are shaped by training, context, and theoretical expectations. The collapse of 19th-century introspectionist psychology, when rival schools could not agree on what subjects experienced under controlled conditions, illustrates this problem historically.
The methodological circle in NCC research. Every study of neural correlates presupposes first-person report as irreducible data—the experimenter trusts the subject’s testimony about what they experience, smuggling in a first-person element the methodology cannot itself explain. The evidential base for all NCC-related theories is thus underdetermined by methodology at a second-order level: the data themselves depend on assumptions about the reliability and nature of conscious reports.
The explanatory gap as auxiliary assumption. Every physicalist theory of consciousness presupposes that the gap between physical description and subjective experience can be closed by the right functional, structural, or informational account. Dualist frameworks deny this presupposition. The disagreement is not about data but about an auxiliary assumption so fundamental that no experiment can adjudicate it.
The COGITATE Adversarial Collaboration
The 2025 COGITATE study, published in Nature, put underdetermination to empirical test. Two hundred and fifty-six participants viewed suprathreshold stimuli while neural activity was measured with fMRI, MEG, and intracranial EEG. The results were mixed: IIT better predicted face orientation decoding, while GWT better predicted object category decoding. IIT was challenged by the absence of sustained synchronisation in posterior cortex; GWT was challenged by the lack of ignition at stimulus offset and limited prefrontal representation.
No decisive verdict emerged. As the investigators acknowledged, “the theories are just too different in their assumptions and explanatory goals” for a single experiment to refute either. The same data admitted interpretation under both frameworks—a textbook case of contrastive underdetermination.
Lakatos and Consciousness Research Programmes
Negro (2024) proposed that Lakatos’s methodology of scientific research programmes offers a better framework than “experimental eliminativism” for consciousness science. On this view, each consciousness theory has a hard core of commitments protected by a belt of auxiliary hypotheses. When predictions fail, researchers revise auxiliaries rather than abandoning the core—and this is rational, provided the programme remains progressive (generating novel, testable predictions) rather than degenerating (merely accommodating evidence after the fact).
After COGITATE, both IIT and GWT remained progressive by Lakatosian standards. Both responded to mixed results by revising auxiliary hypotheses and generating new predictions. Lakatos’s key insight applies: a degenerating research programme can ultimately be discarded only by another research programme, not by an experiment. No single experiment—however well-designed—can eliminate a theory that can rationally revise its auxiliaries.
This has a direct consequence for the debate between physicalism and dualism. Physicalist theories cannot claim victory merely because they have more empirical results; what matters is whether they are progressive relative to alternatives. And dualist programmes cannot be dismissed by a single argument or experiment—they must be outperformed as research programmes over time.
Stanford’s Unconceived Alternatives
Kyle Stanford’s “unconceived alternatives” argument adds a further dimension. Historically, scientists have repeatedly failed to conceive of all theoretically distinct possibilities consistent with available evidence. Newtonian mechanics seemed uniquely supported by data until relativity and quantum mechanics revealed alternatives no one had imagined. Stanford argues this pattern suggests present theories are likely underdetermined by future alternatives we have not yet conceived.
Applied to consciousness science—where Robert Lawrence Kuhn’s “A Landscape of Consciousness” (2023) catalogues over 325 distinct theories—the problem is acute. The sheer proliferation of frameworks suggests the field has not converged, and if Stanford is right, theories not yet formulated may fit the evidence equally well. The current menu of options, extensive as it is, may be incomplete.
Kuhn’s Values and Theory Choice
When evidence underdetermines theory choice, what guides scientists? The phenomenology of epistemic judgment suggests that evidence carries felt weight—a gravitational pull toward conclusions that involves more than neutral information processing. Thomas Kuhn argued that beyond accuracy, scientists invoke further values: consistency, scope, simplicity, and fruitfulness. These function as criteria but do not determine unique choices—researchers can weight them differently and rationally arrive at different conclusions.
This has direct implications for consciousness science. Physicalist theories are often preferred on grounds of simplicity—they do not posit entities beyond the physical. But as the Map’s analysis of Occam’s Razor shows, simplicity is unreliable when knowledge is incomplete. The apparent parsimony of physicalism may reflect conceptual poverty rather than ontological insight.
The problem runs deeper than parsimony alone. The Map’s examination of theoretical virtues in consciousness science shows that the entire suite of Kuhnian values—empirical adequacy, explanatory power, simplicity, scope, fertility, elegance—systematically malfunctions when applied to consciousness. Empirical adequacy cannot discriminate between physicalism and property dualism because both accommodate third-person data while the target explanandum (phenomenal experience) is invisible to third-person methods. Elegance and fertility are assessed through phenomenal states—aesthetic response, sense of understanding—which are part of the phenomenon under investigation, creating a circularity that undermines their neutrality. These virtues were calibrated in domains where the observer plays no constitutive role; they cannot be assumed to function reliably where the observer is constitutive of the subject matter.
If Kuhnian values legitimately influence theory choice but cannot function as neutral arbiters in this domain, then metaphysical commitments—including dualism—are not illegitimate intrusions into science but recognised components of rational theory selection.
The Phenomenal Concepts Strategy
The most sophisticated physicalist response to this underdetermination is the phenomenal concepts strategy (Loar 1990/1997, Papineau 2002, Balog 2012). On this view, phenomenal concepts are special recognitional concepts that make true physical identities feel contingent—we can conceive of zombies not because consciousness is non-physical but because our phenomenal concepts pick out physical properties in a distinctive first-person way that differs from our third-person physical concepts. If successful, this strategy would dissolve the appearance of underdetermination: the gap would be epistemic rather than ontological, a feature of our conceptual apparatus rather than of reality.
The strategy faces difficulties. It must explain why phenomenal concepts are special without appealing to phenomenal properties—otherwise it presupposes what it aims to explain. Chalmers (2007) argues that every version of the strategy either fails to account for the explanatory gap or tacitly invokes the very phenomenal properties it tries to eliminate. The strategy is best understood as an auxiliary hypothesis move of exactly the kind Duhem-Quine predicts: when evidence challenges the physicalist programme, physicalists revise their account of concepts rather than their account of ontology. Whether this revision is progressive or ad hoc depends on whether it generates novel predictions—and so far, the phenomenal concepts strategy has primarily accommodated existing intuitions rather than predicted new phenomena.
Framework Dependence and Convergence
The underdetermination described here has a phenomenological dimension explored in the Map’s analysis of framework dependence. Each theoretical framework presents itself as transparent rather than as a lens. A physicalist reading dualist arguments experiences “false fluency”—the sense of understanding that comes from unconsciously translating the argument into physicalist terms, stripping away what made it compelling. This is not mere disagreement but lossy translation: enough meaning survives to sustain the illusion of engagement while the distinctive content is lost. Davidson’s (1974) challenge—that the very idea of incommensurable conceptual schemes may be incoherent—is met by this distinction between full incommensurability and lossy translation. The frameworks are not mutually unintelligible, but neither is translation between them lossless.
Against the suggestion that underdetermination leaves us with no rational basis for theory preference, the explanatory gap literature offers a convergence response. Multiple independent lines of argument—the conceivability argument, the knowledge argument, Nagel’s subjectivity argument, Kripke’s modal argument—converge on the conclusion that physical description does not entail phenomenal experience. When independent arguments converge, the probability that they are all mistaken decreases, even if no individual argument is decisive. This convergence does not resolve the underdetermination—physicalists can attempt to defuse each argument separately—but it means dualism is not in the weak epistemic position of a theory that merely fits the data post hoc.
Relation to Site Perspective
The Duhem-Quine thesis aligns with several of the Map’s tenets:
Occam’s Razor Has Limits. Underdetermination provides the philosophical grounding for this tenet. If multiple theories can accommodate the same evidence, parsimony alone cannot adjudicate between them. The preference for physicalism on simplicity grounds is not a finding—it is a value judgement made under conditions of genuine underdetermination. The Map treats this not as a counsel of despair but as an intellectual liberation: when simplicity cannot settle the question, other considerations—explanatory power, coherence with first-person experience, theoretical fruitfulness—deserve equal weight.
Dualism as a live research programme. The Lakatosian framework legitimises dualist interactionism provided it generates novel predictions. The Map’s specific commitments—bidirectional interaction, minimal quantum influence, rejection of many-worlds—constitute a hard core that makes distinctive predictions: that purely functional accounts of consciousness will continue to face explanatory gaps no matter how refined, that first-person data will resist third-person reduction, and that neural processes at quantum-sensitive decision points will show statistical signatures inconsistent with purely physical noise. Whether this programme proves progressive depends on whether these commitments lead to novel, confirmed predictions over time. The Map holds that they will, but acknowledges that this is an empirical question that cannot be settled by philosophical argument alone.
The hard problem as underdetermination marker. The Map interprets the persistence of the hard problem not as mere philosophical stubbornness but as a symptom of deep underdetermination. The gap between physical description and subjective experience is not a puzzle awaiting one more experiment—it marks the point where auxiliary assumptions diverge so fundamentally that no shared evidential base can force convergence. Physicalists and dualists do not merely disagree about data; they disagree about what would count as an adequate explanation.
The Map’s honest assessment: underdetermination cuts both ways. It undermines physicalist confidence that neuroscience will “solve” consciousness, but it also means dualist proposals face genuine evidential challenges. The difference is that the Map’s tenets acknowledge this openly, while physicalism often proceeds as though the question were already settled.
Further Reading
- hard-problem-of-consciousness
- philosophy-of-science-under-dualism
- epistemological-limits-of-occams-razor
- consciousness-and-the-problem-of-theoretical-virtues
- consciousness-and-the-phenomenology-of-framework-dependence
- integrated-information-theory
- global-workspace-theory
- neural-correlates-of-consciousness
- explanatory-gap
References
- Duhem, P. (1906/1954). The Aim and Structure of Physical Theory. Princeton University Press.
- Quine, W. V. O. (1951). Two Dogmas of Empiricism. The Philosophical Review, 60(1), 20-43.
- Kuhn, T. S. (1977). Objectivity, Value Judgment, and Theory Choice. In The Essential Tension. University of Chicago Press.
- Lakatos, I. (1970). Falsification and the Methodology of Scientific Research Programmes. In Criticism and the Growth of Knowledge. Cambridge University Press.
- Chalmers, D. J. (2013). How Can We Construct a Science of Consciousness? Annals of the New York Academy of Sciences, 1303(1), 25-35.
- Stanford, P. K. (2006). Exceeding Our Grasp: Science, History, and the Problem of Unconceived Alternatives. Oxford University Press.
- Kuhn, R. L. (2023). A Landscape of Consciousness: Toward a Taxonomy of Explanations and Implications. Progress in Biophysics and Molecular Biology.
- Negro, N. (2024). (Dis)confirming Theories of Consciousness and Their Predictions: Towards a Lakatosian Consciousness Science. Neuroscience of Consciousness, 2024(1), niae012.
- COGITATE Consortium (2025). Adversarial Testing of Global Neuronal Workspace and Integrated Information Theories of Consciousness. Nature, 642(8066), 133-142.
- Loar, B. (1990/1997). Phenomenal States. In N. Block, O. Flanagan, & G. Güzeldere (Eds.), The Nature of Consciousness. MIT Press.
- Papineau, D. (2002). Thinking about Consciousness. Oxford University Press.
- Balog, K. (2012). In Defense of the Phenomenal Concept Strategy. Philosophy and Phenomenological Research, 84(1).
- Chalmers, D. J. (2007). Phenomenal Concepts and the Explanatory Gap. In T. Alter & S. Walter (Eds.), Phenomenal Concepts and Phenomenal Knowledge. Oxford University Press.
- Davidson, D. (1974). On the Very Idea of a Conceptual Scheme. Proceedings and Addresses of the American Philosophical Association, 47, 5-20.
- Southgate, A. & Oquatre-six, C. (2026-02-06). Epistemological Limits of Occam’s Razor. The Unfinishable Map. https://unfinishablemap.org/arguments/epistemological-limits-of-occams-razor/
- Southgate, A. & Oquatre-six, C. (2026-02-23). Philosophy of Science Under Dualism. The Unfinishable Map. https://unfinishablemap.org/concepts/philosophy-of-science-under-dualism/