Contribution by Alex Provan
In the 1980s and ’90s, as the standard model of physics gathered dust and the field moved from front-page philosophy into the murk of mathematics, neuroscience emerged as the discipline most likely to demystify the natural world—or at least one particularly interesting element of it, ourselves. Modern physics had seemingly gone from promising a holistic explanation for why things are the way they are to suggesting that the universe is fundamentally inscrutable, its mysteries multiplying with each discovery. Neuroscience, meanwhile, turned inward—mirroring the putative narcissism of the era—and strived to reveal the immaculate mechanics lodged within our skulls. This nascent discipline cannibalized psychology, linguistics, and sociology, planting flags in far-flung realms of academia, medicine, and law. Today neuroscience is a central prism through which nearly all aspects of life are regarded.
With the popularization of functional MRI (fMRI) technology, the workings of the brain began to be broadcast in color, with cartoonish images of neural activity accompanying newspaper and television reports on the efforts of researchers to explain our most intimate drives, locate our souls, read our minds. Though neuroscience has provided a preeminent narrative of the twenty-first-century self (look to the nonfiction best-seller lists for evidence of its reach), the discipline is still in its infancy and its promises remain largely unfulfilled; the workings of the brain are, of course, far from wholly comprehended by researchers, much less the public. We now know more about diseases and disorders like epilepsy, aphasia, Alzheimer’s, and Tourette’s syndrome than ever before, but none has been cured. We have an inkling of the cognitive processes that beget consciousness, which is both a product and a central feature of the brain; but philosophers and neuroscientists alike still have trouble defining, much less locating, the phenomenon and explaining the emergence of subjectivity, if not the more mundane matter of self-awareness. (Even if we could define consciousness, we might not be able to understand it except at the level of the neuron’s cytoskeleton, through quantum mechanics, for which the physics is incomplete. Or our brains might be unconsciously following a program so complex that we could never grasp the entire thing at once. If so, one could theoretically simulate cognition, or at least the underlying processes, though not without an excellent quantum computer.) We have not found the brain’s “God spot” or “buy button.”
“Common Minds,” a series of essays and conversations on the contemporary infatuation with the brain, will address the limits of neuroscience and how the knowledge produced by researchers and clinicians operates in other realms of culture and society. “Common Minds” was coedited by Dawn Chan and is being published in issues 17 and 18 of Triple Canopy. The series aims to facilitate discussion about ideas that are too rarely scrutinized outside of a specialized setting, despite their sweeping effect. Recent criticism in magazines and academic journals has justifiably deflated, and perhaps tempered public enthusiasm for, the tumid, reductive claims of pop-science scribes. But such venues tend to be dominated by professional journalists and scientists debunking other professional journalists and scientists. Departing from the conventions of this discourse, Triple Canopy has invited artists, poets, novelists, philosophers, historians, and psychologists to contribute analytic essays, linguistic compendia, and prose poems, as well as a video rumination on blindness and perception, a collage-survey of the persistent vocabulary of phrenology, and a two-thousand-year genealogy of images of the brain.
In preparing this series, we considered how psychoanalysis became a prevailing cultural narrative in the 1960s and ’70s and how profoundly it altered the way we construct our own experiences and selves. The standard model may have imparted an all-encompassing theory of the natural world, but to most people that theory is incomprehensible. And those who do understand know that the failure to reconcile the general theory of relativity and quantum theory (the so-called macro and micro worlds) keeps the model incomplete. As such, the standard model speaks to ontology only indirectly. Psychoanalytic theory proffered a softer science, a framework for interpreting, if not resolutely explaining, our thoughts and behaviors, which could be elaborated as a worldview. Even if many who invoked Freud had little purchase on psychoanalysis beyond the rudiments, they could assimilate that worldview (which was confirmed in their own daily lives).
Neuroscience awkwardly straddles these two bodies of knowledge, in terms of the potential to link visceral experience to vanguard science in the construction of a more or less complete picture of the world—inasmuch as the world is a construct of our own cognitive faculties (and therefore a projection of biology). And yet generally, especially in nonclinical settings, what neuroscience has to offer is interpretation, not explanation. And too often the two are confused.
In March 2008, the Journal of Cognitive Neuroscience published an article called “The Seductive Allure of Neuroscience Explanations,” which grappled with the question of why people are much more likely to believe a statement when it includes a neuroscience gloss, and why they fail to recognize when that information is illogical or irrelevant. The researchers asked subjects to evaluate various descriptions of psychological phenomena, both inaccurate and accurate, with and without perfunctory references to neural circuitry. For instance, they were told that “the curse of knowledge” (the tendency to assume other people know what you do and are ignorant of what you don’t know) arises from our innate inability to judge others and that brain scans reveal the curse “happens because of the frontal lobe brain circuitry known to be involved in self-knowledge.” Subjects found that such bilge made circular arguments “significantly more satisfying.”
To most people, neuroscience supplies a sheen of novelty and credibility to propositions that might otherwise appear unoriginal or even outlandish. Among the beneficiaries of the public’s susceptibility to science-inflected language and imagery are neuromarketing companies wielding brain scans to help clients sell everything from politicians to Super Bowl ads to social media campaigns. For example, MindSign Neuromarketing’s “neurocinema” service will “take your trailer or spot and show you what parts or scenes cause activation (good) and what parts cause deactivation (bad).” Many in the humanities have responded by insinuating neuroscience into their work: the so-called cognitive turn, which has given rise to disciplines such as neuroeconomics, neurotheology, neuroaesthetics, and neuroethics.
For the most part, neuroscience has given us an empirical basis for assertions made decades or centuries ago by linguists, psychologists, philosophers, and ophthalmologists. And yet one effect of the rapid rise of neuroscience is the comparative diminution (and, coincidentally, defunding) of the very branches of science and the humanities upon which the discipline draws. The result is a creeping assumption that the brain makes the mind. “Although we do not know how, it is widely accepted that a complete neural explanation is, in principle, possible,” writes psychologist William Uttal in Mind and Brain: A Critical Appraisal of Cognitive Neuroscience (2011). “Those who labor in the laboratory rarely make this monistic assumption explicit, and yet few cognitive neuroscientists would challenge this fundamental idea.” The guiding principle of much contemporary neuroscience, he concludes, is “without any compelling empirical foundation.”
The work in “Common Minds” aims to disclose and debunk, but more orthogonally than polemically, by charting the origins of the brain-as-computer metaphor, considering fMRI scans as part of the history of photography, evaluating the claims of neuroscience alongside the observations of seventeenth-century Japanese anatomical illustrators, and so on. To us, this work is itself a strike against biological reductionism. There have been many enthralling, if provisional, discoveries regarding the unconscious processes underlying our behavior, the adaptive and plastic nature of the brain, and the relationship between vision and cognition. But what, besides consciousness, prompts such efforts, impels us to engineer experiments, parse data, invent theories? The core of who we are and why we do what we do—that union of memory, consciousness, and the body that marks us as human, grounding our various representations of the world in what Kant called a “single common subject”—remains obscure.
“Common Minds” questions the expectation of revelation, especially as advanced by popular neuroscience books that mostly work to inspire awe by assigning biological explanations to every aspect of our behavior. The brain “is the most wondrous thing we have discovered in the universe, and it is us,” writes David Eagleman in Incognito: The Secret Lives of the Brain (2011). Eagleman’s book is wrenchingly typical for making unconscious mechanics responsible for the preponderance of what we do and say and for obscuring certain material facts that suggest otherwise. For instance, Eagleman uses Ulysses to analyze the housing bubble: By having himself bound to the mast in anticipation of his ship passing the Sirens, Ulysses short-circuited the same “instant-gratification circuits” that produced the subprime-mortgage crisis. But this account neglects predatory lending practices and the failure of government regulation, indeed anything besides the cognitive processes of homeowners.
Rather than trouble what neuroscientist V. S. Ramachandran calls our “cognitive state of grace” in relation to other species, such books tend to strip away the construction of subjective experience from objective reality in order to reveal the gears at work and marvel at the mechanics. The conscious mind becomes an evolutionary product of these unconscious operations; it is dismissed as an incidental aspect of the brain, a tool for achieving certain rarefied tasks. (Aside from being the basis of higher-level thinking, consciousness is the crux of the liberal democratic project. The terminal point of such dismissals might be the erosion of the free, autonomous political subject in favor of a biologically constituted self, with dramatic implications for the exercise of power and the production of social life.) Authors are expected to edify initiates while catering to their relative ignorance, to acquaint them with the basics of brain functioning without muddling the material with too much neuroanatomy. Often this means pitching a ream of case studies and summaries of academic papers as a “journey,” “search,” or “quest.” At the end of the road there tends to be either a solution to everyday problems (leadership, love, sales, addiction, productivity) or a sense of pure wonder. Dwelling on the magic of cognition, while insisting on its biological basis, returns the brain to a fantastical realm just as it is being demystified.
The quest for a purely biological account of consciousness seems, paradoxically, to point to something about our cognitive configuration that eludes such an understanding. Neuroscience has illuminated the brain’s constant efforts to make sense of the world, to form an astonishingly coherent narrative out of a welter of information. Our eyes capture fragments of objects in mere moments (fleeting records of passing cars, gathering clouds, familiar faces), which are strung together by the brain, giving the impression of a continuous visual flow. Cognition is enabled by regular communication between different regions of the brain, a kind of discourse between neuronal factions that, over a lifetime, actually restructures the brain, in what neuroscientist Jean-Pierre Changeux has labeled “the Darwinism of the synapses.” Furthermore, the brain develops a model of the outside world that enables it to anticipate rather than merely respond to sensory data. We are, in effect, always constructing the setting of our own story just before it happens. Somehow this seemingly effortless assembly of memories and impressions inspires the vague but tenacious belief in the presence within us of something that exceeds our circuitry, a more or less authentic self. And this self seems as real as, if not more real than, anything else in the world.