27. november 2012

A Note on Common Minds

Contribution by Alex Provan

In the 1980s and ’90s, as the standard model of physics gathered dust and the field moved from front-page philosophy into the murk of mathematics, neuroscience emerged as the discipline most likely to demystify the natural world—or at least one particularly interesting element of it, ourselves. Modern physics had seemingly gone from promising a holistic explanation for why things are the way they are to suggesting that the universe is fundamentally inscrutable, its mysteries multiplying with each discovery. Neuroscience, meanwhile, turned inward—mirroring the putative narcissism of the era—and strived to reveal the immaculate mechanics lodged within our skulls. This nascent discipline cannibalized psychology, linguistics, and sociology, planting flags in far-flung realms of academia, medicine, and law. Today neuroscience is a central prism through which nearly all aspects of life are regarded.

With the popularization of functional MRI (fMRI) technology, the workings of the brain began to be broadcast in color, with cartoonish images of neural activity accompanying newspaper and television reports on the efforts of researchers to explain our most intimate drives, locate our souls, read our minds. Though neuroscience has provided a preeminent narrative of the twenty-first-century self—look to the nonfiction best seller lists for evidence of its reach—the discipline is still in its infancy and its promises remain largely unfulfilled; the workings of the brain are, of course, far from wholly comprehended by researchers, much less the public. We now know more about diseases and disorders like epilepsy, aphasia, Alzheimer’s, and Tourette’s syndrome than ever before, but none have been cured. We have an inkling of the cognitive processes that beget consciousness, which is both a product and a central feature of the brain; but philosophers and neuroscientists alike still have trouble defining, much less locating, the phenomenon and explaining the emergence of subjectivity—if not the more mundane matter of self-awareness. (Even if we could define consciousness, we might not be able to understand it except at the level of the neuron’s cytoskeleton, through quantum mechanics—a physics that is itself incomplete. Or our brains might be unconsciously following a program so complex that we could never grasp the entire thing at once. If so, one could theoretically simulate cognition, or at least the underlying processes, though not without an excellent quantum-mechanical computer.) We have not found the brain’s “God spot” or “buy button.”

“Common Minds,” a series of essays and conversations on the contemporary infatuation with the brain, will address the limits of neuroscience and how the knowledge produced by researchers and clinicians operates in other realms of culture and society. “Common Minds” was coedited by Dawn Chan and is being published in issues 17 and 18 of Triple Canopy. The series aims to facilitate discussion about ideas that are too rarely scrutinized outside of a specialized setting, despite their sweeping effect. Recent criticism in magazines and academic journals has justifiably deflated—and perhaps tempered public enthusiasm for—the tumid, reductive claims of pop-science scribes. But such venues tend to be dominated by professional journalists and scientists debunking other professional journalists and scientists. Departing from the conventions of this discourse, Triple Canopy has invited artists, poets, novelists, philosophers, historians, and psychologists to contribute analytic essays, linguistic compendia, and prose poems; a video rumination on blindness and perception; a collage-survey of the persistent vocabulary of phrenology; and a two-thousand-year genealogy of images of the brain.

In preparing this series, we considered how psychoanalysis became a prevailing cultural narrative in the 1960s and ’70s and how profoundly it altered the way we construct our own experiences and selves. The standard model may have imparted an all-encompassing theory of the natural world, but to most people that theory is incomprehensible. And those who do understand know that the failure to reconcile the general theory of relativity and quantum theory—the so-called macro and micro worlds—keeps the model incomplete. As such, the standard model speaks to ontology only indirectly. Psychoanalytic theory proffered a softer science, a framework for interpreting—if not resolutely explaining—our thoughts and behaviors, which could be elaborated as a worldview. Even if many who invoked Freud had little purchase on psychoanalysis beyond the rudiments, they could assimilate that worldview (which was confirmed in their own daily lives).

Neuroscience awkwardly straddles these two bodies of knowledge, in terms of the potential to link visceral experience to vanguard science in the construction of a more or less complete picture of the world—inasmuch as the world is a construct of our own cognitive faculties (and therefore a projection of biology). And yet generally, especially in nonclinical settings, what neuroscience has to offer is interpretation, not explanation. And too often the two are confused.

In March 2008, the Journal of Cognitive Neuroscience published an article called “The Seductive Allure of Neuroscience Explanations,” which grappled with the question of why people are much more likely to believe a statement when it includes a neuroscience gloss, and why they fail to recognize when that information is illogical or irrelevant. The researchers asked subjects to evaluate various descriptions of psychological phenomena, both inaccurate and accurate, with and without perfunctory references to neural circuitry. For instance, subjects were told that “the curse of knowledge”—the tendency to assume other people know what you do and are ignorant of what you don’t know—arises from our innate inability to judge others and that brain scans reveal the curse “happens because of the frontal lobe brain circuitry known to be involved in self-knowledge.” Subjects found that such bilge made circular arguments “significantly more satisfying.”

To most people, neuroscience supplies a sheen of novelty and credibility to propositions that might otherwise appear unoriginal or even outlandish. Among the beneficiaries of the public’s susceptibility to science-inflected language and imagery are neuromarketing companies wielding brain scans to help clients sell everything from politicians to Super Bowl ads to social media campaigns. For example, MindSign Neuromarketing’s “neurocinema” service will “take your trailer or spot and show you what parts or scenes cause activation (good) and what parts cause deactivation (bad).” Many in the humanities have responded by insinuating neuroscience into their work—the so-called cognitive turn, which has given rise to disciplines such as neuroeconomics, neurotheology, neuroaesthetics, and neuroethics.

For the most part, neuroscience has given us an empirical basis for assertions made decades or centuries ago by linguists, psychologists, philosophers, and ophthalmologists. And yet one effect of the rapid rise of neuroscience is the comparative diminution (and, coincidentally, defunding) of the very branches of science and the humanities upon which the discipline draws. The result is a creeping assumption that the brain makes the mind. “Although we do not know how, it is widely accepted that a complete neural explanation is, in principle, possible,” writes psychologist William Uttal in Mind and Brain: A Critical Appraisal of Cognitive Neuroscience (2011). “Those who labor in the laboratory rarely make this monistic assumption explicit, and yet few cognitive neuroscientists would challenge this fundamental idea.” The guiding principle of much contemporary neuroscience is “without any compelling empirical foundation.”

The work in “Common Minds” aims to disclose and debunk, but more orthogonally than polemically, by charting the origins of the brain-as-computer metaphor, considering fMRI scans as part of the history of photography, evaluating the claims of neuroscience alongside the observations of seventeenth-century Japanese anatomical illustrators, and so on. To us, this work is itself a strike against biological reductionism. There have been many enthralling, if provisional, discoveries regarding the unconscious processes underlying our behavior, the adaptive and plastic nature of the brain, and the relationship between vision and cognition. But what, besides consciousness, prompts such efforts, impels us to engineer experiments, parse data, invent theories? The core of who we are and why we do what we do—that union of memory, consciousness, and the body that marks us as human, grounding our various representations of the world in what Kant called a “single common subject”—remains obscure.

“Common Minds” questions the expectation of revelation, especially as advanced by popular neuroscience books that mostly work to inspire awe by assigning biological explanations to every aspect of our behavior. The brain “is the most wondrous thing we have discovered in the universe, and it is us,” writes David Eagleman in Incognito: The Secret Lives of the Brain (2011). Eagleman’s book is wrenchingly typical for making unconscious mechanics responsible for the preponderance of what we do and say and for obscuring certain material facts that suggest otherwise. For instance, Eagleman uses Ulysses to analyze the housing bubble: By having himself bound to the mast in anticipation of his ship passing the Sirens, Ulysses circumvented the same “instant-gratification circuits” that produced the subprime-mortgage crisis. But this account neglects predatory lending practices and the failure of government regulation—anything besides the cognitive processes of homeowners.

Rather than trouble what neuroscientist V. S. Ramachandran calls our “cognitive state of grace” in relation to other species, such books tend to strip away the construction of subjective experience from objective reality in order to reveal the gears at work and marvel at the mechanics. The conscious mind becomes an evolutionary product of these unconscious operations; it is dismissed as an incidental aspect of the brain, a tool for achieving certain rarefied tasks. (Aside from being the basis of higher-level thinking, consciousness is the crux of the liberal democratic project; the terminal point of its dismissal might be the erosion of the free, autonomous political subject in favor of a biologically constituted self, which has dramatic implications for the exercise of power and the production of social life.) Authors are expected to edify initiates while catering to their relative ignorance: to acquaint them with the basics of brain functioning without muddling the material with too much neuroanatomy. Often this means pitching a ream of case studies and summaries of academic papers as a “journey,” “search,” or “quest.” At the end of the road there tends to be either a solution to everyday problems—leadership, love, sales, addiction, productivity—or a sense of pure wonder. Dwelling on the magic of cognition, while insisting on its biological basis, returns the brain to a fantastical realm just as it is being demystified.

The quest for a purely biological account of consciousness seems, paradoxically, to point to something about our cognitive configuration that eludes such an understanding. Neuroscience has illuminated the brain’s constant efforts to make sense of the world, to form an astonishingly coherent narrative out of a welter of information. Our eyes capture fragments of objects in mere moments—fleeting records of passing cars, gathering clouds, familiar faces—which are strung together by the brain, giving the impression of a continuous visual flow. Cognition is enabled by regular communication between different regions of the brain, a kind of discourse between neuronal factions that, over a lifetime, actually restructures the brain, in what neuroscientist J. P. Changeux has labeled “the Darwinism of the synapses.” Furthermore, the brain develops a model of the outside world that enables it to anticipate rather than merely respond to sensory data. We are, in effect, always constructing the setting of our own story just before it happens. Somehow this seemingly effortless assembly of memories and impressions inspires the vague but tenacious belief in the presence within us of something that exceeds our circuitry, a more or less authentic self. And this self seems to be as real, if not more real, than anything else in the world.

24. november 2012

Neuroscience: Under Attack


THIS fall, science writers have made sport of yet another instance of bad neuroscience. The culprit this time is Naomi Wolf; her new book, “Vagina,” has been roundly drubbed for misrepresenting the brain and neurochemicals like dopamine and oxytocin.

Earlier in the year, Chris Mooney raised similar ire with the book “The Republican Brain,” which claims that Republicans are genetically different from — and, many readers deduced, lesser than — Democrats. “If Mooney’s argument sounds familiar to you, it should,” scoffed two science writers. “It’s called ‘eugenics,’ and it was based on the belief that some humans are genetically inferior.”

Sharp words from disapproving science writers are but the tip of the hippocampus: today’s pop neuroscience, coarsened for mass audiences, is under a much larger attack.

Meet the “neuro doubters.” The neuro doubter may like neuroscience but does not like what he or she considers its bastardization by glib, sometimes ill-informed, popularizers.

A gaggle of energetic and amusing, mostly anonymous, neuroscience bloggers — including Neurocritic, Neuroskeptic, Neurobonkers and Mind Hacks — now regularly point out the lapses and folly contained in mainstream neuroscientific discourse. This group, for example, slammed a recent Newsweek article in which a neurosurgeon claimed to have discovered that “heaven is real” after his cortex “shut down.” Such journalism, these critics contend, is “shoddy,” nothing more than “simplified pop.” Publications from The Guardian to the New Statesman have also run pieces blasting popular neuroscience-dependent writers like Jonah Lehrer and Malcolm Gladwell. The Oxford neuropsychologist Dorothy Bishop’s scolding lecture on the science of bad neuroscience was an online sensation last summer.

As a journalist and cultural critic, I applaud the backlash against what is sometimes called brain porn, which raises important questions about this reductionist, sloppy thinking and our willingness to accept seemingly neuroscientific explanations for, well, nearly everything.

Voting Republican? Oh, that’s brain chemistry. Success on the job? Fortuitous neurochemistry! Neuroscience has joined company with other totalizing worldviews — Marxism, Freudianism, critical theory — that have fallen victim to overuse and misapplication.

A team of British scientists recently analyzed nearly 3,000 neuroscientific articles published in the British press between 2000 and 2010 and found that the media regularly distorts and embellishes the findings of scientific studies. Writing in the journal Neuron, the researchers concluded that “logically irrelevant neuroscience information imbues an argument with authoritative, scientific credibility.” Another way of saying this is that bogus science gives vague, undisciplined thinking the look of seriousness and truth.

The problem isn’t solely that self-appointed scientists often jump to faulty conclusions about neuroscience. It’s also that they are part of a larger cultural tendency, in which neuroscientific explanations eclipse historical, political, economic, literary and journalistic interpretations of experience. A number of the neuro doubters are also humanities scholars who question the way that neuroscience has seeped into their disciplines, creating phenomena like neuro law, which, in part, uses the evidence of damaged brains as the basis for legal defense of people accused of heinous crimes, or neuroaesthetics, a trendy blend of art history and neuroscience.

It’s not hard to understand why neuroscience is so appealing. We all seek shortcuts to enlightenment. It’s reassuring to believe that brain images and machine analysis will reveal the fundamental truth about our minds and their contents. But as the neuro doubters make plain, we may be asking too much of neuroscience, expecting that its explanations will be definitive. Yet it’s hard to imagine that any functional magnetic resonance imaging or chemical map will ever explain “The Golden Bowl” or heaven. Or that brain imaging, no matter how sophisticated and precise, will ever tell us what women really want.

7. november 2012

The Art of Positive Skepticism


In the late 1500s, everyone believed Aristotle’s claim that heavy objects fell faster than light ones. That is, everyone except Galileo. To test Aristotle’s claim, Galileo dropped two balls of differing weights from the Leaning Tower of Pisa. And guess what? They both hit the ground at the same time! For challenging Aristotle’s authority, Galileo was fired from his job. But he earned his place in history by showing us that testing claims, rather than deferring to authority, should be the arbiter of truth.

Fast forward to modern times. Challenging commonly held assumptions about computers and human behavior, Steve Jobs lost his job with Apple in 1985. Returning 12 years later, he changed the way people use technology by testing the truth of other people’s claims. As a result, history considers Jobs one of the most innovative minds of the 21st century.

Galileo and Jobs were skeptics. They had developed habits of thinking that challenged what appeared to be reliable facts. They understood that testing assumptions, rather than deferring to human authority, leads to greater understanding, innovation, and creativity.

It’s easy to confuse being a skeptic with being a cynic. So let’s define the terms.

Cynics distrust most information they see or hear, particularly when it challenges their own belief systems. Most often, cynics hold views that cannot be changed by contrary evidence. Thus, they often become intolerant of other people’s ideas. It’s not difficult to find cynics everywhere in our society, from the halls of Congress to our own family dinner tables. People who are driven by inflexible beliefs rarely think like Galileo or Jobs.

Skepticism, on the other hand, is a key part of critical thinking – a goal of education. The term skeptic is derived from the Greek skeptikos, meaning “to inquire” or “look around.” Skeptics require additional evidence before accepting someone’s claims as true. They are willing to challenge the status quo with open-minded, deep questioning of authority.

In today’s complex world, skeptics and cynics are often hard to differentiate. While the ability to challenge human authority has led to important innovation and reform, it has also made it possible, for a price, to prove our “rightness.” Oftentimes, what appear to be legitimate studies are manipulated to support a particular idea or outcome that a company, individual, or government believes is the truth.

And herein lies the dilemma of our modern-day quest for certainty. When we can no longer be objective “inquirers” because we have already decided the truth, then we create a culture of cynicism instead of skepticism. Is this the kind of world we want for ourselves and our children?

If we modeled skepticism instead of cynicism, our children would inherit a world less dependent on power and authority and more dependent on critical thinking and good judgment. Adolescents and young adults would be capable of questioning the reliability of what they think or hear. They would learn to believe in their natural abilities to facilitate positive change through intellectual inquiry. They would become discerning consumers of ideas rather than passive accepters of other people’s visions of certainty.

How we adults model the art of positive skepticism not only helps us make better informed decisions but also shows our children how to think for themselves. And, if kids learn to think for themselves, they learn to believe in themselves!

Five Ways to Model Positive Skepticism



Be a Deception-Detector

People constantly make claims that affect our daily lives. From those selling products and services to candidates running for political office, we are barraged with claims that require us to decide and act. Thomas Kida, in his book Don’t Believe Everything You Think, shows how easily we can be fooled and why we should learn to think like a scientist.

Challenge claims by asking for evidence. Ask questions like, “What makes you think this way?” “What assumptions have you based your claim upon?” “What facts or research support your ideas?” “Are there facts or studies that dispute your claim?”

Doubt

Constant streams of commercial messages, TV news, and campaign ads try to tell us how to think. When we allow others to think for us, we become vulnerable to indoctrination, propaganda, and powerful emotional appeals. In her book Descartes’s Method of Doubt, Janet Broughton examines the important role that doubt plays in our quest for truth.

Recognize the limits to anyone’s claims of truth! Look below the surface rather than accepting ideas at face value. Ask yourself questions like, “What is the logic of this argument?” Listen to yourself when something doesn’t feel right!

Play Devil’s Advocate

Part of being a good skeptic is learning to play a devil’s advocate role. Take a position you don’t necessarily agree with, just for the sake of argument. This doesn’t have to be combative. You can simply say, “In order to understand this idea better, let me play the devil’s advocate.” Putting your mind to work poking holes in what you think might be a good idea can lead to greater understanding of a problem. Playing devil’s advocate is a great way to teach children how to see another person’s perspective.

Use Logic and Intuition

We are persuaded to doubt or believe other people’s claims through logic and intuition, and most of us tend to rely heavily on one type of thinking or the other. Whether you are a logical or intuitive thinker, it’s helpful to alternate between these two qualities of mind. In his book, Embracing Contraries, Peter Elbow says, “Doubting and believing are among the most powerful root acts we can perform with our minds.” We become better thinkers when we deploy doubting and believing more consciously through the use of logic and intuition rather than by chance.

Be a Bias-Detector

One of the most important tasks of a true skeptic is to determine whether sources of information and analysis are impartial. This is a trait that serves us well when we turn on the television. If we only listen to one channel, or our favorite news commentator, we’ll likely be persuaded by biased or emotional appeals. Ask yourself, “What’s the other side of this story?” “Is this one person’s story or does it apply to thousands of people?” “Is there an underlying belief or assumption being made that reflects this reporter’s ideology?”

R. M. Dawes points out in his book Everyday Irrationality: How Pseudo-Scientists, Lunatics, and the Rest of Us Systematically Fail to Think Rationally that emotional appeals and story-based thinking often lead to faulty reasoning. The point in detecting bias is to be able to identify messages that are intended to persuade rather than inform us.

Positive skepticism leads to better problem-solving, innovation, and creativity! It also helps develop our abilities to think critically about the world around us! Do you agree? Feel free to poke some holes in my thinking!

By Marilyn Price-Mitchell