April 30, 2012

The Irrationality of Irrationality: The Paradox of Popular Psychology

Contribution by Samuel McNerney

In 1996, Lyle Brenner, Derek Koehler and Amos Tversky conducted a study involving students from San Jose State University and Stanford University. The researchers were interested in how people jump to conclusions based on limited information. Previous work by Tversky, Daniel Kahneman and other psychologists found that people are “radically insensitive to both the quantity and quality of information that gives rise to impressions and intuitions,” so the researchers knew, of course, that we humans don’t do a particularly good job of weighing the pros and cons. But to what degree? Just how bad are we at assessing all the facts?

To find out, Brenner and his team exposed the students to legal scenarios. In one, a plaintiff named Mr. Thompson visits a drug store for a routine union visit. The store manager informs him that, according to the union contract with the drug store, union representatives cannot speak with employees on the floor. After a brief deliberation, the manager calls the police and Mr. Thompson is handcuffed for trespassing. The charges were later dropped, but Mr. Thompson is now suing the store for false arrest.

All participants got this background information. Then, they heard from one of the two sides’ lawyers; the lawyer for the union organizer framed the arrest as an attempt to intimidate, while the lawyer for the store argued that the conversation that took place in the store was disruptive. Another group of participants – essentially a mock jury – heard both sides.

The key part of the experiment was that the participants were fully aware of the setup; they knew whether they were hearing only one side or the entire story. But this didn’t stop the subjects who heard one-sided evidence from being more confident and biased in their judgments than those who heard both sides. That is, even though people knew they were getting an incomplete account, they jumped to conclusions after hearing only one side of the story.

The good news is that Brenner, Koehler and Tversky found that simply prompting participants to consider the other side’s story reduced their bias – instructions to consider the missing information were added as a manipulation in a later study – but it certainly did not eliminate it. Their study shows that people are not only willing to jump to conclusions after hearing only one side’s story; even when they have additional information at their disposal that would suggest a different conclusion, they are still surprisingly likely to do so. The scientists conclude on a somewhat pessimistic note: “People do not compensate sufficiently for missing information even when it is painfully obvious that the information available to them is incomplete.”

In Brenner’s study, participants were dealing with a limited universe of information – the facts of the case and of the two sides’ arguments. But in reality – especially in the Internet era – people have access to a limitless amount of information that they could consider. As a result, we rely on rules of thumb, or heuristics, to take in information and make decisions. These mental shortcuts are necessary because they lessen the cognitive load and help us organize the world – we would be overwhelmed if we were truly rational.

This is one of the reasons we humans love narratives; they summarize the important information in a form that’s familiar and easy to digest. It’s much easier to understand events in the world as instances of good versus evil, or as any one of the seven basic story types. As Daniel Kahneman explains, “[we] build the best possible story from the information available… and if it is a good story, [we] believe it.” The implication here is that it’s how good the story is, not necessarily its accuracy, that matters.

But narratives are also irrational because they sacrifice the whole story for the one side of it that conforms to one’s worldview. Relying on them often leads to inaccuracies and stereotypes. This is what the participants in Brenner’s study illustrate: people who take in narratives are often blinded to the whole story – rarely do we ask, “What more would I need to know before I can have a more informed and complete opinion?”

The last several years have seen many popular psychology books that touch on this line of research. There’s Ori and Rom Brafman’s Sway, Dan Ariely’s Predictably Irrational and, naturally, Daniel Kahneman’s Thinking, Fast and Slow. If you could sum up the popular literature on cognitive biases and our so-called irrationalities, it would go something like this: we require only a small amount of information, oftentimes a single factoid, to confidently form conclusions, generate new narratives and take on new worldviews that seem objective but are almost entirely subjective and inaccurate.

The shortcomings of our rationality have been thoroughly exposed to the lay audience. But there’s a peculiar inconsistency about this trend. People seem to absorb these books uncritically, ironically falling prey to some of the very biases they should be on the lookout for: incomplete information and seductive stories. That is, when people learn about how we irrationally jump to conclusions they form new opinions about how the brain works from the little information they recently acquired. They jump to conclusions about how the brain jumps to conclusions and fit their newfound knowledge into a larger story that romantically and naively describes personal enlightenment.

Tyler Cowen made a similar point in a TED lecture a few months ago. He explained it this way:

There’s the Nudge book, the Sway book, the Blink book… [they are] all about the ways in which we screw up. And there are so many ways, but what I find interesting is that none of these books identify what, to me, is the single, central, most important way we screw up, and that is, we tell ourselves too many stories, or we are too easily seduced by stories. And why don’t these books tell us that? It’s because the books themselves are all about stories. The more of these books you read, you’re learning about some of your biases, but you’re making some of your other biases essentially worse. So the books themselves are part of your cognitive bias.

The crux of the problem, as Cowen points out, is that it’s nearly impossible to understand irrationalities without taking advantage of them. And, paradoxically, we rely on stories to understand why they can be harmful.

To be sure, there’s an important difference between the bias that comes from hearing one side of an argument and (most) narratives. A corrective like “consider the other side” is unlikely to work for narratives because it’s not always clear what the opposite would even be. So it’s useful to avoid jumping to conclusions not only by questioning narratives (after all, just about everything is plausibly a narrative, so avoiding them can be pretty overwhelming), but by exposing yourself to multiple narratives and trying to integrate them as well as you can.

At the beginning of the recently released book The Righteous Mind, social psychologist Jonathan Haidt explains how some books (his included) make the case that one particular thing (in Haidt’s case, morality) is the key to understanding everything. Haidt’s point is that you shouldn’t read his book and jump to overarching conclusions about human nature. Instead, he encourages readers to integrate his point of view (that morality is the most important thing to consider) with other perspectives. I think this is a good strategy for overcoming a narrow-minded view of human cognition.

It’s natural for us to reduce the complexity of our rationality into convenient bite-sized ideas. As the trader turned epistemologist Nassim Taleb says: “We humans, facing limits of knowledge, and things we do not observe, the unseen and the unknown, resolve the tension by squeezing life and the world into crisp commoditized ideas.” But readers of popular psychology books on rationality must recognize that there’s a lot they don’t know, and they must beware of how seductive stories are. The popular literature on cognitive biases is enlightening, but let’s not be irrational about irrationality; exposure to X is not knowledge and control of X. Reading about cognitive biases, after all, does not free anybody from their nasty epistemological pitfalls.

Moving forward, my suggestion is to remember the lesson from Brenner, Koehler and Tversky: they reduced conclusion-jumping by getting people to consider the other information at their disposal. So let’s remember that the next book on rationality isn’t a tell-all – it’s merely another piece of the puzzle. The same approach could also help correct the problem of being too swayed by narratives – there are always multiple sides to a story.

Ultimately, we need to remember what philosophers get right. Listen and read carefully; logically analyze arguments; try to avoid jumping to conclusions; don’t rely on stories too much. The Greek playwright Euripides was right: Question everything, learn something, answer nothing.

====

The author personally thanks Dave Nussbaum for his helpful editorial comments and criticisms. Dave is a social psychologist who teaches at the University of Chicago. Follow him on Twitter and check out his homepage.

Image: by Wyglif on Wikimedia Commons.

About the Author: Sam McNerney graduated from the greatest school on Earth, Hamilton College, where he earned a bachelor's degree in philosophy. After reading too much Descartes and Nietzsche, he realized that his true passion is reading and writing about cognitive science. Now he is working as a science journalist, writing about philosophy, psychology, and neuroscience. He has a column at CreativityPost.com and a blog at BigThink.com called "Moments of Genius". He spends his free time listening to Lady Gaga, dreaming about writing bestsellers, and tweeting @WhyWeReason.

April 29, 2012

The Six Best Ways to Decrease Your Anxiety


We all know the uncomfortable feeling of anxiety. Our hearts race, our fingers sweat, and our breathing gets shallow and labored. We experience racing thoughts about a perceived threat that we think is too much to handle. That's because our "fight or flight" response has kicked in, resulting in sympathetic arousal and a narrowing of attention and focus on avoiding the threat. We seem to be locked in that state, unable to focus on our daily chores or longer-term goals. As a cognitive-behavioral therapist with more than 15 years of experience, I have found a variety of techniques that I can teach my patients with anxiety disorders such as phobias, panic attacks, or chronic worry. Some are based on changing thoughts, others on changing behavior, and still others involve physiological responses. The more aspects of anxiety I can decrease, the lower the chance of relapse after therapy. Below are six strategies that you can use to help manage your anxiety:

(1) Reevaluating the probability of the threatening event actually happening

Anxiety makes us feel that threat is imminent, yet most of the time what we worry about never happens. By recording our worries and noting how many of them actually came true, we can see how much we overestimate the likelihood of negative events.
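To make that record-keeping concrete, here is a minimal sketch of such a worry log in Python; the entries and names are invented for illustration and are not part of the original article. Each worry is logged along with whether it actually came true, and the proportion that materialized is tallied at the end.

    # A minimal worry-log sketch: record each worry and whether it came true,
    # then compute what fraction of the feared outcomes actually happened.
    # The entries below are invented examples, not data from the article.

    worries = [
        {"worry": "I'll blank out during the presentation", "came_true": False},
        {"worry": "My email will be taken the wrong way", "came_true": True},
        {"worry": "I'll miss the project deadline", "came_true": False},
        {"worry": "The noise in my car means a huge repair bill", "came_true": False},
    ]

    came_true = sum(1 for entry in worries if entry["came_true"])
    rate = came_true / len(worries)

    print(f"{came_true} of {len(worries)} recorded worries came true ({rate:.0%}).")

Reviewing a tally like this after a few weeks of entries makes the gap between predicted and actual negative events visible, which is exactly the overestimation this technique is meant to expose.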

(2) Decatastrophizing

Even if a bad event happened, we may still be able to handle it by using our coping skills and problem-solving abilities or by enlisting others to help. Although not pleasant, we could still survive encountering a spider, having a panic attack, or losing money. It's important to realize that very few things are the end of the world.

(3) Using deep breathing and relaxation to calm down

By deliberately relaxing our muscles we begin to calm down so we can think clearly. If you practice this first without a threat present, it can start to become automatic and will be easier to use in the moment when you face a threat. Deep breathing engages the parasympathetic nervous system, which puts the brakes on sympathetic arousal.

(4) Becoming mindful of our own physical and mental reactions

The skill of mindfulness involves calmly observing our own reactions, including fear, without panic or feeling compelled to act. It is something that can be taught in therapy and improves with practice.

(5) Accepting the Fear and Committing to Living a Life Based on Core Values

Acceptance and Commitment Therapy (ACT) is an approach that encourages people to accept the inevitability of negative thoughts and feelings and not try to repress or control them. By directing attention away from the fear and back onto life tasks and valued goals, we can live a full life despite the fear.

(6) Exposure

Exposure is the most powerful technique for anxiety and it involves facing what we fear and staying in the situation long enough for the fear to habituate or go down, as it naturally does. Fear makes us avoid or run away, so our minds and bodies never learn that much of what we fear is not truly dangerous.

April 25, 2012

The New Brain Science - Optimizing Neuroplasticity

Remarkable discoveries about neuroplasticity over the last few decades are creating amazing opportunities.

You may have heard of self-directed neuroplasticity, the brain revolution, or even the brain rewiring itself and wondered what is going on and what it has to do with the function of your brain.

Well, here’s the long and short of it.

Neuroplasticity describes the ability of the brain to change – literally to rewire itself in response to experience.

Where before the brain was thought of as a machine, preset at age five and deteriorating from there, we now know that we have the capacity to re-wire brain circuits and grow new neural pathways.

Ever since neuroscientists discovered neuroplasticity and neurogenesis (the brain’s ability to grow new neurons), practitioners have sought to bring these new findings to their patients.

These findings open clinical possibilities when treating everything from stroke to OCD to anxiety and depression.

And yet, the possibilities that result from neuroplasticity aren’t limited to work done by health and mental health professionals.

Teachers, coaches, managers, even parents can benefit from learning ways to rewire the brain.

So what brain lessons have we learned so far? I’ll give you a few examples:

Too much stress can be harmful, not just to the cardiovascular system, but also to your brain. Recent studies show that when corticosteroid (stress hormone) levels increase, neurogenesis (the growth of new neurons) decreases. In fact, chronic stress can shrink the brain, making it hard to learn new information or even retain the information you already have.

You need to find the “sweet spot” for neuroplasticity. New learning is most likely to take place when the brain has an optimal amount of arousal. Too much arousal and the brain shuts down; not enough, and it gets distracted and lazy. For you teachers, parents, and managers, this is especially useful information about the function of the brain.

Aerobic exercise is one of the best things you can do for your brain – at any age! Aerobic exercise increases BDNF (brain-derived neurotrophic factor), which, when released into the system, enhances brain growth, neural connections, and overall brain function. John Ratey, MD, calls it “Miracle-Gro” for the brain.

There is a brain-belly connection. That’s right: to improve brain health, you need to take a metabolic approach, including controlling inflammation and insulin levels. And don’t forget to eat brain-happy foods like nuts, omega-3s, and avocados.

April 15, 2012

How to Form & Change Habits


I’m not alone in my opinion that this responsibility lies with the individual in question. Recently, I listened to the audio edition of Charles Duhigg’s book, The Power of Habit. Although I didn’t agree with how widely he cast the conceptual net of “habit,” I did agree with most of what he said as he summarized recent research about habits, habit formation and how we change habits.

Interestingly, he saved a key statement for the very end of his book (the last nine minutes of the audiobook). I’ve transcribed the key ideas of these statements below. I added the emphasis with bold font.

"Once you know a habit exists, you have the responsibility to change it . . . others have done so . . . That, in some ways, is the point of this book. Perhaps a sleep-walking murderer can plausibly argue that he wasn’t aware of his habit, and so he doesn’t bear responsibility for his crime, but almost all of the other patterns that exist in most people’s lives — how we eat and sleep and talk to our kids, how we unthinkingly spend our time, attention and money — those are habits that we know exist. And once you understand that habits can change, you have the freedom and the responsibility to remake them. Once you understand that habits can be rebuilt, the power of habit becomes easier to grasp and the only option left is to get to work."

Duhigg then goes on to quote William James, and while it is tempting to add that quotation here, it isn’t necessary. The key points were well said above.


The procrastination habit can change, and it is through learning about why you procrastinate and how that relates to your particular habit that you can find the keystone habit that will be central to change. Perhaps your procrastination hinges on internalized, unrealistic expectations of others and the irrational dialogue this has set up in your own mind. Perhaps it’s an unwillingness to tolerate frustration or delay gratification; you always want to feel good now. Perhaps it’s chronic disorganization. Whatever it is, it is something, and it can change. Find that keystone habit, and you can leverage change toward more life-giving habits instead of self-defeating ones like procrastination.

I write about procrastination research here in my blog and interview colleagues for my iProcrastinate podcasts as a resource for this self-exploration, but the final step is always our own, as we take seriously our own freedom and responsibility for change. What a wonderful promise our agency holds for us as we autonomously shape our own lives and enhance our well-being.

The power of habit can be a life-giving, even life-saving, force in our lives.

As Duhigg concludes,

"Once we choose who we want to be, people grow to the way in which they have been exercised . . . If you believe you can change, if you make it a habit, the change becomes real. This is the real power of habit. The insight that your habits are what you choose them to be. Once that choice occurs, and becomes automatic, habitual, it's not only real, it starts to seem inevitable."

So, let's just get started. That's a keystone habit.




April 10, 2012

Is Neuroscience the New Freakonomics?


Could neuroscience surpass economics when it comes to figuring out why and when we choose to buy?

Many of the bestselling business books of the past decade, such as “Freakonomics” and “The Undercover Economist”, started with an implicit, fundamental premise: "If it can't be quantified or calculated, it can't be true."

These books often reduced baffling and complex scenarios -- everything from global warming to why there are so many Starbucks stores in your neighborhood -- to simple explanations supported by basic economic thinking. Sometimes these explanations contained charts, graphs and little diagrams that made the world appear neat, tidy and orderly. A decade ago, in fact, Google made news when it hired UC Berkeley economics professor Hal Varian as its first in-house economist. Varian was charged with modeling consumer behavior and consulting on corporate strategy. The announcement reinforced the belief that, in short, economics was the key to market success.

Today, Google should be looking for a prize-winning neuroscientist.

The new generation of business thinking reflects a more nuanced understanding that many actions can be neither quantified nor calculated. Often, these insights are culled from the cutting edge of neuroscience. This new "what you didn't know about your brain can help you" genre most likely started with science writer Jonah Lehrer's wonderful “Proust Was a Neuroscientist.” In the book, Lehrer explains how many of the underpinnings of modern neuroscience were actually discovered by the likes of Proust, Stravinsky and Escoffier.

A slew of other titles have followed, each of them offering unique insights into the workings of the human brain. Along the way, we've been told how the human brain decides what to buy, why traditional brainstorming approaches don't work as well as they should, how changing the default settings can change the final outcome and why companies need to understand and cultivate the habits of their customers.

We are, as a society, experiencing a profound reappraisal of traditional economics and its shortcomings. The world is suddenly a lot more irrational than we ever thought, full of black swans. In Economics 101, we're taught that economic models are able to predict the behavior of coldly rational decision-makers. Charts and graphs follow a simple mathematical beauty. When we lower interest rates, we expect a certain reaction. When we devise incentives for customers, we expect them to react in a certain way. When we provide customers with a menu of choices, we expect them to answer in a certain way.

The only problem, of course, is that humans are not always rational.

Not surprisingly, some of the most popular business titles of the past few years have combined research findings from the cutting edge of economics and psychology. Perhaps the best example is Daniel Kahneman's “Thinking, Fast and Slow,” which is now edging up the bestseller lists. Kahneman won the Nobel Prize in economics and has helped to popularize the latest in behavioral economic thinking, including loss aversion. Interestingly enough, Kahneman refers to himself as a psychologist, rather than an economist.

This new thinking about the way the human brain works is starting to impact everything -- how supermarkets stock their shelves, when coupon offers are sent out to consumers, and how to devise the perfect title that will get you to click on a news article (wait, did you think that your reading this was an accident?). A retail store such as Target now knows that you're pregnant before your parents do, thanks to the wonders of understanding customer purchase habits. On the Web, understanding human behavior is everything, given that the best and brightest of our generation are now engaged in an elaborate game of getting people to click on a specific button, text link or banner ad.

A decade ago, if you asked top business leaders whether they'd ever consider reading a book on neuroscience, they probably would have looked askance at you while tapping away at their BlackBerrys. Today, they realize that profits lie in understanding how the human brain works, how people make decisions, and what influences the final purchase. What they may not realize, however, is that this understanding lies more in a neuroscientist’s wheelhouse than an economist’s.

The Washington Post
Innovations
By Dominic Basulto

Your Brain on Fiction

Amid the squawks and pings of our digital devices, the old-fashioned virtues of reading novels can seem faded, even futile. But new support for the value of fiction is arriving from an unexpected quarter: neuroscience.

Brain scans are revealing what happens in our heads when we read a detailed description, an evocative metaphor or an emotional exchange between characters. Stories, this research is showing, stimulate the brain and even change how we act in life.

Researchers have long known that the “classical” language regions, like Broca’s area and Wernicke’s area, are involved in how the brain interprets written words. What scientists have come to realize in the last few years is that narratives activate many other parts of our brains as well, suggesting why the experience of reading can feel so alive. Words like “lavender,” “cinnamon” and “soap,” for example, elicit a response not only from the language-processing areas of our brains, but also those devoted to dealing with smells.

In a 2006 study published in the journal NeuroImage, researchers in Spain asked participants to read words with strong odor associations, along with neutral words, while their brains were being scanned by a functional magnetic resonance imaging (fMRI) machine. When subjects looked at the Spanish words for “perfume” and “coffee,” their primary olfactory cortex lit up; when they saw the words that mean “chair” and “key,” this region remained dark. The way the brain handles metaphors has also received extensive study; some scientists have contended that figures of speech like “a rough day” are so familiar that they are treated simply as words and no more. Last month, however, a team of researchers from Emory University reported in Brain & Language that when subjects in their laboratory read a metaphor involving texture, the sensory cortex, responsible for perceiving texture through touch, became active. Metaphors like “The singer had a velvet voice” and “He had leathery hands” roused the sensory cortex, while phrases matched for meaning, like “The singer had a pleasing voice” and “He had strong hands,” did not.

Researchers have discovered that words describing motion also stimulate regions of the brain distinct from language-processing areas. In a study led by the cognitive scientist Véronique Boulenger, of the Laboratory of Language Dynamics in France, the brains of participants were scanned as they read sentences like “John grasped the object” and “Pablo kicked the ball.” The scans revealed activity in the motor cortex, which coordinates the body’s movements. What’s more, this activity was concentrated in one part of the motor cortex when the movement described was arm-related and in another part when the movement concerned the leg.

The brain, it seems, does not make much of a distinction between reading about an experience and encountering it in real life; in each case, the same neurological regions are stimulated. Keith Oatley, an emeritus professor of cognitive psychology at the University of Toronto (and a published novelist), has proposed that reading produces a vivid simulation of reality, one that “runs on minds of readers just as computer simulations run on computers.” Fiction — with its redolent details, imaginative metaphors and attentive descriptions of people and their actions — offers an especially rich replica. Indeed, in one respect novels go beyond simulating reality to give readers an experience unavailable off the page: the opportunity to enter fully into other people’s thoughts and feelings.

The novel, of course, is an unequaled medium for the exploration of human social and emotional life. And there is evidence that just as the brain responds to depictions of smells and textures and movements as if they were the real thing, so it treats the interactions among fictional characters as something like real-life social encounters.

Raymond Mar, a psychologist at York University in Canada, performed an analysis of 86 fMRI studies, published last year in the Annual Review of Psychology, and concluded that there was substantial overlap in the brain networks used to understand stories and the networks used to navigate interactions with other individuals — in particular, interactions in which we’re trying to figure out the thoughts and feelings of others. Scientists call this capacity of the brain to construct a map of other people’s intentions “theory of mind.” Narratives offer a unique opportunity to engage this capacity, as we identify with characters’ longings and frustrations, guess at their hidden motives and track their encounters with friends and enemies, neighbors and lovers.

It is an exercise that hones our real-life social skills, another body of research suggests. Dr. Oatley and Dr. Mar, in collaboration with several other scientists, reported in two studies, published in 2006 and 2009, that individuals who frequently read fiction seem to be better able to understand other people, empathize with them and see the world from their perspective. This relationship persisted even after the researchers accounted for the possibility that more empathetic individuals might prefer reading novels. A 2010 study by Dr. Mar found a similar result in preschool-age children: the more stories that had been read to them, the keener their theory of mind — an effect that was also produced by watching movies but, curiously, not by watching television. (Dr. Mar has conjectured that because children often watch TV alone, but go to the movies with their parents, they may experience more “parent-children conversations about mental states” when it comes to films.)

Fiction, Dr. Oatley notes, “is a particularly useful simulation because negotiating the social world effectively is extremely tricky, requiring us to weigh up myriad interacting instances of cause and effect. Just as computer simulations can help us get to grips with complex problems such as flying a plane or forecasting the weather, so novels, stories and dramas can help us understand the complexities of social life.”

These findings will affirm the experience of readers who have felt illuminated and instructed by a novel, who have found themselves comparing a plucky young woman to Elizabeth Bennet or a tiresome pedant to Edward Casaubon. Reading great literature, it has long been averred, enlarges and improves us as human beings. Brain science shows this claim is truer than we imagined.


Annie Murphy Paul is the author, most recently, of “Origins: How the Nine Months Before Birth Shape the Rest of Our Lives.”

How does the human brain decide which memories to store?

In a year alone, we experience hundreds of thousands of small events that have the potential to become memories. Yet our brain will only store a certain number of these memories (or at least only allow us access to some of them).

How does the brain decide which memories are stored?

The brain uses a number of automatic mechanisms to determine what information to retain. Everything else naturally fades away.

The brain's overriding principle, shaped by millions of years of evolution, is to retain whatever is likely to be useful later for long-term survival. Since the future utility of information is impossible to predict, the brain uses a number of heuristics that have been honed over the millennia.

Here are some of the most well studied:

Repetition

This is probably #1. Things that happen repeatedly are either highly significant or irrelevant. However, even if they are irrelevant -- like the background noise that you tune out -- they must be identified so that they can be removed from perception. When studying for a test, students often use repetition to activate the brain's importance circuits.


Primacy and Recency

Things that happened first are often more important because they predict what comes later. And things that happened most recently are often the most relevant because they are closest to the present. Things in the middle tend to get forgotten. This is why so many presentations start and end with an overview of the key points.


Surprise

Anything that is unusual stands out. This can include an uncanny coincidence or an event that led to something unpredicted. An entertaining science teacher will ask students to guess what will happen and then show that the opposite happens; setting up the experience of surprise increases retention. If you are thinking of calling someone and the phone rings and it's them, you will remember that for a long time because the coincidence is so unusual, while forgetting all the times you thought of calling them and the phone didn't ring.


Emotional Impact

Emotions are one of the ways the brain prioritizes perception and action. Emotions are a way of assessing and categorizing situations according to their role in our instinctual survival program. A moment correlated with a strong emotional state will be retained for a long time, which is why, for example, leading up to a car accident, people have the memory of time slowing down and noticing every detail. Time didn't actually slow down -- it's just that a lot of detail got recorded, and so the event is remembered this way.


Leads to positive or negative outcome

The systems in the brain that learn behaviors and habits are especially tuned to the eventual outcome of an action or perception. This is why addictive activities such as gambling can be so tenacious. With a slot machine, most of the time nothing happens, but sometimes the bells ring and the sign flashes "winner!". Each dose of "reward" ensures that more quarters go into the machine. Addictive drugs like nicotine and cocaine activate reward circuits directly, causing everything that led up to taking the drug to be given automatic priority by the brain. Dopamine is a neurotransmitter central to signaling reward and activating procedural memory formation. When something leads to a strongly unpleasant outcome, emotional circuits label the preceding events as fearful.