Marijuana Makes for Slackers? Now There’s Evidence

By Susan Pinker


In cities like Seattle and Vancouver, the marijuana icon has become almost as common on storefronts as the Starbucks mermaid. But there’s one big difference between the products on offer: A venti latte tastes the same everywhere and provides an identical caffeine rush, while marijuana stores offer the drug’s active ingredients in varying combinations, potencies and formats. There is no consistency in testing, standards or labeling.

This matters because marijuana’s two psychoactive ingredients, tetrahydrocannabinol (THC) and cannabidiol (CBD), have contrasting effects on the brain. “THC makes you feel high,” said Catharine Winstanley, a psychology professor at the University of British Columbia who does research on marijuana, while CBD “is responsible for its analgesic, antiseizure and purported anticancer effects.”

In street marijuana, the THC-to-CBD ratio now tends to be 10 to 1, and it is increasing, a trend occurring even at some marijuana clinics, Dr. Winstanley said. And few people know what effect that has on their brains. A new study by Dr. Winstanley’s group in the Journal of Psychiatry and Neuroscience examines how these two chemicals shape our willingness to face a challenge. Does marijuana make us lazy?

To answer this question, the Winstanley team first tested rats to determine which were hard workers and which were slackers. After being injected with THC, CBD or both, the rats had to choose between a simple task with a measly reward and a demanding one that reaped a bigger payoff.

In the easy version, the animals had a minute—an eternity for a rat—to see that a light was on in a chamber and poke their noses inside; doing so earned them a single sugar pellet. In the hard version they had only a fifth of a second to notice the light—something a rat, even a stoned one, should have no problem perceiving—and respond. Their vigilance in the hard choice would earn them two lumps instead of one. Under normal circumstances, the vast majority of rats prefer to work harder for a bigger payoff.

The results? “Whether they were workers or slackers to begin with,” Dr. Winstanley reported, “even small amounts of THC made them all slackers.”

THC didn’t impair the rats’ ability to perform, only their willingness to try. That downshift in motivation didn’t happen in rats injected with CBD only.

Later analysis of the rats’ brains showed that those with the greatest reaction to THC also had a greater density of a particular receptor in their anterior cingulate cortex, or ACC. “That area of the brain is very important for people to gear up to face a challenge and stay the course,” Dr. Winstanley said.

A small study shows something similar in humans. Published this month in the journal Psychopharmacology by a University College London team, the study of 17 adults showed that inhaling cannabis with THC alone (versus pot with both CBD and THC, or a placebo) induced people to choose an easy task more often, eschewing a harder one that offered four times the payoff. Neither the researchers nor the subjects knew who had gotten the drug and who the placebo. The effects were short term, meaning that the subjects’ apathy didn’t persist after the high wore off.

The need for policy-makers to deal with the results of tests like these is complicated by the lack of regulatory consistency. That’s because the U.S. Drug Enforcement Administration considers marijuana as illegal as heroin, while 25 states and the District of Columbia have legalized pot for various purposes. So no national standards exist.

“Thinking that it’s harmless, that you can smoke cannabis and you’ll be fine, is a false assumption,” said Michael Bloomfield, a University College London professor in psychiatry and one of the UCL study’s authors. “THC alters how willing you are to try things that are more difficult.” So next time you go to a clinic—or dealer—you might want to ask about the product’s chemical breakdown.

Medicating Children With ADHD Keeps Them Safer

New research suggests that medication can reduce risky behavior in teenagers with attention deficit hyperactivity disorder, or ADHD

 

By Susan Pinker

Updated Aug. 17, 2016 10:23 a.m. ET


If a pill could prevent teenagers from taking dangerous risks, would you consider it for your children?

I’d be tempted. My skateboard- and bicycle-riding son was hit by a car—twice—when he was a teenager. I would have welcomed anything that could have averted those dreadful phone calls from the emergency room.

While some bumps and scares are inevitable for active guys like him, serious misadventures with long-lasting repercussions are often par for the course for a subset of them—those with attention deficit hyperactivity disorder, or ADHD. But a new article suggests that early medication can significantly cut the odds of bad things happening later.

Affecting nearly 9% of all Americans between 4 and 18 years of age, ADHD is one of the most common childhood disorders and also one of the most misunderstood. Its symptoms color almost every aspect of a child’s life—from being able to focus in school to making and keeping friends, reining in fleeting impulses and assessing risk and danger.

Indeed, accidents are the most common cause of death in individuals with ADHD, with one 2015 study of over 710,000 Danish children finding that 10- to 12-year-olds with ADHD were far more likely to be injured than other children their age. Drug treatment made a big difference, however, nearly halving the number of emergency room visits by children with ADHD.

Medicating children to address problems with attention and self-control remains controversial. ADHD isn’t visible, like chickenpox, nor immediately life-threatening, like asthma. Its distortion of a child’s ability to meet adults’ expectations creates an atmosphere of frustration and blame. So it’s not often taken for what it really is: a neurodevelopmental disorder with genetic roots.

An enduring myth about ADHD is that children grow out of it in adolescence. We now know that a 5-year-old with a bona fide attentional disorder may well become a dreamy, restless and impulsive teenager and adult. Adolescents with ADHD think even less about consequences than the average teenager and are especially thrilled by novelty. They’re more likely than their friends to drink too much, drive like maniacs, abuse drugs and have unprotected sex.

It’s a sobering list. But an article published last month by Princeton researchers Anna Chorniy and Leah Kitashima in the journal Labour Economics shows that treating ADHD with medication during childhood can head off later problems. “We have 11 years of data for every child enrolled in South Carolina Medicaid who was diagnosed with ADHD,” Dr. Chorniy told me. The researchers followed each doctor visit and every prescription for a sample of more than 58,000 children, tracking their health into adulthood.

This long view let the economists compare the behaviors of teens treated with the most common ADHD medications, such as Ritalin, Concerta and Adderall, to the types of risks taken by other children with ADHD who were not treated. The researchers found fewer and less severe injuries and health problems among the treated children: a 3.6% reduction in sexually transmitted infections; 5.8% fewer children who sought screening for sexually transmitted infections (suggesting they had had an unprotected sexual tryst); and 2% fewer teen pregnancies.

That adds up to a lot fewer teenagers in trouble.

The economists did their study based on existing data, but randomized, controlled studies—experiments carefully designed to establish cause-and-effect relationships—have reached the same conclusion: that medication to control ADHD can reduce the disorder’s high price in psychic pain, lost educational opportunity and riven relationships. A child whose disorder is diagnosed and treated early by a trained clinician stands a better chance of growing into a healthy and thoughtful adult.

 

To Beat the Blues, Visits Must Be Real, Not Virtual

Loneliness keeps increasing, but new research suggests that electronic ways of keeping in touch do little compared with in-person contact


Imagine being stranded on a desert island with a roof over your head and sufficient provisions—but no human contact other than what you can get from your smartphone. Would you get depressed? Or would your networked device provide enough connection to stave off dark thoughts?

This metaphor applies to a great many Americans. Their basic material needs are covered, and 85% have internet access. Yet at least 26% say that they feel deeply lonely. Psychologists know this from population surveys, not because people talk about it. The distress of feeling rejected or neglected by friends or family is a key predictor of depression, chronic illness and premature death. It’s also a public-health time bomb. The rate of loneliness has increased from about 14% in the 1970s to over 40% among middle-aged and older adults today, and the aging of America’s population is likely to make things worse in the years ahead.

Few public health initiatives aim at combating loneliness, despite the fact that it’s riskier to health and survival than cigarette smoking or obesity. It’s also a taboo topic. Doctors don’t often ask about it, and we might not fess up, even if they did. There’s a fine line between loneliness and exclusion, and who wants to admit to that?

Many of us expect our smartphones and tablets to be the perfect antidote to social malaise. But do virtual experiences provide that visceral sense of belonging so important to being human?

A recent study pokes a hole in that assumption. Alan Teo, an assistant professor at Oregon Health & Science University, followed 11,000 adults over age 50 who participated in a national study of aging at some point between 2004 and 2010. He and his colleagues wanted to know what type of social contact or lack of it might predict clinical depression two years later.

Major depression, the disease of dark thoughts, hits 16% of all Americans, who are twice as likely to be diagnosed with it during their lifetimes as they are to be diagnosed with cancer. Yet there’s not much talk of prevention.

The research team, which published its findings last October in the Journal of the American Geriatrics Society, controlled for demographic factors like age and sex—as well as for any medical, family or psychological history that might boost one’s depression risk. They found that only face-to-face interaction forestalled depression in older adults. Phone calls made a difference to people with a history of mood disorders but not to anyone else. Email and texts had no impact at all.

How often people got together with friends and family—or didn’t—turned out to be key. What’s more, the researchers discovered that the more in-person contact there was in the present, the less likely the specter of depression in the future.

People who had face-to-face contact with children, friends and family as infrequently as every few months had the highest rates of the disease. Those who connected with people in person at least three times a week had the lowest.

“That’s the beauty of it,” Dr. Teo told me. “The more often they got together in person, the better off they were.”

Winston Churchill called his own bouts of depression “my black dog,” and we know that it can be a tenacious foe. This study tells us that a cheap and easy way to foil it is in-person interaction, and that how you connect and with whom is important: People between the ages of 50 and 70 were best protected by face-to-face contact with their friends. Over the age of 70, it was in-person contact with family that mattered most.

Of course, as Dr. Teo said, phone and email are still great for making social plans. But to keep dark and dangerous thoughts at bay, you have to leave your desert island now and then and be there, in the flesh.

 

How Babies Quickly Learn to Judge Adults

Even if toddlers can’t tell us, they are making hard and fast judgments about adults


Adults often make snap judgments about babies. First impressions lead us to assign them personalities, such as fearful, active or easy to please, and with good reason. Fifty years of evidence shows that babies begin life with traits that set the stage for how they interact with the world—and how the world reacts to them.

That might be one reason why siblings can have such wildly different takes on their own families. Once a mother has assessed her child as shy or fussy, she tends to tailor her behavior to that baby’s personality.

But what if babies make hard and fast judgments about us, too? Just because they can’t say much doesn’t mean they don’t have strong opinions. New research shows that babies are astute observers of the emotional tenor of adult interactions and censor their own behavior accordingly. Published in the March issue of Developmental Psychology, the study shows that infants who get a glimpse of a stranger involved in an angry exchange with another stranger will then act more tentatively during play.

The study’s lead authors, Betty Repacholi and Andrew Meltzoff, both of the University of Washington, explained that infants who witness an emotional outburst then expect that person to lose his cool again in a new situation. “Babies are registering how we respond emotionally,” Dr. Meltzoff said, “taking notes on how we typically react.”

The experiment included 270 15-month-old toddlers who watched two adults unfamiliar to them demonstrating how to play with an intriguing new toy. One adult, called “the emoter,” reacted either neutrally or angrily to the other adult’s attempts to play with the toy, showing her emotional cards by commenting “that’s entertaining” in a dispassionate tone or “that’s aggravating” in an angry rebuke.

The babies who witnessed the adult’s harsh reaction were then more likely to hang back before touching the intriguing toy. Even if the anger-prone adult had turned her back, and even when a different plaything was offered, the child’s hesitation was palpable. Some toddlers avoided the toy altogether.

Taking an adult’s emotional temperature happened quickly. Each baby was tested three times, but it usually took just one instance of verbal aggression for the baby to pigeonhole an adult as a hothead. The babies had “formed an impression about [the adult’s] psychological makeup, that this is an angry person,” said Dr. Repacholi.

What’s more, a secondhand brush with a riled-up adult will prompt toddlers to mollify that person. Other studies by Drs. Repacholi and Meltzoff and colleagues, using the same “eavesdropping on two strangers” design and published in February in the journal Infancy, showed that toddlers who witness an adult’s anger are more likely than other toddlers to hand over a prized toy. “Because they’ve learned that the adult is anger-prone, they try to appease her,” Dr. Repacholi said.

Well before they attribute thoughts and motivations to other people, young toddlers suss out any volatility in the adults around them, these studies show. But the findings also prompt some deeper questions. If brief shows of anger put toddlers on high alert, what might this mean for the inevitable conflicts that occur in family life?

As their studies involved babies observing the interactions between two people they had never met before, Drs. Repacholi and Meltzoff explained, their findings don’t really reproduce family life, during which parents and siblings show all kinds of feelings in various situations. “They have a history of interacting with their babies that the strangers in our study did not have,” Dr. Meltzoff wrote in an email.

Getting angry occasionally is not going to override the positive expectations that babies have built up about you over months of loving encounters, he told me. Still, “we are catching a glimpse of how babies pigeonhole us, and how they would describe our personalities, if they could only talk.”

The Perilous Aftermath of a Simple Concussion

Susan Pinker on how a concussion was both a personal struggle for her and a catalyst to study a phenomenon still only partly understood


Eighteen months ago, a pickup truck hit me while I was walking on the sidewalk. The last thing I remember from that sunny Tuesday morning was reaching for my car keys. Then the lights went out.

I regained consciousness while being slid out of the scanner in the trauma unit. I had two fractures, two torn tendons, some wicked road rash and a concussion. The accident shook up my relationships, my memory and my personal drive. Still, everyone told me I was lucky to be alive, and I agree. Life has never seemed as tenuous—or as precious.

Not everyone with a mild brain injury is as lucky. A 20-year study published in February in the Canadian Medical Association Journal shows that people who have had a mild concussion are twice as likely to commit suicide as military personnel and more than three times as likely as members of the general population. Most of us agree with the bromide that time heals all wounds, but this study showed the opposite: After the concussion the risk of suicide rose steadily over time.

Led by Donald Redelmeier, a professor of medicine at the University of Toronto, the study included 235,110 otherwise healthy people who had seen a doctor after their accident. Most of the patients had no prior psychiatric diagnosis, hospital admission or suicide attempt.

“In the aftermath of a crash there is tremendous agony. But the broken ribs and leg will heal,” Dr. Redelmeier told me. “I’m not as sanguine about a concussion. Even when the CT scan doesn’t show major trauma, a minor injury can damage thousands and thousands of neurons. There are all sorts of problems that can last a long time, and we don’t know how to treat them.”

That clinical gap was clear. As I was leaving the emergency room, a staffer handed me a tip sheet written for teenage hockey players. (I live in Canada, after all.) There was no information for adults, nor anything on women and girls—who are known to be at greater risk of long-term problems after a concussion. When I asked the surgeon about cognitive symptoms during a follow-up visit, he exclaimed, “One concussion! The risk comes with more than one knock.” He added, “You’ll be fine.”

But I was far from fine. I spent mornings doing media interviews by phone about my new book, trying to sound upbeat. I spent afternoons sleeping, sobbing or staring at the clock, willing the time to pass so I could take another dose of oxycodone.

Meanwhile, well-meaning friends and colleagues were suggesting that the accident was some sort of warning. “This is God’s way of telling you to slow down,” said one. “Were you texting? Wearing headphones?” asked another. The refrain was that I should be thankful I’d dodged a karmic bullet and just get on with things.

But life was very different. A year after the accident I invited a friend to a concert—then blithely went on my own, forgetting all about her. I napped like a toddler and, because of fatigue and shoulder pain, couldn’t work a full day. I was never suicidal, thank goodness. But I was racked by an unanswered question: Why did this happen?

That lack of closure leads to one of the most astounding of Dr. Redelmeier’s findings. Compared to weekday accidents, weekend concussions magnify one’s suicide risk. One explanation could be psychological, he said. “If you get hurt at work, you can blame the circumstances. But if you get hurt horseback riding, that might affect how much support and sympathy you get, whether there’s companionship in fighting the good fight, or whether people feel you’re the architect of your own misfortune.”

Understanding why something happened is as important to brain health as a cast is to bone strength, it seems. It turns out that I am lucky, after all. I may never know why the driver didn’t see me. But I do know one thing: I was working that day, and it was a Tuesday.

The Peril of Ignoring Vaccines—and a Solution

Once considered eradicated in the U.S., measles is back. A look at the dangers of shunning vaccines and what can be done.


In 2000 the U.S. considered measles eradicated, but the picture has changed alarmingly since then. In 2014, 667 unvaccinated people contracted measles. Last year an outbreak that began in California’s Disneyland infected more than 100 people.

Many Americans have been refusing to protect themselves and their children with the measles vaccine. According to a recent study in the American Journal of Public Health, as many as 5.5% of children are unvaccinated in some U.S. communities, and the parents most likely to refuse vaccines tend to be affluent, well-educated and white. Their resistance can largely be traced to a 1998 article in a British medical journal that falsely linked childhood vaccines to autism. That study was debunked, but the damage had been done.

Now doctors must figure out how to persuade these parents to change their minds. Late last year they got some help from a team of psychologists from the University of Illinois at Urbana-Champaign and the University of California, Los Angeles, who were interested in what might sway anti-vaxxers’ opinions.

Measles can be devastating. A highly contagious and virulent disease, it can lead to convulsions, hearing loss, brain damage and even death. Vaccination efforts have been so successful up to now, however, that almost half of the nation’s pediatricians have never seen a real case. The question is how to make people understand that the threat is real.

Would correcting misconceptions about the childhood vaccine-autism myth do the trick? Or would testimonials and graphic photos of sick children be more effective?

The study, led by Dr. Zachary Horne of the University of Illinois and published last August in the Proceedings of the National Academy of Sciences, asked 315 participants to complete questionnaires about their attitudes to vaccines and their plans to vaccinate their children. The subjects were chosen at random and not prescreened, although some dropped out or were later disqualified for not paying attention to the testing.

The researchers randomly divided subjects into three groups. They showed the “disease risk” group photos of young, infected children with florid rashes and a paragraph written by a mother of a child with measles, as well as three short warnings about the disease. The “autism correction” group read research summaries showing that childhood vaccines do not cause autism. And a control group read unrelated scientific vignettes.

Once again, all the participants completed the questionnaire about attitudes to vaccines. Which intervention was most likely to alter their views?

Surprisingly, the “autism correction” approach was no more influential in changing anti-vaxxers’ minds than the control condition. Telling people that their beliefs aren’t true just didn’t work. But showing people images of sick children with ugly rashes did, as did reading a parent’s account of how it feels to have a baby with measles who is spiking a fever of 106 degrees. “We spent three days in the hospital fearing we might lose our baby boy,” the mother wrote. “He couldn’t drink or eat, so he was on an IV, and for a while he seemed to be wasting away.”

Why would frightening people change their minds more than giving them the facts? The human brain evolved to give priority to appalling, negative events over positive ones, according to a seminal paper published in the Review of General Psychology in 2001. Lead author Roy Baumeister, a psychology professor at Florida State University, documented hundreds of ways that “bad is stronger than good,” as he and his colleagues titled the paper. It’s a position that has been confirmed by the PNAS study on vaccines and by a 2015 analysis in Psychological Bulletin of the impact of fear-based appeals on changing people’s behavior.

So, public officials, go ahead—scare parents silly.

 

Why Looks Count in Politics

In elections, appearance counts more than most of us care to admit. It all comes down to the brain’s orbitofrontal cortex


Those who watched the New Hampshire Republican debate surely made a mental note of aspects of the candidates’ appearances as they staked out their turf: Ted Cruz’s jutting chin, Ben Carson’s avuncular temples, Donald Trump’s blond, blond hair, Marco Rubio’s full cheeks and boyish manner.

It’s all window-dressing, you might say; only the candidates’ positions matter. But in politics, looks count more than we care to admit.

Indeed, we often clinch our political decisions a split-second after we see a candidate and don’t change them much over time. In a 2006 study published in Psychological Science and led by Alexander Todorov of Princeton University, subjects who selected favorites after a brief glance at snapshots of unfamiliar candidates were able to predict who would win nearly 70% of the 2004 Senate and House races. When the researchers gave people more time to decide, they simply confirmed their first impressions.

In another study, published in 2008 in the Proceedings of the National Academy of Sciences, Prof. Todorov and colleague Nikolaas Oosterhof digitally manipulated people’s facial features in photos, revealing just what makes us fall so hard for a candidate. They showed that a rounded, baby-faced appearance with prominent cheekbones, arched inner eyebrows and a sunny demeanor makes a person seem trustworthy.

Such snap judgments also can skew life-or-death decisions, according to a new study in Social Psychological & Personality Science. Assessments of the trustworthiness of convicted murderers based on their facial features aligned well with how they were sentenced. The sense that a prisoner was trustworthy was a good predictor of whether he got life in prison or the death sentence.

Not only do we make critical inferences about people based on their appearance, but another study, published last June in the Journal of Neuroscience, suggests that the ability to assimilate more than one piece of information about them hinges on having a healthy brain—a healthy orbitofrontal cortex, to be precise. Situated right behind the eyeballs on the floor of the skull, this area of the brain is central to social decision-making and impulse control—and to how we make political choices, according to the study.

Lesley Fellows and her team at the Montreal Neurological Institute and McGill University investigated what happens when the orbitofrontal cortex is badly damaged. How might that affect a person’s first impressions of a candidate? The researchers found seven people who had lost use of this part of their brain due to an aneurysm or tumor surgery but whose other cognitive abilities were intact. The study also included a group of 18 people with frontal lobe damage but no harm to the orbitofrontal cortex area. Another control group included matched, healthy subjects.

The researchers asked participants to look at pairs of head shots of political candidates. The subjects didn’t know the candidates and didn’t have any information about them. They first “voted” for one from each pair based on the photos. Then they rated each candidate on two traits: perceived attractiveness and competence (the latter a guess based on appearance).

The votes of the healthy participants and of the 18 people whose brain injuries spared the orbitofrontal area reflected both attractiveness and competence—that is, they might vote for someone they judged to be competent even if that candidate wasn’t rated the most attractive. But the votes of the seven people with orbitofrontal damage tracked one factor only: the candidates’ appearance. They were most likely to vote for whomever they deemed most attractive.

Even those of us fortunate enough not to have brain damage often can’t explain why we like who we like. “Eventually the brain gets overwhelmed with so many factors,” Dr. Fellows told me. “When it gets to be too much, people just simplify.”

Children’s Lies Are a Sign of Cognitive Progress

Research shows that kids’ ability to bend the truth is a developmental milestone, much like walking and talking


Child-rearing trends might seem to blow with the wind, but most adults would agree that preschool children who have learned to talk shouldn’t lie. But learning to lie, it turns out, is an important part of learning in general—and something to consider apart from fibbing’s ethical implications.

The ability to bend the truth is a developmental milestone, much like walking and talking. Research led by Kang Lee, a psychology professor at the University of Toronto, shows that lying begins early in precocious children. Among verbal 2-year-olds, 30% try to pull the wool over their parents’ eyes at some point. At age 3, 50% regularly try it. Some 80% of 4-year-olds fib, and nearly all healthy 5- to 7-year-olds do.

In other words, lying is nothing unusual in small children. What’s more, younger children who tell tales have a cognitive advantage over the truth-tellers, Dr. Lee said. “Lying requires two ingredients. Children need to understand what’s in someone else’s mind—to know what they know and what they don’t know. We call this ability theory of mind. The children who are better at theory of mind are also better at lying.”

The second requirement, according to Dr. Lee, is executive function—the power to plan ahead and curb unwanted actions. “The 30% of the under-3s who can lie have higher executive function abilities,” he said, “specifically the ability to inhibit the urge to tell the truth and to switch to lying.”

Such cognitive sophistication means that these early liars will be more successful in school and in their dealings with other kids on the playground, he added.

Though Dr. Lee had known for decades that children who excel at theory-of-mind tasks are better liars, he didn’t know which came first. Does lying make children better at guessing what other people are thinking? After all, trying half-truths on for size would elicit feedback from adults that would reveal something about their mental states. Or is it that if you teach people to imagine what’s going on in others’ minds, they then become better fabricators? He tested that notion in an experiment that he published in the journal Psychological Science last November.

Theory-of-mind training has become a popular tool for helping children on the autistic spectrum as well as those with behavioral problems. The training walks children through situations that help them to discover that other people could have knowledge or beliefs different from their own. In Dr. Lee’s lab the children are also read stories rich in information about people’s mental states. “So we asked, what are the side effects? Can we induce lying by training theory of mind?” Dr. Lee said.

He and a team of researchers from Canada, the U.S. and China divided a group of 58 preschoolers from a city in mainland China into two groups after testing them for such things as intelligence, lying ability and executive function. Half of the children received six sessions of theory-of-mind training and the other half received an equal number of sessions devoted to teaching number and spatial problem-solving skills.

After six sessions over eight weeks, the researchers found that the children in the theory-of-mind group had not only become better liars than before but were also significantly better at lying than the control-group children. The effects lasted a month. Dr. Lee intends to follow up to see if these results persist.

“The first occasion of your child telling a lie is not an occasion to be alarmed but an occasion for celebration. It’s a teachable moment,” he told me, “a time to discuss what is a lie, what is the truth and what are the implications for other people.”

When Does Gratitude Bring Better Health?

During the holiday season, gifts, cards, carols and donations constantly urge us to give thanks. But gratitude really can have beneficial psychological effects.


Gratitude is one of those tricky, hard-to-pin-down feelings that can be either inert or powerfully transformative. At this time of year, when we’re constantly importuned to give thanks with gifts, cards, carols and donations, it often becomes a reflex. So when does gratitude have psychological effects?

That question didn’t get much scientific scrutiny until recently. Only in the past decade has there been a push to determine if gratitude “decreases pain and depression, and boosts happiness,” as a recent study in Primary Health Care Research & Development put it. The researchers found that an act of explicitly expressing gratitude lifted people’s mood and sense of well-being.

Bolstering this finding, other targeted studies have shown that health-care workers who cataloged why they were grateful experienced a 28% reduction in stress, and that writing about gratitude halved the risk of depression in those with a history of the disease.

Some research results seem almost too good to be true. Simply asking suicidal patients to write a letter of gratitude reduced their hopelessness in 90% of the cases. Among fit teenage athletes, those with high levels of gratitude were more satisfied with life in general and with their teams in particular.

Counting one’s blessings, as opposed to life’s annoyances, seems to bring with it all kinds of benefits: resilience, better health, a rosier outlook—even a longer, more restful night’s sleep and a sense of connectedness to other people.

Changing how we feel is one thing, but changing behavior is another. “It’s not the hardest sell in the world that emotions could make you feel more optimistic about your life. But I grow skeptical about observable effects in the body,” Michael McCullough told me. A psychology professor at the University of Miami who investigates emotions like forgiveness, revenge and gratitude, Dr. McCullough wonders whether feeling grateful actually alters our health or whether it works by motivating us to change our behavior—to quit smoking or drinking, for example.

He found both mood and behavior changes in an experiment he did with Robert Emmons, a colleague at the University of California, Davis. In the study, reported in 2003 in the Journal of Personality and Social Psychology, prompting people to list five things they were grateful for several times a week not only brought an uptick in mood but also resulted in subjects devoting more time to exercise and to helping others.

“Gratitude motivates people into trying to give back,” Dr. McCullough said, “and the research is really good that volunteering is good for health. Emotional state to social contact to feeding back into health behavior—it all makes sense.”

Whether the feeling or the behavior comes first, we do know that gratitude is tied to conscientiousness. Grateful people eat 25% fewer fatty foods and have better blood pressure readings than ungrateful folks. And a new, still unpublished study shows that feeling thankful is linked to lower hemoglobin A1c, a sign of good blood-glucose management and thus better diabetes control.

In fact, gratitude is such a powerful catalyst for feeling hale and hearty, it’s a wonder that no one (except greeting-card companies and religious leaders) has found a way to package it.

Which brings us back to the holidays. This time of year most of us steel ourselves for bland turkey, rich desserts and loud relatives and forget what is remarkable in our lives. We’d probably all be better off—and enjoy the holidays more—if we followed the lead of gratitude researchers and took a few minutes to list exactly what we’re grateful for. Better still, think it over, then say it out loud so that everyone can hear it.

Changes in Sense of Humor May Presage Dementia

New research suggests that shifts in what a person finds funny can herald imminent changes in the brain—possibly presaging certain types of dementia

A close friend was an even-keeled, responsible man, endowed with a sunny outlook and a gentle, punny sense of humor. So when he started to make snide remarks at social gatherings several years ago, I secretly championed the delight he was taking in his newfound freedom from social constraints. After more than 50 years of exemplary adult behavior, he had earned the right to play court jester now and then. Or so I thought.

New research from University College London suggests that shifts in what a person finds funny can herald imminent changes in the brain. Published this month in the Journal of Alzheimer’s Disease, the study found that an altered sense of humor can predate a diagnosis of dementia by as much as 10 years. A burgeoning penchant for slapstick—over a past preference for satire or absurdist humor, for example—characterized nearly everyone who eventually developed frontotemporal dementia. (Far less common than Alzheimer’s, this illness usually hits people in their 50s and 60s.) But a changed sense of comedy affected fewer than half the people later diagnosed with Alzheimer’s disease.

“The type of change could be a signpost of the type of dementia the person is going to develop,” said Jason Warren, a neurologist at University College London who led the study. Acknowledging that humor is an unconventional way to think about neurodegenerative disease, he told me that most research in the area uses more standard assessment tools, such as memory tests, but that “memory may not be the type of thing that patients or relatives notice first.” Such warnings could be subtle changes in behavior, including humor.

After all, most forms of humor require some form of cognitive sleight-of-hand. “Getting” satire hinges on the ability to shift perspective in a nanosecond. Absurdist jokes play fast and loose with our grasp of logic and social norms; black humor lampoons taboos. All are a rich source of data about the brain.

“Humor is like a stress test,” said Dr. Warren. “The same way you’re on a treadmill to test the cardiovascular system, complex jokes are stressing the brain more than usual.”

Modest in size, the London study compared 48 patients from an outpatient dementia clinic with 21 healthy older adults. A spouse or longtime caregiver filled out a semi-structured questionnaire about what kinds of TV shows, comic writing and other media each subject preferred. Fifteen years before the study and now, how much did the person enjoy slapstick (along the lines of “The Three Stooges,” though the British study focused on U.K. entertainment only, like “Mr. Bean”), satirical comedy (“Saturday Night Live”) or absurdist comedy (“Monty Python”)?

A change in the type of comedy that people found funny turned out to be a sensitive predictor of a later diagnosis of frontotemporal dementia, though Dr. Warren cautioned that a small, retrospective study like this one is just a first step. Still to come are brain-imaging studies and a prospective look at changes in humor in people who carry genetic markers for the disease.

Their findings could well apply to me, given that dementia runs in my family. I admit, though, that I’m not used to thinking about humor this way. The quip attributed to Groucho Marx that “a clown is like an aspirin, only he works twice as fast” captures my view.

In a perfect world, laughter would be the antidote to illness, not its red flag.