Can an Entire Generation Change Its Personality?

By Susan Pinker


When the cartoon character Popeye proclaims, “I yam what I yam,” he personifies the idea that personality traits are more or less fixed in adulthood. Whether shy or outgoing, a leader or a follower, we are—notwithstanding minor tweaks—who we are.

But what if a whole generation can learn a new set of tricks?

A new study suggests as much. Published this month in the Proceedings of the National Academy of Sciences, the research shows a shift in many personality traits in men that could make them more successful in the workplace. This huge study included 80% of the male population born in Finland between 1962 and 1976, or 419,523 young men. All of them had taken standardized cognitive and personality tests when they entered the Finnish Defense Forces under a compulsory male draft at age 19 or 20.

The researchers then looked at these Finnish men’s average annual earnings at ages 30 to 34, which the scientists considered a good predictor of their lifetime earnings.

What struck researchers the most was that self-confidence, sociability and leadership motivation all rose on average from the group of men born in 1962 to those born in 1976. Striving, deliberation and dutifulness crept up too, though not as much. Levels of intelligence or family income didn’t seem to be driving these generational shifts, given that they surfaced at all cognitive levels and social strata.

And here’s the kicker: When the researchers compared personality scores when the men entered the draft with earnings at ages 30 to 34, they found that even small upward shifts in personality ratings predicted a higher income 10 years later—with the 1976 group earning as much as 12% more than its 1962 counterpart, when other factors such as inflation, overall wage rises and education were stripped away.
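For readers who want to see what “stripped away” means in practice, that kind of comparison typically comes out of a regression-style adjustment that holds the other factors fixed. Below is a minimal, purely illustrative sketch of such an analysis in Python; the data file and column names are hypothetical, not the study’s, and the authors’ actual specification may differ.

    # Illustrative sketch only: regress log earnings on draft-age personality
    # scores while controlling for birth-year cohort and education, in the
    # spirit of the Finnish analysis. File and column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("finnish_cohort.csv")  # hypothetical: one row per conscript

    model = smf.ols(
        "log_earnings ~ self_confidence + sociability + leadership_motivation"
        " + C(birth_year) + C(education_level)",
        data=df,
    ).fit()

    # Each personality coefficient estimates the earnings difference associated
    # with a one-unit rise in that trait, holding cohort and education constant.
    print(model.summary())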

“We don’t want to say that personality improves, because reasonable people can perfectly well disagree on what constitutes a good personality,” wrote Matti Sarvimäki, one of five authors of the study and an economist at Aalto University in Finland. “What we show is that the types of personality traits that predict higher earnings rise” from birth year to birth year.

Of course, we should be cautious when extrapolating from the experience of Finland, a country of about 5.5 million people, to the U.S. Nor is it clear whether similar shifts are under way in the American labor pool. In addition, because of the male-only draft, the Finnish study was limited to men.

Still, this study’s findings line up with some other positive trends. U.S. college students have become more outgoing, self-confident and self-absorbed—though that last trait may not be quite as positive as the others. Another rising measure: IQ scores are improving about three points a decade, a phenomenon known as the Flynn effect.

As with IQs, we don’t know why personality traits are changing. “It’s kind of a mystery,” says Richard Haier, an emeritus professor at the University of California, Irvine, and the author of “The Neuroscience of Intelligence.”

If extraversion is rising, he says, then—as is the case with IQ scores—improvements in general health and nutrition could be driving the change. Education could play a role too, he says—for example, more exposure to problem-solving at school. Indeed, education is one of the factors Prof. Sarvimäki will explore next.

Does Facebook Make Us Unhappy and Unhealthy?

A look at new research covering thousands of adults

By Susan Pinker


If you’re one of the almost two billion active users of Facebook, the site’s blend of gossip, news, animal videos and bragging opportunities can be irresistible. But is it good for you?

A rigorous study recently published in the American Journal of Epidemiology suggests that it isn’t. Researchers found that the more people use Facebook, the less healthy they are and the less satisfied with their lives. To put it baldly: The more times you click “like,” the worse you feel.

The study’s authors, Holly Shakya, an assistant professor of public health at the University of California, San Diego, and Nicholas Christakis, the director of the Human Nature Lab at Yale University, monitored the mental health and social lives of 5,208 adults over two years. The subjects agreed to participate in national surveys collected by the Gallup organization between 2013 and 2015 and, during that time, to share information with the researchers about their health, social lives and Facebook use.

The use of the Gallup survey at three different points let the scientists take informational snapshots of the participants’ health and social lives and chart how their feelings and behavior changed over the two years. The researchers also kept direct tabs on the subjects’ Facebook usage: how often they clicked “like,” clicked others’ posts or updated their own status.

Using standardized questionnaires, the researchers also asked about participants’ social lives: How often did they get together with friends and acquaintances in the real world, and how close did the participants feel to them? There were queries about life satisfaction, mental health and body weight, too.

The findings? Using Facebook was tightly linked to compromised social, physical and psychological health. For example, for each statistical jump away from the average (roughly, one standard deviation) in “liking” other people’s posts, clicking their links or updating one’s own status, there was a 5% to 8% increase in the likelihood that the person would later experience mental-health problems.
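An effect reported “per jump away from the average” is easiest to read as an odds ratio on a standardized predictor. Here is a rough, hypothetical sketch of how such a figure can be computed; the variable names and model are placeholders, not the study’s actual longitudinal analysis.

    # Rough sketch: standardize the Facebook-activity measure, model a later
    # mental-health problem, and exponentiate the coefficient to get an odds
    # ratio per standard deviation. Data and names are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("survey_waves.csv")  # hypothetical panel data
    df["likes_z"] = (df["likes"] - df["likes"].mean()) / df["likes"].std()

    fit = smf.logit("later_mh_problem ~ likes_z + age + baseline_health", data=df).fit()

    # An odds ratio of about 1.05 to 1.08 per standard deviation corresponds to
    # the 5% to 8% increase in likelihood described above.
    print(f"Odds ratio per 1-SD jump in liking: {np.exp(fit.params['likes_z']):.2f}")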

Responding to the study, Facebook cited an earlier paper by a company scientist and Carnegie Mellon University Prof. Robert Kraut. “The internet’s effect on your well-being depends on how you use it,” the authors wrote. Participants who received more Facebook comments than average from close friends reported a 1% to 3% uptick in satisfaction with life, mood and social support, the study reported. It also acknowledged that it’s hard to measure the emotional effect of the internet.

In the last couple of months, two other studies have cast a negative light on the social-media use of teenagers and young adults. One, of 1,787 Americans, found that social media increased feelings of isolation; the other, of 1,500 Britons, found that the websites—image-based sites in particular—exacerbated feelings of anxiety and inadequacy.

In their own study of adults on Facebook, Profs. Shakya and Christakis found a powerful link between real-world, face-to-face social contact and better psychological and physical health all around, a finding matched by dozens of previous studies. What’s astonishing about this research is that the investigators had direct access to people’s Facebook data over a two-year period. With a dynamic picture of how the participants’ activities and outlook evolved over that time, they could see whether someone who was already sad or in poor health used Facebook more often—or whether their symptoms started or worsened in tandem with their online social activities.

Still, there are some nuances to consider. Why would online social activity be so damaging to health and well-being in this study when the same activity was found to be correlated with longevity in a 2016 study co-written by Prof. Christakis? The bottom line, he says, is that replacing in-person interactions with online contact can be a threat to your mental health. “What people really need is real friendships and real interactions,” he adds.

Appeared in the May 27, 2017, print edition as ‘Does Facebook Make Us Unhappy And Unhealthy?’

‘Momnesia’? No, Pregnancy May Boost Intelligence

Research suggests that mental changes last well past delivery

By Susan Pinker


As a mother of three, I had always thought of pregnancy as a time of increased girth but decreased smarts. I wasn’t alone in thinking that my mental capacities were temporarily making way for the needs of the new arrival. Mommy Brain and Momnesia—pop terms for the sleepiness of pregnancy and the postpartum period—have branded such folk wisdom with the veneer of truth.

But that’s where the proof for Mommy Brain ends. Though many women say pregnancy makes their thinking fuzzy, an impressive body of research shows the opposite. Pregnancy and motherhood seem to make mothers smarter.

A 2014 study led by the late Craig Kinsley of the University of Richmond and published in the journal Hormones and Behavior showed that lactating mother rats beat childless rats at hunting—an asset not linked to any detectable uptick in their ability to hear, see or smell. Instead, mother rats had a fertility-related boost in mental power, making them better at providing for themselves and their young.

Thanks to work by Dr. Kinsley and his team, we also know that the pregnancy-related hormones of motherhood restructure some brain areas not typically linked to reproduction, such as the hippocampus. This seahorse-shaped brain area consolidates memories and helps us figure out how to navigate through space. Motherhood-related hormones might explain why pregnant and lactating rats beat their non-reproducing female peers at running mazes, for example.

Amazingly, these hormones can also protect mothers’ brains from injury, says Adam Franssen, an associate professor of biology at Longwood University in Virginia and a former colleague of Dr. Kinsley’s. Five years ago, Dr. Franssen led a study that exposed mother rats and childless rats to new experiences and then injected an acid into the rats’ hippocampi to create amnesia. The mother rats’ memory and problem-solving abilities rebounded more quickly, compared with female rats without offspring.

Recent evidence suggests that pregnancy induces changes in the human brain, too. In a study published a few months ago in the journal Nature Neuroscience, Elseline Hoekzema of Leiden University in the Netherlands and colleagues scanned the brains of about 80 women and men, half of them hoping to become parents. The couples who wanted to have a baby were scanned before pregnancy, then again if they got pregnant, after the baby was born and when the baby turned 2.

Women who became pregnant between the scanning sessions showed neural changes so distinct that a computer could distinguish between pregnant and nonpregnant women based on their brain scans alone. The heightened estrogen and progesterone hormones of pregnancy trimmed back some “gray matter”—the cell branches that connect neurons to each other—which has the effect of sharpening, not diminishing, mental capacities. The neural pathways that remain are streamlined and strengthened in the process.

“An analogy would be moving to a neighborhood and trying to find the best way home from work. Each day you take different routes, eventually settling on the most efficient route. You’ve pruned out the less efficient neural pathways and can travel the main route with almost no effort. The same is likely true during pregnancy and birth. The hormones of pregnancy help the brain refocus on the new priority in the mother’s life,” Dr. Franssen says.

The regions affected included ones linked to maternal bonding and memory. And the changes weren’t temporary. As Dr. Hoekzema said in an interview, they lasted for at least two years after a child was born.

Cognitive tests showed that this pruning of gray matter was associated with greater social acuity, and there was no attendant decline in the women’s intelligence. The mothers’ neural pruning affected the same areas that were activated—brain regions linked to empathy and nurturing—when they saw photos of their own babies. The change is “really about refinement and specialization…and a better recognition of emotions,” Dr. Hoekzema told me.

Add these findings to a big Swedish study released in March, showing that parenthood is linked to living longer in both sexes, and we can discredit two old saws: Children do not take years off your life, and pregnancy does not erode your smarts.

You Can’t Be Fooled by a Con? Don’t Count On It

Sobering April Fool’s news from research on deception

By Susan Pinker


A friend recently discovered a surprise on her credit-card account: nearly $20,000 in cash withdrawals, along with charges for private-jet flights and a trip to Acapulco. This was no April Fool’s joke. A young family friend, to whom she had offered moral support and a couch in her basement when he was in dire straits, had conned her.

If you think that you couldn’t be tricked so easily, you’d be wrong. Some 35 million Americans fall for scams each year, according to the Federal Trade Commission.

One reason for our susceptibility to deception is that evolution has tipped humans toward trusting and cooperating with each other for mutual advantage. “Any of us can be fooled, because con artists give us the best possible version of ourselves,” says psychologist Maria Konnikova, author of the 2016 book “The Confidence Game.”

Research shows how hard it is for us to detect a swindler. Hundreds of lie-detection studies suggest that we succeed at it less than half the time—flipping a coin would be more accurate. This may be because most of our stereotypes about liars are wrong. People don’t avoid eye contact or fidget when they’re lying, says Leanne ten Brinke, a researcher on deception at the University of Denver. A telling clue, “such as Pinocchio’s nose,” just doesn’t exist, she writes.

A forensic psychologist, Dr. ten Brinke had long suspected that people’s indirect assessments of deception might be more effective than their conscious attempts. After all, studies have found that nonhuman primates like chimpanzees can detect fakery designed to distract them from hidden food; other research shows that some people who can’t understand speech due to brain damage are better at sussing out swindlers than people with no language deficits. Could it be that the unconscious mind is better than rational focus when it comes to detecting lies?

In a study published in 2014 in the journal Psychological Science by Dr. ten Brinke and her colleagues, a total of 138 subjects participated in two experiments designed to answer this question. In the first experiment, researchers randomly set up two groups. The first was told to steal $100 from an envelope deliberately placed in the testing room and then to lie about it. A second group was told not to steal the money and to tell the truth.

The experimenters posed questions to both groups like, “Did you steal the money?” along with neutral queries about subjects like the weather. Later, when a new set of participants saw the video footage of one truth-teller paired with one liar, they could distinguish between honesty and fibs only 45% of the time—a dismal figure in line with previous research.
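To see why 45% is dismal, compare the observed hit rate with the 50% that guessing would produce. The snippet below is purely illustrative; the number of judgments is made up, not taken from the paper.

    # Illustrative only: test an observed 45% hit rate against chance (50%)
    # on a two-way liar/truth-teller judgment. Trial count is hypothetical.
    from scipy.stats import binomtest

    n_judgments = 1000
    hits = int(0.45 * n_judgments)          # 45% correct, as reported
    result = binomtest(hits, n_judgments, p=0.5)
    print(f"Accuracy {hits / n_judgments:.0%}, p-value vs. coin flip = {result.pvalue:.4f}")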

The researchers then tried to get below the mind’s surface with a second experiment. They tested whether the image of someone telling a lie—even if glimpsed for just fractions of a second—would prime the viewer to recognize notions related to dishonesty, and conversely, if a very brief flash of a truth-teller would do the same for ideas related to honesty.

The experimenters showed participants still photos from the first experiment’s videos and then asked them to categorize a set of words related to veracity. They found that viewers were able to categorize lie-connected words—such as dishonest and deceitful—more quickly and accurately when they had just glimpsed an image of someone who was lying. Similarly, after a subliminal flash of a truth-teller’s face, the participants responded more quickly and accurately to words like genuine or sincere. This suggests that they had somehow registered who was telling the truth and who was lying.

But the priming only went so far. When shown the images of people lying and telling the truth and specifically asked to judge which was which, the participants didn’t do nearly as well at the task.

The sobering April Fool’s message for the rationalists among us is that a sucker is born every minute. And that sucker is us.

A Poor Sense of Smell May Point to Brain Trouble

New research ties dementia predictions to olfactory results

By Susan Pinker


If you can smell the difference between pineapple and paint-thinner, that’s a good sign: You may be more likely to keep your marbles well into the future. So says a new study on the predictive power of our sense of smell.

In a paper published in the journal Neurology in December, neurologist Kristine Yaffe and her team discovered that older adults with a keen sense of smell are less likely than their peers to develop dementia as they age. In contrast, a blunted sense of smell, much like an altered sense of humor or sense of direction, may presage more pervasive cognitive losses in the years ahead.

Previous research had led Dr. Yaffe, who is a professor of psychiatry and neurology at the University of California, San Francisco, to suspect that people with a diminished ability to identify odors might be at increased risk of dementia. Olfactory nerve fibers project into the brain’s centers for memory and emotional processing, which suggests a connection between those cells and smell-related ones.

Dr. Yaffe’s research team followed 2,428 healthy adults in their 70s, all participants in a national, ongoing study of aging. Previous studies on the topic were often confined to white populations. This time the research team created a group that was half black, half white as well as half male, half female.

Three years into the study, the researchers assessed the participants’ sense of smell using a standardized test of 12 common scents. The test required the subjects to inhale a series of airborne chemical compounds present in foods like onion, lemon and chocolate, as well as in environmental odors, such as smoke and gasoline. After each whiff, the subjects had to identify what they had just smelled, which resulted in their “odor identification score.”

The researchers then monitored the participants’ psychological and medical status for the next nine years, using a memory test and their medication and hospitalization records. The team controlled for other factors that could increase the risk of memory loss, such as a prior smoking habit, a history of head trauma, depression or a genetic predisposition for Alzheimer’s disease. With those factors statistically stripped away, the experimenters found that on tests for sense of smell, men trailed women slightly and blacks lagged a bit behind whites.

What mattered most, though, were comparisons within the racial and gender groups. People of either race or sex who had more trouble identifying smells were indeed more likely to develop dementia—despite having no signs of cognitive decline when they signed up for the study, nor during its first three years.

Among whites, the weakest smellers (the lowest third of the group) had three times the dementia risk of the highest third. Among black participants, the weakest group had twice the risk. Subtypes of dementia may explain the difference. Smell is a better early predictor of Alzheimer’s. Blacks, Dr. Yaffe said, have a greater risk of developing vascular dementia, which causes diffuse neural deficits, than Alzheimer’s, which leaves more tangles and plaques near the olfactory nerve.
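The tertile comparison itself is simple to express: split participants into thirds by odor score within each racial group, then compare dementia incidence in the lowest third with the highest. The schematic sketch below uses hypothetical data and column names and leaves out the covariate adjustments described above.

    # Schematic sketch of the within-group tertile comparison; data, column
    # names and the study's covariate adjustments are not included here.
    import pandas as pd

    df = pd.read_csv("smell_cohort.csv")  # hypothetical: odor_score, dementia (0/1), race

    for race, group in df.groupby("race"):
        group = group.assign(
            tertile=pd.qcut(group["odor_score"], 3, labels=["low", "middle", "high"])
        )
        incidence = group.groupby("tertile", observed=True)["dementia"].mean()
        ratio = incidence["low"] / incidence["high"]
        print(f"{race}: dementia risk, weakest vs. keenest smellers = {ratio:.1f}x")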

“What’s happening,” she added, “is that abnormal proteins are building up over decades, and some of the early changes start in the olfactory bulb, the brain structure that receives neural information about odors. When you think about how our brains evolved, it’s not a shocker that olfaction, considered an older part of the brain, would reflect degenerative processes first.”

In all, 491 of the participants developed dementia by the end of the study’s 12 years.

That a compromised sense of smell may be a biomarker for dementia is both good news and bad news. An early warning signal has limited use, since no drug yet exists to head off Alzheimer’s (though several medications are being developed that target its symptoms). Still, those who can’t smell shouldn’t panic. Researchers emphasize that a mildly dialed-down sense of smell and taste—much like mild hearing loss—is just a feature of aging.

 

Watching Terror and Other Traumas Can Deeply Hurt Teenagers

The focus: web exposure to the Boston Marathon bombing

By Susan Pinker


Several weeks into the Gulf War, in early 1991, when I was working as a clinical psychologist in Montreal, pediatricians started referring children with mysterious behavior problems. A few 8- to 10-year-olds had started to wet their beds for the first time since they were toddlers. Others were refusing to go to school. Their first interviews revealed a common thread: Though strangers to each other, the children and their families were all visitors or immigrants from Israel, and they had been watching Scud missile attacks against Israel on the news every day.

These children were living more than 5,000 miles from the bombing, but observing the devastation from afar—and their parents’ reactions to it—had elicited an emotional response that skewed their habits. New research is now helping us to understand how second-hand exposure to disasters can change our brains and behavior, and who is most at risk.

“We used to think of it as a bull’s-eye. The closer you were to the trauma, the more likely you were to show symptoms,” said Jonathan Comer, a professor of psychology at Florida International University who investigates the impact of terrorist attacks on children and families. “But a summary of post-9/11 reactions really challenged that idea.” Those closest to the event didn’t necessarily show the most trauma, said Prof. Comer.

Prof. Comer’s own research upends our assumptions about who is most prone to feel psychological fallout after a terrorist attack. In a 2016 study of the Boston Marathon bombing published last summer in Evidence-Based Practice in Child and Adolescent Mental Health, the research team surveyed nearly 500 parents of 4-to-19-year-olds about their children’s internet exposure during the bombing and the manhunt that followed. The surveyed families all lived within 25 miles of either the bombing site itself or Watertown, Mass., where the police had ordered the local population to shelter in place during a search for suspects.

The findings? As one would expect, exposure to graphic online content rose with age. Over 23% of children saw internet images of the bombing, and 68% viewed footage of police with military-grade weapons searching their neighborhoods. The average was two to three hours of daily online activity per child. While those under age 8 were less exposed, three-quarters of children over 12 spent up to 6 hours daily viewing online news and social media coverage of the crisis.

Teenagers were the most prone of all age groups to experience psychological trauma after the bombing. The researchers found that the more internet news and social media contact they had about the bombing and manhunt, the more severe were their PTSD symptoms, which ranged from intrusive flashbacks to emotional numbing. What’s more, even though 87% of parents believed that online exposure to the crisis could be damaging, very few restricted their children’s access to it. And what parents thought was private—their own fear—turned out to be contagious, too, as shown in a 2014 study of parents’ reactions to the marathon bombing by Prof. Comer’s team.

To understand how observational distress works, a team led by Alexei Morozov at the Virginia Tech Carilion Research Institute looked at what happens when mice watch a sibling experience fear. The team first learned that vicarious fear leaves a neural trace that predisposes animals to experience trauma later on. A study by Dr. Morozov’s team, published last month in Neuropsychopharmacology, found that vicarious trauma changes the connections between the amygdala (roughly, our emotion and survival center) and the prefrontal cortex (the planning and decision-making area). “After the animal has the experience of the other’s pain, it allows excitation to be more robust…it reduces the ability of the prefrontal cortex to act rationally,” he told me.

That’s one reason why “watching around-the-clock breaking news is not in our best interest—either for adults or children,” said Prof. Comer. Because it’s not how much danger we’re in that matters. It’s how much threat we perceive in others.

What You Just Forgot May Be ‘Sleeping’

Can’t remember what you were just thinking about? A new study amends our understanding of how memory works

By Susan Pinker


If you’ve ever forgotten why you just entered a room, you know how fickle memory can be. One moment it’s obvious why you walked down the hall, and the next moment you’re standing there befuddled.

Here today, gone in a millisecond. At least that’s how we used to think about short-term, or working, memory. But a study just published in the journal Science tells a different story. A recent idea or word that you’re trying to recall has not, in fact, gone AWOL, as we previously thought. According to new brain-decoding techniques, it’s just sleeping.

“Earlier experiments show that a neural representation of a word disappeared,” said the study’s lead author, Brad Postle, a professor of psychology and psychiatry at the University of Wisconsin-Madison. But by using a trio of cutting-edge techniques, Dr. Postle and his team have revealed just where the neural trace of that word is held until it can be cued up again.

Their study amends the long-standing view of how memory works. Until now, psychologists thought that short-term memory evaporates when you stop thinking about something, while long-term memory permanently rewires neural connections. The new research reveals a neural signature for a third type of memory: behind-the-scenes thoughts that are warehoused in the brain.

In the study’s four experiments, a total of 65 students viewed a pair of images—some combination of a word, a face or a cloud of moving dots—on a screen. During a 10-second period, the students were prompted to think about one of the two images they had seen. After a brief delay, they had to confirm whether a picture they saw matched one of the first two images.

Throughout the experiment, the Postle team monitored the students’ pattern of neural activity. When they prompted the students to remember a particular image, a unique 3-D display of neural activity corresponding to that idea popped up. What really interested the experimenters, though, was what the brain was doing with the image it had effectively set aside. Where did that memory go while it was waiting in the wings?

To find out, the team used a new technique to see what happened when the participants were warned that they would be tested on the set-aside image. This novel approach created a dynamic 3-D display of electroencephalogram (brain wave) and brain-imaging data that let the researchers see beyond what part of the brain “lights up” and zoom in on the pattern of activity within a region. That’s how the team learned that the students’ brain activity had indeed shifted to the “on hold” image’s distinctive pattern—which until then had been invisible.
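The decoding step rests on multivariate pattern analysis: train a classifier to tell which remembered item a pattern of brain activity encodes, then test it on held-out trials. The toy example below uses simulated data and scikit-learn to show the general technique; it is not the Postle lab’s actual EEG pipeline.

    # Toy multivariate pattern decoding on simulated "EEG" features.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_features = 200, 64
    labels = rng.integers(0, 2, n_trials)               # 0 = word trial, 1 = face trial
    patterns = rng.normal(size=(n_trials, n_features))  # simulated activity patterns
    patterns[labels == 1, :8] += 0.7                    # plant a faint class signature

    decoder = LogisticRegression(max_iter=1000)
    accuracy = cross_val_score(decoder, patterns, labels, cv=5).mean()
    print(f"Cross-validated decoding accuracy: {accuracy:.0%}")  # well above 50% means the item is decodable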

To confirm that the memory still existed even while a person was not thinking about it, the scientists used another recent technique, transcranial magnetic stimulation, or TMS. They positioned a wand over a participant’s scalp and delivered a harmless magnetic pulse to the brain areas that held the images. The pulse made the distinctive neural signature of those fleeting memories visible to the scientists and triggered their recall in the students.

Dr. Postle compared working memory to paper inscribed with invisible ink. Words written in lemon juice are initially imperceptible, but by passing a hot cup of coffee over the paper, “you can see the part of the message that was heated up…. Our TMS is like the coffee cup.” In this way the team activated a memory that was not only temporary but below the student’s level of consciousness.

Using Dr. Postle’s new trifecta of brain-imaging and brain-stimulation techniques to reactivate forgotten memories has enticing—though still remote—therapeutic possibilities. It is neuroscience’s most faithful reading yet of the real-time content of our thoughts—about as close as we have ever come to mind-reading.

“Our study suggests that there’s information in the penumbra of our awareness. We are not aware that it’s there, but it’s potentially accessible,” said Dr. Postle.

Why Lessons From Chimp Mothers Last a Lifetime

A grooming study suggests the powerful influence of moms (human ones, too)

By Susan Pinker


My mother taught me how to cook, learn my times tables and read with a critical eye—and, in the social realm, how to mind my manners and reach out to others.

But maternal mentors are hardly exclusive to humans. Vervet-monkey moms show their infants their own way to clean off fruit, while chimpanzee mothers teach their toddlers just which stick is best for termite-fishing and how to use rocks to crack nuts. Some bottlenose-dolphin mothers show their young how to find sponges to protect their sensitive snouts while scouring the sea floor for treats—a safety measure that resonated with the mother in me.

New evidence shows that some mammal mothers—specifically, chimps—also transmit to offspring their unique style of socializing. (In contrast, nonhuman primate fathers rarely get involved in teaching children.) The new research focuses on grooming, a primary feature of an ape’s social life.

Chimp grooming does double duty: Picking through another animal’s fur controls parasites and also establishes a trusting social bond. Grooming can be a relaxing pastime, a come-on or a way to forge alliances. As with human social behavior, chimps do it in different ways.

In what primatologists call “high-arm grooming,” two chimps groom each other face-to-face, each with one arm raised in the air. Using their opposite hands to comb through each other’s fur, the pair might be clasping their raised hands aloft or leaning their forearms together overhead. Either way, chimps of both sexes practice this unique grooming style well into adulthood.

While male chimps stay where they are born all their lives, most females migrate to new groups at adolescence. Wherever they end up, the ones from the high-arm, hand-holding community then teach this endearing posture to their own progeny. “It’s a social custom inherited from mother to offspring and not known in any primate before,” said Richard Wrangham, a professor of biological anthropology at Harvard. He and his team observed this unusual behavior among wild chimps living in the Kanyawara community of Uganda’s Kibale National Park.

As described last month in the journal Current Biology, the researchers analyzed 932 photos of high-arm grooming among 36 wild chimps, half of them female. The team had expected that when the adolescent female chimps immigrated to a new community, they would conform to its customs. Teenagers like to fit in, after all.

But a close look at which chimps held hands showed that what mattered is how mom did it. “We’ve got individuals up to 40 years old who are following the maternal pattern,” Dr. Wrangham told me, adding that even after the mother dies, her offspring keep doing things her way.

Intriguingly, hanging out with other family members or peers made no difference in grooming style—regardless of how close the relationship or how much time the animals spent together. That makes sense, given that primate mothers are crucial to the education and survival of their offspring. In a 2006 study of wild chimps in Tanzania’s Gombe National Park, the amount of time spent watching mom fish for termites correlated with how skilled 6-year-old chimps became at that task. Female children copied their mothers more faithfully and so became more efficient learners.

Human mothers also have a uniquely powerful effect on their children’s behavior. As mammals and primates, they take time to coach their young ones, who then copy what they do. I’m not discounting the importance of fathers, but it looks like we belong to a large evolutionary family that learns enduring lessons at our mothers’ feet.

Empathy by the Book: How Fiction Affects Behavior

Not all genres have the same effect, research shows

By Susan Pinker


When I want to escape I pick up a good novel. But does this habit provide more than a quick getaway?

We’ve long known about the collateral benefits of habitual reading—a richer vocabulary, for example. But that’s only part of the picture. Mounting evidence over the past decade suggests that the mental calisthenics required to live inside a fictional character’s skin foster empathy for the people you meet day-to-day.

In 2006, a study led by University of Toronto psychologists Keith Oatley and Raymond Mar connected fiction-reading with increased sensitivity to others. To measure how much text the readers had seen in their lifetimes, the participants took an author-recognition test—a typical measure for this type of study. “The more fiction people read, the better they empathized,” was how Dr. Oatley summarized the findings. The effect didn’t hold for nonfiction.

Still, no one knew whether reading fiction fostered empathy or empathy fostered an interest in fiction. Other factors could have been at play too, like personality.

So, in 2009, part of the Oatley-Mar team involved in the 2006 study reproduced it with a sample of 252 adults—this time controlling for age, gender, IQ, English fluency, stress, loneliness and personality type. The researchers also assessed participants’ “tendency to be transported by a narrative”—the sense that you’re experiencing a story from within, not watching it as an outsider.

Finally, participants took an objective test of empathy, called the Reading the Mind in the Eyes Test. The aim of all of this was to see how long-term exposure to fiction influenced their ability to intuit the emotions and intentions of people in the real world.

The results? Once competing variables were statistically stripped away, fiction reading predicted higher levels of empathy. Such readers also lived large in the flesh-and-blood social sphere, with richer networks of people to provide entertainment and support than people who read less fiction. This finding put to rest the stereotype of bookworms as social misfits who use fictional characters as avatars for real friends and romantic partners.
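“Statistically stripped away” amounts to a partial correlation: remove the part of both empathy and fiction exposure that the competing variables can explain, then see how strongly the leftovers are related. A minimal sketch with hypothetical variable names, not the researchers’ code:

    # Minimal sketch of "controlling for" competing variables via residualization
    # (equivalent to a partial correlation). Data and column names are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("readers.csv")  # hypothetical survey of adult readers
    controls = "age + gender + iq + english_fluency + stress + loneliness + openness"

    empathy_resid = smf.ols(f"eyes_test ~ {controls}", data=df).fit().resid
    fiction_resid = smf.ols(f"fiction_exposure ~ {controls}", data=df).fit().resid

    # Whatever correlation remains is fiction exposure's association with
    # empathy over and above the controls.
    print(np.corrcoef(empathy_resid, fiction_resid)[0, 1])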

Later studies confirmed that reading fiction does cause a spike in the ability to detect and understand other people’s emotions—at least in the short term. In a series of experiments published in 2013 in Science, social psychologist Emanuele Castano and David Comer Kidd of the New School for Social Research tried to figure out whether the type of fiction mattered.

The researchers handed subjects—in groups ranging in size from 69 to 356—different types of genre fiction, literary fiction or nonfiction, or nothing to read at all. They then assessed participants on several measures of empathy. Nonfiction—along with horror, sci-fi or romance novels—had little effect on the capacity to detect others’ feelings and thoughts. Only literary fiction, which requires readers to work at guessing characters’ motivations from subtle cues, fostered empathy.

In these studies, the reading of nonfiction not only failed to spur empathy but also predicted loneliness and social isolation, especially among men. Of course, nonfiction reading has its virtues. Other research suggests that various kinds of nonfiction can prompt empathetic feelings—as long as the narrative is moving and transformative.

In recent studies, neuroscientist Paul Zak at Claremont Graduate University and colleagues showed participants heartfelt stories, such as a video narrated by a father of a toddler with brain cancer. The video induced a spike in observers’ levels of oxytocin—a hormone that promotes trust, nurturing and empathy—and larger donations to charity. Watching a straightforward travelogue-type video of the same father and son visiting the zoo didn’t have that effect.

Apparently, what matters is not whether a story is true. Instead, as Dr. Oatley says, “If you’re enclosed in the bubble of your own life, can you imagine the lives of others?”

 

 

When We Display Our Piety, Our Social Stock Rises

People perceive signs of religious observance in others as a measure of dependability, new research shows

By Susan Pinker


One of the many unusual aspects of this presidential campaign has been how little the candidates have discussed religion. Compare this with two previous presidential contenders, among many others who publicly affirmed their faith. When asked in 2000 to name his favorite political thinker, George W. Bush replied, “Christ, because He changed my heart,” while a 1980 ad for Jimmy Carter’s unsuccessful re-election campaign intoned, “He takes the time to pray privately and with Rosalynn each day.”

Perhaps one reason for the change is that “none” is the fastest-growing major religious affiliation in America, as a Pew Research Center survey showed last year. Given this shifting terrain, does being visibly devout still signal that you can be trusted?

Surprisingly, the answer is yes. People perceive signs of religious observance in others as a measure of dependability, new research shows. Whether one fasts on Yom Kippur, wears a cross of ash for Lent or places a red dot in the middle of one’s forehead, such religious “badges” do more than just signal that you belong to a particular group. Other people see these displays as a shorthand for reliability.

In four experiments published last year in the journal Psychology of Religion and Spirituality, the anthropologist Richard Sosis of the University of Connecticut and his colleagues altered one-fifth of a set of facial photographs so that the people in them appeared to be wearing a cross around their necks or a cross of ash on their foreheads. The experiments were conducted between Ash Wednesday and Easter. The researchers interspersed these altered images with photos of people without any religious adornments.

Several hundred university students of varying backgrounds then examined the stack of photos, rating each of the faces for trustworthiness. The students also played an economic game during which they entrusted money to players whom they deemed honorable.

The researchers were surprised to discover that a person wearing Christian religious symbols prompted powerful feelings of trust, not only among fellow Christians but also among secular students and members of other religions. The presence of a cross doubled the money that non-Christians were willing to offer someone in the trust game, while the Ash Wednesday cross increased their investment by 38.5%.
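Read as arithmetic, those effect sizes are multipliers on whatever a non-Christian would otherwise have offered. A trivial illustration with a made-up baseline amount:

    # Made-up baseline; only the multipliers come from the reported results.
    baseline_offer = 10.0                 # hypothetical amount entrusted to an unmarked face
    cross_offer = baseline_offer * 2.0    # "doubled the money"
    ash_offer = baseline_offer * 1.385    # "increased their investment by 38.5%"
    print(cross_offer, ash_offer)         # 20.0 13.85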

Other recent studies show that the effect is the same for any religious practice that imposes a cost on the appearance, comfort or finances of believers, or that restricts their diet or sexual behavior. Whether they are Muslims, Jews or Hindus, such displays of devotion burnish the reputations of the observant.

In a fascinating study of Hindu and Christian villagers in South India, published this year in the journal Evolution and Human Behavior, the anthropologist Eleanor Power found that local religious rituals greatly enhanced a participant’s standing in the community. For both Catholics and Hindus, these included onerous pilgrimages; Hindu rituals included walking across burning coals and being suspended by hooks in one’s skin. Participation in these widely accepted demonstrations of devotion also predicted which individuals had pivotal roles in local social networks.

“People are more likely to go to you for support if you undertake such religious acts,” said Dr. Power of the nonprofit Santa Fe Institute. She had access to temple records showing who paid religious fees and joined pilgrimages and processions, along with villagers’ evaluations of their peers and their status in the community. “People will rate you as having a good work ethic, giving good advice and being more generous if you worship regularly and do firewalking or other costly acts,” Dr. Power told me.

Such religious displays make us more likely to turn to these people for leadership. Today’s presidential contenders would perhaps benefit from a greater show of reverence. The harder they work to convey that they believe in something greater than themselves, the more credible they will be to voters.