Why Looks Count in Politics

In elections, appearance counts more than most of us care to admit. It all comes down to the brain’s orbitofrontal cortex

See the column on the Wall Street Journal site


Those who watched the New Hampshire Republican debate surely made a mental note of aspects of the candidates’ appearances as they staked out their turf: Ted Cruz’s jutting chin, Ben Carson’s avuncular temples, Donald Trump’s blond, blond hair, Marco Rubio’s full cheeks and boyish manner.

It’s all window-dressing, you might say; only the candidates’ positions matter. But in politics, looks count more than we care to admit.

Indeed, we often clinch our political decisions a split-second after we see a candidate and don’t change them much over time. In a 2006 study published in Psychological Science and led by Alexander Todorov of Princeton University, subjects who selected favorites after a brief glance at snapshots of unfamiliar candidates were able to predict who would win nearly 70% of the 2004 Senate and House races. When the researchers gave people more time to decide, they simply confirmed their first impressions.

In another study, published in 2008 in the Proceedings of the National Academy of Sciences, Prof. Todorov and colleague Nikolaas Oosterhof digitally manipulated people’s facial features in photos, revealing just what makes us fall so hard for a candidate. They showed that a rounded, baby-faced appearance with prominent cheekbones, arched inner eyebrows and a sunny demeanor makes a person seem trustworthy.

Such snap judgments also can skew life-or-death decisions, according to a new study in Social Psychological & Personality Science. Assessments of the trustworthiness of convicted murderers based on their facial features aligned well with how they were sentenced. The sense that a prisoner was trustworthy was a good predictor of whether he got life in prison or the death sentence.

Not only do we make critical inferences about people based on their appearance, but another study, published last June in the Journal of Neuroscience, suggests that the ability to assimilate more than one piece of information about them hinges on having a healthy brain—a healthy orbitofrontal cortex, to be precise. Situated right behind the eyeballs on the floor of the skull, this area of the brain is central to social decision-making and impulse control—and to how we make political choices, according to the study.

Lesley Fellows and her team at the Montreal Neurological Institute and McGill University investigated what happens when the orbitofrontal cortex is badly damaged. How might that affect a person’s first impressions of a candidate? The researchers found seven people who had lost use of this part of their brain due to an aneurysm or tumor surgery but whose other cognitive abilities were intact. The study also included a group of 18 people with frontal lobe damage but no harm to the orbitofrontal cortex area. Another control group included matched, healthy subjects.

The researchers asked participants to look at pairs of head shots of political candidates. The subjects didn’t know the candidates and didn’t have any information about them. They first “voted” for one from each pair based on the photos. Then they rated each candidate on two traits: perceived attractiveness and competence (the latter a guess based on appearance).

The votes of the healthy participants and of the 18 people whose brain injuries did not include the orbitofrontal area included considerations of both attractiveness and competence—that is, they might vote for someone whom they judged to be competent even if that candidate wasn’t rated the most attractive. But the votes of the seven people with orbitofrontal damage matched one factor only: the candidates’ appearance. They were most likely to vote for whoever they deemed most attractive.

Even those of us fortunate enough not to have brain damage often can’t explain why we like who we like. “Eventually the brain gets overwhelmed with so many factors,” Dr. Fellows told me. “When it gets to be too much, people just simplify.”

Children’s Lies Are a Sign of Cognitive Progress

Research shows that kids’ ability to bend the truth is a developmental milestone, much like walking and talking



Child-rearing trends might seem to blow with the wind, but most adults would agree that preschool children who have learned to talk shouldn’t lie. Yet learning to lie, it turns out, is an important part of learning in general—and something to consider apart from fibbing’s ethical implications.

The ability to bend the truth is a developmental milestone, much like walking and talking. Research led by Kang Lee, a psychology professor at the University of Toronto, shows that lying begins early in precocious children. Among verbal 2-year-olds, 30% try to pull the wool over their parents’ eyes at some point. At age 3, 50% regularly try it. Fibbing is common among 80% of 4-year-olds and is seen in nearly all healthy 5- to 7-year-olds.

In other words, lying is nothing unusual in small children. What’s more, younger children who tell tales have a cognitive advantage over the truth-tellers, Dr. Lee said. “Lying requires two ingredients. Children need to understand what’s in someone else’s mind—to know what they know and what they don’t know. We call this ability theory of mind. The children who are better at theory of mind are also better at lying.”

The second requirement, according to Dr. Lee, is executive function—the power to plan ahead and curb unwanted actions. “The 30% of the under-3s who can lie have higher executive function abilities,” he said, “specifically the ability to inhibit the urge to tell the truth and to switch to lying.”

Such cognitive sophistication means that these early liars will be more successful in school and in their dealings with other kids on the playground, he added.

Though Dr. Lee had known for decades that children who excel at theory-of-mind tasks are better liars, he didn’t know which came first. Does lying make children better at guessing what other people are thinking? After all, trying half-truths on for size would elicit feedback from adults that would reveal something about their mental states. Or is it that if you teach people to imagine what’s going on in others’ minds, they then become better fabricators? He tested that notion in an experiment that he published in the journal Psychological Science last November.

Theory-of-mind training has become a popular tool for helping children on the autistic spectrum as well as those with behavioral problems. The training walks children through situations that help them to discover that other people could have knowledge or beliefs different from their own. In Dr. Lee’s lab the children are also read stories rich in information about people’s mental states. “So we asked, what are the side effects? Can we induce lying by training theory of mind?” Dr. Lee said.

He and a team of researchers from Canada, the U.S. and China divided a group of 58 preschoolers from a city in mainland China into two groups after testing them for such things as intelligence, lying ability and executive function. Half of the children received six sessions of theory-of-mind training and the other half received an equal number of sessions devoted to teaching number and spatial problem-solving skills.

After the six sessions, spread over eight weeks, the researchers found that the children in the theory-of-mind group had not only become better liars but were also significantly better at lying than the control-group children. The effects lasted a month; Dr. Lee intends to follow up to see whether they persist.

“The first occasion of your child telling a lie is not an occasion to be alarmed but an occasion for celebration. It’s a teachable moment,” he told me, “a time to discuss what is a lie, what is the truth and what are the implications for other people.”

When Does Gratitude Bring Better Health?

During the holiday season, gifts, cards, carols and donations constantly urge us to give thanks. But gratitude really can have beneficial psychological effects.



Gratitude is one of those tricky, hard-to-pin-down feelings that can be either inert or powerfully transformative. At this time of year, when we’re constantly importuned to give thanks with gifts, cards, carols and donations, it often becomes a reflex. So when does gratitude have psychological effects?

That question didn’t get much scientific scrutiny until recently. Only in the past decade has there been a push to determine if gratitude “decreases pain and depression, and boosts happiness,” as a recent study in Primary Health Care Research & Development put it. The researchers found that an act of explicitly expressing gratitude lifted people’s mood and sense of well-being.

Bolstering this finding, other targeted studies have shown that health-care workers who cataloged why they were grateful experienced a 28% reduction in stress, and that writing about gratitude halved the risk of depression in those with a history of the disease.

Some research results seem almost too good to be true. Simply asking suicidal patients to write a letter of gratitude reduced their hopelessness in 90% of the cases. Among fit teenage athletes, those with high levels of gratitude were more satisfied with life in general and with their teams in particular.

Counting one’s blessings, as opposed to life’s annoyances, seems to bring with it all kinds of benefits: resilience, better health, a rosier outlook—even a longer, more restful night’s sleep and a sense of connectedness to other people.

Changing how we feel is one thing, but changing behavior is another. “It’s not the hardest sell in the world that emotions could make you feel more optimistic about your life. But I grow skeptical about observable effects in the body,” Michael McCullough told me. A psychology professor at the University of Miami who investigates emotions like forgiveness, revenge and gratitude, Dr. McCullough wonders whether feeling grateful actually alters our health or whether it works by motivating us to change our behavior—to quit smoking or drinking, for example.

He found both mood and behavior changes in an experiment he did with Robert Emmons, a colleague at the University of California, Davis. In the study, reported in 2003 in the Journal of Personality and Social Psychology, prompting people to list five things they were grateful for several times a week not only brought an uptick in mood but also resulted in subjects devoting more time to exercise and to helping others.

“Gratitude motivates people into trying to give back,” Dr. McCullough said, “and the research is really good that volunteering is good for health. Emotional state to social contact to feeding back into health behavior—it all makes sense.”

Whether the feeling or the behavior comes first, we do know that gratitude is tied to conscientiousness. Grateful people eat 25% fewer fatty foods and have better blood pressure readings than ungrateful folks. And a new, still unpublished study shows that feeling thankful is linked to lower hemoglobin A1c, a sign of good blood-glucose management and thus better diabetes control.

In fact, gratitude is such a powerful catalyst for feeling hale and hearty, it’s a wonder that no one (except greeting-card companies and religious leaders) has found a way to package it.

Which brings us back to the holidays. This time of year most of us steel ourselves for bland turkey, rich desserts and loud relatives and forget what is remarkable in our lives. We’d probably all be better off—and enjoy the holidays more—if we followed the lead of gratitude researchers and took a few minutes to list exactly what we’re grateful for. Better still, think it over, then say it out loud so that everyone can hear it.

Changes in Sense of Humor May Presage Dementia

New research suggests that shifts in what a person finds funny can herald imminent changes in the brain—possibly presaging certain types of dementia

A close friend was an even-keeled, responsible man, endowed with a sunny outlook and a gentle, punny sense of humor. So when he started to make snide remarks at social gatherings several years ago, I secretly championed the delight he was taking in his newfound freedom from social constraints. After more than 50 years of exemplary adult behavior, he had earned the right to play court jester now and then. Or so I thought.

New research from University College London suggests that shifts in what a person finds funny can herald imminent changes in the brain. Published this month in the Journal of Alzheimer’s Disease, the study found that an altered sense of humor can predate a diagnosis of dementia by as much as 10 years. A burgeoning penchant for slapstick—over a past preference for satire or absurdist humor, for example—characterized nearly everyone who eventually developed frontotemporal dementia. (Far less common than Alzheimer’s, this illness usually hits people in their 50s and 60s.) But a changed sense of comedy affected fewer than half of the people later diagnosed with Alzheimer’s disease.

“The type of change could be a signpost of the type of dementia the person is going to develop,” said Jason Warren, a neurologist at University College London who led the study. Acknowledging that humor is an unconventional way to think about neurodegenerative disease, he told me that most research in the area uses more standard assessment tools, such as memory tests, but that “memory may not be the type of thing that patients or relatives notice first.” Such warnings could be subtle changes in behavior, including humor.

After all, most forms of humor require some form of cognitive sleight-of-hand. “Getting” satire hinges on the ability to shift perspective in a nanosecond. Absurdist jokes play fast and loose with our grasp of logic and social norms; black humor lampoons taboos. All are a rich source of data about the brain.

“Humor is like a stress test,” said Dr. Warren. “The same way you’re on a treadmill to test the cardiovascular system, complex jokes are stressing the brain more than usual.”

Modest in size, the London study compared 48 patients from an outpatient dementia clinic with 21 healthy older adults. A spouse or longtime caregiver filled out a semi-structured questionnaire about the TV shows, comic writing and other media each subject preferred, both now and 15 years before the study. How much did the person enjoy slapstick (along the lines of “The Three Stooges,” though the study itself drew only on U.K. entertainment, such as “Mr. Bean”), satirical comedy (“Saturday Night Live”) or absurdist comedy (“Monty Python”)?

A change in the type of comedy that people found funny turned out to be a sensitive predictor of a later diagnosis of frontotemporal dementia, though Dr. Warren cautioned that a small, retrospective study like this one is just a first step. Still to come are brain-imaging studies and a prospective look at changes in humor in people who carry genetic markers for the disease.

Their findings could well apply to me, given that dementia runs in my family. I admit, though, that I’m not used to thinking about humor this way. The quip attributed to Groucho Marx that “a clown is like an aspirin, only he works twice as fast” captures my view.

In a perfect world, laughter would be the antidote to illness, not its red flag.

For Babies, Copy-Cat Games Provide a Social Compass

Researchers begin to understand infants’ imitations

In 2014 the neuroscientist John O’Keefe won a Nobel Prize for showing that mammals have “place cells” in their brains—an inner mapping system that helps them find their way in the physical world. Now it seems that a particular group of mammals—humans—also have a mapping ability that lets them see themselves in relation to others, thus helping them to navigate in the social world.

It all starts with imitation. In the early 1980s, Andrew Meltzoff, now co-director of the Institute for Learning & Brain Sciences at the University of Washington, found that human babies have an inborn talent for mimicry. Infants just a few days old imitated adults’ facial expressions—and even their finger movements—with surprising accuracy. When Dr. Meltzoff stuck out his tongue or pursed his lips, the newborns did, too.

It’s a handy trick to know that your body and someone else’s have matching moving parts, especially if you’re a newborn who has never looked in a mirror before. But how and why would the human brain be wired to copy what other people do?

Thirty-odd years later, some answers are surfacing. With instruments now available to measure the electrical activity in infants’ brains, Dr. Meltzoff and his colleagues have found that babies have interactive neural maps that match their own bodily sensations to their observations of other people’s movements.

In a paper in the September issue of Trends in Cognitive Sciences, Dr. Meltzoff and Peter Marshall, a professor of psychology at Temple University, reported that when a 14-month-old baby’s hand was touched, the same region of her brain lighted up as when she saw an adult use his hand to touch something. When the baby watched an adult nudge an object with his foot, there was electrical activity in the region of the brain corresponding to the baby’s perception of her own foot being touched. That study tested 44 infants, all 14 months old, but even much younger babies register a similarity between their own bodies and other people’s, Dr. Meltzoff told me, adding that this recognition may be one root of empathy.

“I think babies are born socially connected,” Dr. Meltzoff said. “They see you moving your hand in the hospital room, and they think ‘I have one of those, and I can move it too!’ It’s an aha! moment. The baby is realizing that he felt that movement in the womb, and this is what it looks like. This starts at birth and flowers as mothers play mutual imitation games with the baby. Parents unconsciously imitate and reflect the babies’ behavior back to them because the babies enjoy it so much. And now we know why.”

Unwittingly copying someone’s gestures, expressions or mannerisms is called “the chameleon effect.” Research published in the Journal of Experimental Social Psychology and elsewhere shows that adults who practice such mimicry enjoy bigger tips if they work in the service industry and a larger paycheck if they’re engaged in salary negotiations.

Humans clearly have a built-in penchant for synchronizing their actions with those of others. Previous studies by Dr. Meltzoff and colleagues have shown that babies fix their gaze on adults who imitate their actions precisely but are far less interested in adults who react but don’t copy them. Using electroencephalogram technology, the researchers have also shown that imitating babies exactly during turn-taking games elicits a distinctive pattern of electrical activity in the babies’ brains.

Even if there’s no proof yet of “body cells” that orient us toward others the way “place cells” orient us in space, Dr. Meltzoff’s work demonstrates that, as he says, “even a young baby recognizes the distinction between his body and yours, but that the pattern of movements is the same. And that’s the fundamental way he learns to be ‘one of us.’”

Less Pain, Less Joy: A New Look at Acetaminophen

The drug, found in Tylenol, is an all-purpose damper, a study finds

Consider this trade-off the next time you have a headache: Would you take a medicine that didn’t just ease the pain but muffled your happiness too?

A recent study suggests that acetaminophen—found in Tylenol, Excedrin and a host of other medications—is an all-purpose damper, stifling a range of strong feelings. Throbbing pain, the sting of rejection, paralyzing indecision—along with euphoria and delight—all appear to be taken down a notch by the drug.

For most people, this over-the-counter palliative doesn’t demand much thought: Take the right dose and the pain goes away. But it may not be that simple.

In 2010, the psychologists Naomi Eisenberger and Nathan DeWall discovered that a three-week course of acetaminophen soothed social pain, like feelings of exclusion or ridicule. The drug also assuaged the agony of indecision, Dr. DeWall found earlier this year.

Building on this research, a new study, published in June in the journal Psychological Science, shows that acetaminophen affects not just how we perceive physical and psychological pain but how the brain processes strong feelings in general. Though the study was small and limited to college students as subjects, the researchers designed it to meet the high standards of pharmaceutical testing and were able to replicate it.

Led by Ohio State University doctoral candidate Geoffrey Durso, the study compared reactions to 40 photos. Some were run-of-the-mill, some pleasant, others shockingly aversive—including images of fighting in a ravaged city and malnourished children.

A photo can elicit gut-wrenching emotions—or enchant and captivate us. The researchers’ goal was to test the painkiller’s effects on such reactions. Half of the 85 subjects took 1,000 milligrams of acetaminophen—a standard “extra strength” dose. The rest took a look-alike placebo. Neither the participants nor the researchers knew who had taken what. After allowing time for the medication to take effect, the researchers asked participants to rate the 40 photos using a standardized test.

“Compared to the placebo, acetaminophen blunted the extremity of their reactions,” said Mr. Durso. And the more intense the emotions, the more acetaminophen muted them. How much did the painkiller dial down the participants’ reactions? “For extremely pleasant stimuli, acetaminophen blunted their emotions by 20%,” Mr. Durso said, and it muted reactions to extremely unpleasant photos by 10%.

If acetaminophen muffles all kinds of emotional experience, many of our assumptions about mind-body distinctions and how to treat different types of distress may be wrong. “It’s long been thought that positive emotions are one system and negative emotions are another,” said psychologist Baldwin Way of Ohio State, one of the study’s authors. “But if acetaminophen blunts both positive and negative emotions, it’s probably working through the same pathways.”

Acetaminophen works like a built-in volume control in the brain, altering the neural circuits that govern our emotional responses in general. Whereas ibuprofen and aspirin inhibit pain by acting right at the site of inflammation, acetaminophen, Prof. Way said, acts globally, modifying our reactions to the incoming pain signal. “If you take a painkiller before a run, ibuprofen reduces the pain coming from your knees, whereas acetaminophen reduces how your brain responds to that pain,” he said.

The researchers are now looking at ibuprofen, used in such medicines as Advil and Motrin, to see if it also has psychological effects. What they find may shift how reflexively we reach for pain relief. “Stay tuned,” said Mr. Durso.

Practice Makes Some Perfect, Others Maybe Not

Some brains get a musical head start, research shows

When the late, great B.B. King mused about the roots of his success, he recalled happening on his uncle’s guitar as a small boy and picking out some tunes. Did that jump-start his later musical accomplishments? Or was it singing in the church choir? Maybe it was the death of his young mother—which left him with a real feeling for the blues along with a survivor’s sense of self-reliance.

Then there’s the persistence with which young B.B. pursued every performance opportunity, not to mention his tens of thousands of hours of practice.

It’s probable that all these influences contributed to King’s extraordinary musical achievements, along with one that he didn’t mention: the unique responsiveness of certain areas of his brain. A study published in Cerebral Cortex in July shows that unusual activity in specific neural areas can predict how easily musicians learn their chops.

In their experiment, neuroscientists Robert Zatorre, Sibylle Herholz and Emily Coffey at the Montreal Neurological Institute, along with a colleague in Germany, used functional magnetic resonance imaging to assess how music instruction and practice affect the brain.

They studied 15 healthy young adults who had volunteered for keyboard lessons. None had any musical training or had ever played an instrument. The goal of the experiment was to see what would happen to their brains once they had some instruction and could pick out some popular tunes on the piano.

First, the volunteers lay in the brain scanner and listened to recognizable songs, like “Hey Jude” and “Amazing Grace.” The researchers thus had a measure of the subjects’ baseline neural activity while they were listening to music.

Six weeks of music lessons followed. Though the participants primarily trained with an online program and practiced on their own, their electronic keyboards automatically uploaded every detail of their playing to the lab’s computers. Then the subjects had another fMRI. While in the scanner they heard the same familiar songs from the first round, along with others they now knew how to play.

By comparing the volunteers’ levels of brain activation before and after training, the researchers found that, as they had expected, the brain did change after learning to play music.

“The parietal cortex and the premotor cortex were more active in the trained subjects when hearing a song that they had learned,” said Dr. Zatorre, the principal investigator. Both areas, which are recruited in the perception of sound and in planning and guiding movement, are clearly important when one has to imagine a note and play the right key.

The data also point to a distinct starting advantage in some people—and to where that advantage might reside in the brain. In retrospect, the first round of fMRI images predicted who would be the best learners.

Those with a hyperactive Heschl’s gyrus (part of the cerebral cortex that is associated with musical pitch) and with lots of reactivity in their right hippocampus (an area linked to auditory memory) turned out to be more likely to remember tunes they had heard before and, after some practice, play them well.

The “kicker,” said Dr. Zatorre, was finding that neural head start. “That gives you an advantage when you’re learning music, and it’s a completely different system from the parts of the brain that show learning has taken place. It speaks to the idea of 10,000 hours.” In his book “Outliers,” Malcolm Gladwell called 10,000 hours of practice “the magic number of greatness.” Dr. Zatorre disagrees, saying, “Is it really fair to say that everyone’s brain is structured the same way, and that if you practice, you will accomplish the same thing?”

B.B. King was too modest to say so. But I think the answer is no.

How Intelligence Shifts With Age

We seem to get slower and wiser at the same time


Is the saying “older but wiser” just an old wives’ tale? We all know people who are as foolish at 60 as they were at 20 and others whose smarts have cruelly diminished with age. Meanwhile, legions of seniors who used fountain pens as children now deftly tap out texts on their tablets. So what’s the truth about old dogs and new tricks?

A study of adult intelligence, published in March in the journal Psychological Science, pits these maxims against the data. The results challenge some common assumptions—including the idea that mental acuity, like athletic prowess, always declines with age.

“Your physical ability changes over your lifetime. At first you can’t do much,” said Joshua Hartshorne, a postdoctoral fellow at the Massachusetts Institute of Technology and the study’s lead author. From infancy on, we get better at walking, jumping, climbing and running. But in our early 20s our physical abilities begin to decline, he said. Is such waxing and waning also true for mental ability? “There are two competing ideas,” he added. “As you get older you’re slowing down, and as you get older you’re getting wiser.”

The new research suggests that we are getting slower and wiser at the same time.


Dr. Hartshorne and his colleague Laura Germine of Massachusetts General Hospital took a close look at the development of cognitive abilities as we age and discovered that different skills have different timetables. Some abilities mature early, such as how fast we recall names and faces. Others, like vocabulary and background knowledge, are late bloomers.

This nuanced view of human smarts is based on separating our thinking process into discrete slices—as opposed to viewing intelligence as a single unit, typically called “G.” This study’s findings conflict with most previous research, which shows that “G” is fairly stable across the lifespan. One study followed up on 87,500 Scottish children tested at age 11 and found that their intelligence hadn’t changed much 65 years later.

In this new study, the researchers did both retrospective sleuthing and online testing. First they reanalyzed the scores generated by those who took standardized Wechsler IQ and memory tests in the early 1990s—right at the time these tests were created. By dividing the 2,500 adults who first took these tests into 13 age groups, the researchers were able to chart the trajectory of individual skills, from adolescence to retirement and beyond. To get a more textured picture, the scientists added survey and Internet-based tests of reasoning, memory and social intelligence.

The results showed that our intellectual capacities shift as we age. Processing speed—how fast we absorb and rejig numbers, names and facts—peaks around 18, then “drops off a cliff,” said Dr. Hartshorne.

How much we can remember and manipulate at one time—the size of that mental notepad, which is called working memory—is at its prime in our mid-20s and plateaus around age 35. Then it moves from center stage.

That’s when our emotional intelligence kicks in. The ability to assess people’s emotional states from a photo of their eyes peaks around age 40 and doesn’t decline until our 60s. One form of wisdom, the ability to guess people’s internal states from very little information, is as useful around the kitchen table as it is in the boardroom, and “the difference between a 20-year-old and a 40-year-old is just huge,” Dr. Hartshorne said.

The gains don’t end there. As our hair becomes silver (and sometimes falls out), our vocabularies continue to grow, peaking as late as age 70, said Dr. Hartshorne, adding that 20 years ago tests of vocabulary showed that it crested much earlier, at age 50. “Given the way we’ve chosen to define intelligence, people are getting smarter,” he said.

This is an encouraging sign. If humans continue to learn into their seventh decade, then at least one platitude is true. You can teach an old dog new tricks.

For a Longer Life, Get Social

Research suggests that more than our eating and exercise habits, our social lives can predict how long we’ll live




Jun. 25, 2015 2:53 p.m. ET


Personal independence is such an iconic American value today that few of us question it. In previous generations, retirees lived with family, but now that a large swath of older people can afford to live on their own, that’s what they choose. The convenience of digital devices means that we can now work, shop and pay our bills online, without dealing directly with other people. According to the U.S. Census, 10% of Americans work alone in remote offices and over 13% live alone, the highest rate of solo living in American history.

But is this go-it-alone ideal good for us? New research suggests that, even if you enjoy being by yourself, it just might kill you—or at least shorten your life.

A team led by Julianne Holt-Lunstad at Brigham Young University showed that living alone, or simply spending a lot of your time on your own, can compromise your physical and psychological resilience—whether or not you like your solitude. Published in Psychological Science in March, their data show that how much real social interaction you get is a good predictor of how long you will live.

If you fit into one of three categories—living alone, spending much of your time alone or often feeling lonely—your risk of dying within the next seven years is about 30% higher than it is for people who are otherwise like you. Based on a meta-analysis comprising 70 studies and over 3.4 million adults, the team’s findings reinforce a growing consensus: In-person interaction has physiological effects.

Scientists have long known that loners are likely to die well before their more gregarious neighbors. A landmark longitudinal study published in the American Journal of Epidemiology in 1979 followed nearly every resident of a northern California town for nine years; its results showed that people who not only had intimate partners but met regularly with others to play bridge or volunteer at church were twice as likely to outlive those who led solitary lives. Still, critics wondered whether social contact was the key. Perhaps the social butterflies were healthier to begin with, or the more isolated people had hidden problems, such as depression or disability, that cut their lives short.

Dr. Holt-Lunstad’s team controlled for these confounding factors. What’s more, they discovered that the effect isn’t always a matter of preference or state of mind. We used to think that subjective experience was all that mattered. You could be single or married, spend your days alone or in a throng of people; if you often felt lonely, the thinking went, your blood pressure would spike and your immune function would suffer.

The new research found, however, that objective measures of the amount of human contact you get are as critical to your survival as your opinion of your social life. “I’ve spent almost my whole career studying social support, and I absolutely know the strong effects that our perceptions have on our physiology,” Dr. Holt-Lunstad told me. “But there are other determinants of health that are independent of our perceptions. Even if we hate exercise or broccoli, they’re still good for you.” Our intuitions don’t always point us in the right direction either, she added. “There are things that we enjoy greatly that are bad for our health, like eating those rich, fatty desserts or that big, juicy burger. We take great pleasure in things that are not that great for our health.”

In fact, in a study from 2010 in PLOS Medicine, Dr. Holt-Lunstad showed that our social lives are more faithful predictors of how long we’ll last than our eating and exercise habits.

We are still left with the why question. One piece of the puzzle has to do with hormones. Oxytocin and vasopressin are secreted when we are near enough to hug someone—even if we just shake hands. These hormones reduce stress, kill pain and allow us to let down our guard, all of which contribute to long-term resilience. Real encounters can also switch on and off genes that control our immunity and the rate of tumor growth. But it isn’t all about neurochemistry. As Dr. Holt-Lunstad notes, “just having someone around to call 911 can be a matter of life or death.”