For Babies, Copy-Cat Games Provide a Social Compass

Researchers begin to understand infants’ imitations

In 2014 the neuroscientist John O’Keefe won a Nobel Prize for showing that mammals have “place cells” in their brains—an inner mapping system that helps them find their way in the physical world. Now it seems that a particular group of mammals—humans—also have a mapping ability that lets them see themselves in relation to others, thus helping them to navigate in the social world.

It all starts with imitation. In the early 1980s, Andrew Meltzoff, now co-director of the Institute for Learning & Brain Sciences at the University of Washington, found that human babies have an inborn talent for mimicry. Infants just a few days old imitated adults’ facial expressions—and even their finger movements—with surprising accuracy. When Dr. Meltzoff stuck out his tongue or pursed his lips, the newborns did, too.

It’s a handy trick to know that your body and someone else’s have matching moving parts, especially if you’re a newborn who has never looked in a mirror before. But how and why would the human brain be wired to copy what other people do?

Thirty-odd years later, some answers are surfacing. With instruments now available to measure the electrical activity in infants’ brains, Dr. Meltzoff and his colleagues have found that babies have interactive neural maps that match their own bodily sensations to their observations of other people’s movements.

In a paper in the September issue of Trends in Cognitive Sciences, Dr. Meltzoff and Peter Marshall, a professor of psychology at Temple University, reported that when a 14-month-old baby’s hand was touched, the same region of her brain lighted up as when she saw an adult use his hand to touch something. When the baby watched an adult nudge an object with his foot, there was electrical activity in the region of the brain corresponding to the baby’s perception of her own foot being touched. That study tested 44 infants, all 14 months old, but even much younger babies register a similarity between their own bodies and other people’s, Dr. Meltzoff told me, adding that this recognition may be one root of empathy.

“I think babies are born socially connected,” Dr. Meltzoff said. “They see you moving your hand in the hospital room, and they think ‘I have one of those, and I can move it too!’ It’s an aha! moment. The baby is realizing that he felt that movement in the womb, and this is what it looks like. This starts at birth and flowers as mothers play mutual imitation games with the baby. Parents unconsciously imitate and reflect the babies’ behavior back to them because the babies enjoy it so much. And now we know why.”

Unwittingly copying someone’s gestures, expressions or mannerisms is called “the chameleon effect.” Research published in the Journal of Experimental Social Psychology and elsewhere shows that adults who practice such mimicry enjoy bigger tips if they work in the service industry and a larger paycheck if they’re engaged in salary negotiations.

Humans clearly have a built-in penchant for synchronizing their actions with those of others. Previous studies by Dr. Meltzoff and colleagues have shown that babies fix their gaze on adults who imitate their actions precisely but are far less interested in adults who react but don’t copy them. Using electroencephalogram technology, the researchers have also shown that imitating babies exactly during turn-taking games elicits a distinctive pattern of electrical activity in the babies’ brains.

Even if there’s no proof yet of “body cells” that orient us toward others the way “place cells” orient us in space, Dr. Meltzoff’s work demonstrates that, as he says, “even a young baby recognizes the distinction between his body and yours, but that the pattern of movements is the same. And that’s the fundamental way he learns to be ‘one of us.’”

Less Pain, Less Joy: A New Look at Acetaminophen

The drug, found in Tylenol, is an all-purpose damper, a study finds

Consider this trade-off the next time you have a headache: Would you take a medicine that didn’t just ease the pain but muffled your happiness too?

A recent study suggests that acetaminophen—found in Tylenol, Excedrin and a host of other medications—is an all-purpose damper, stifling a range of strong feelings. Throbbing pain, the sting of rejection, paralyzing indecision—along with euphoria and delight—all appear to be taken down a notch by the drug.

For most people, this over-the-counter palliative doesn’t demand much thought: Take the right dose and the pain goes away. But it may not be that simple.

In 2010, the psychologists Naomi Eisenberger and Nathan DeWall discovered that a three-week course of acetaminophen soothed social pain, like feelings of exclusion or ridicule. The drug also assuaged the agony of indecision, Dr. DeWall found earlier this year.

Building on this research, a new study, published in June in the journal Psychological Science, shows that acetaminophen affects not just how we perceive physical and psychological pain but how the brain processes strong feelings in general. Though the study was small and limited to college students as subjects, the researchers designed it to meet the high standards of pharmaceutical testing and were able to replicate it.

Led by Ohio State University doctoral candidate Geoffrey Durso, the study compared reactions to 40 photos. Some were run-of-the-mill, some pleasant, others shockingly aversive—including images of fighting in a ravaged city and malnourished children.

A photo can elicit gut-wrenching emotions—or enchant and captivate us. The researchers’ goal was to test the painkiller’s effects on such reactions. Half of the 85 subjects took 1,000 milligrams of acetaminophen—a standard “extra strength” dose. The rest took a look-alike placebo. Neither the participants nor the researchers knew who had taken what. After allowing time for the medication to take effect, the researchers asked the participants to rate the photos using a standardized test.

“Compared to the placebo, acetaminophen blunted the extremity of their reactions,” said Mr. Durso. And the more intense the emotions, the more acetaminophen muted them. How much did the painkiller dial down the participants’ reactions? “For extremely pleasant stimuli, acetaminophen blunted their emotions by 20%,” Mr. Durso said, and it muted reactions to extremely unpleasant photos by 10%.

If acetaminophen muffles all kinds of emotional experience, many of our assumptions about mind-body distinctions and how to treat different types of distress may be wrong. “It’s long been thought that positive emotions are one system and negative emotions are another,” said psychologist Baldwin Way of Ohio State, one of the study’s authors. “But if acetaminophen blunts both positive and negative emotions, it’s probably working through the same pathways.”

Like a built-in volume control in the brain, acetaminophen alters the neural circuits that govern our emotional responses in general. Whereas ibuprofen and aspirin inhibit pain by acting right at the site of inflammation, acetaminophen acts globally, modifying our reactions to the incoming pain signal, Prof. Way said. “If you take a painkiller before a run, ibuprofen reduces the pain coming from your knees, whereas acetaminophen reduces how your brain responds to that pain,” he said.

The researchers are now looking at ibuprofen, used in such medicines as Advil and Motrin, to see if it also has psychological effects. What they find may shift how reflexively we reach for pain relief. “Stay tuned,” said Mr. Durso.

Practice Makes Some Perfect, Others Maybe Not

Some brains get a musical head start, research shows

When the late, great B.B. King mused about the roots of his success, he recalled happening on his uncle’s guitar as a small boy and picking out some tunes. Did that jump-start his later musical accomplishments? Or was it singing in the church choir? Maybe it was the death of his young mother—which left him with a real feeling for the blues along with a survivor’s sense of self-reliance.

Then there’s the persistence with which young B.B. pursued every performance opportunity, not to mention his tens of thousands of hours of practice.

It’s probable that all these influences contributed to King’s extraordinary musical achievements, along with one that he didn’t mention: the unique responsiveness of certain areas of his brain. A study published in Cerebral Cortex in July shows that unusual activity in specific neural areas can predict how easily musicians learn their chops.

In their experiment, neuroscientists Robert Zatorre, Sibylle Herholz and Emily Coffey at the Montreal Neurological Institute, along with a colleague in Germany, used functional magnetic resonance imaging to assess how music instruction and practice affect the brain.

They studied 15 healthy young adults who had volunteered for keyboard lessons. None of them had any musical training or had ever played an instrument. The goal of the experiment was to see what would happen to their brains once they had some instruction and could pick out some popular tunes on the piano.

First, the volunteers lay in the brain scanner and listened to recognizable songs, like “Hey Jude” and “Amazing Grace.” The researchers thus had a measure of the subjects’ baseline neural activity while they were listening to music.

Six weeks of music lessons followed. Though the participants primarily trained with an online program and practiced on their own, their electronic keyboards automatically uploaded every detail of their playing to the lab’s computers. Then the subjects had another fMRI. While in the scanner they heard the same familiar songs from the first round, along with others they now knew how to play.

By comparing the volunteers’ levels of brain activation before and after training, the researchers found that, as they had expected, the brain did change after learning to play music.

“The parietal cortex and the premotor cortex were more active in the trained subjects when hearing a song that they had learned,” said Dr. Zatorre, the principal investigator. Both areas, which are recruited in the perception of sound and in planning and guiding movement, are clearly important when one has to imagine a note and play the right key.

The data also point to a distinct starting advantage in some people—and where that advantage might reside in the brain. A retrospective look at the first round of fMRI images revealed activity patterns that predicted who would be the best learners.

Those with a hyperactive Heschl’s gyrus (part of the cerebral cortex that is associated with musical pitch) and with lots of reactivity in their right hippocampus (an area linked to auditory memory) turned out to be more likely to remember tunes they had heard before and, after some practice, play them well.

The “kicker,” said Dr. Zatorre, was finding that neural head start. “That gives you an advantage when you’re learning music, and it’s a completely different system from the parts of the brain that show learning has taken place. It speaks to the idea of 10,000 hours.” In his book “Outliers,” Malcolm Gladwell called 10,000 hours of practice “the magic number of greatness.” Dr. Zatorre disagrees, saying, “Is it really fair to say that everyone’s brain is structured the same way, and that if you practice, you will accomplish the same thing?”

B.B. King was too modest to say so. But I think the answer is no.

How Intelligence Shifts With Age

We seem to get slower and wiser at the same time

Is the saying “older but wiser” just an old wives’ tale? We all know people who are as foolish at 60 as they were at 20 and others whose smarts have cruelly diminished with age. Meanwhile, legions of seniors who used fountain pens as children now deftly tap out texts on their tablets. So what’s the truth about old dogs and new tricks?

A study of adult intelligence, published in March in the journal Psychological Science, pits these maxims against the data. The results challenge some common assumptions—including the idea that mental acuity, like athletic prowess, always declines with age.

“Your physical ability changes over your lifetime. At first you can’t do much,” said Joshua Hartshorne, a postdoctoral fellow at the Massachusetts Institute of Technology and the study’s lead author. From infancy on, we get better at walking, jumping, climbing and running. But in our early 20s our physical abilities begin to decline, he said. Is such waxing and waning also true for mental ability? “There are two competing ideas,” he added. “As you get older you’re slowing down, and as you get older you’re getting wiser.”

The new research suggests that we are getting slower and wiser at the same time.

Dr. Hartshorne and his colleague Laura Germine, of Massachusetts General Hospital, took a close look at the development of cognitive abilities as we age and discovered that different skills have different timetables. Some abilities mature early, such as how fast we recall names and faces. Others, like vocabulary and background knowledge, are late bloomers.

This nuanced view of human smarts is based on separating our thinking process into discrete slices—as opposed to viewing intelligence as a single unit, typically called “G.” This study’s findings conflict with most previous research, which shows that “G” is fairly stable across the lifespan. One study followed up on 87,500 Scottish children tested at age 11 and found that their intelligence hadn’t changed much 65 years later.

In this new study, the researchers did both retrospective sleuthing and online testing. First they reanalyzed the scores generated by those who took standardized Wechsler IQ and memory tests in the early 1990s—right at the time these tests were created. By dividing the 2,500 adults who first took these tests into 13 age groups, the researchers were able to chart the trajectory of individual skills, from adolescence to retirement and beyond. To get a more textured picture, the scientists added survey and Internet-based tests of reasoning, memory and social intelligence.

The results showed that our intellectual capacities shift as we age. Processing speed—how fast we absorb and rejig numbers, names and facts—peaks around 18, then “drops off a cliff,” said Dr. Hartshorne.

How much we can remember and manipulate at one time—the size of our mental notepad, known as working memory—is at its prime in our mid-20s and plateaus around age 35. Then it recedes from center stage.

That’s when our emotional intelligence kicks in. The ability to assess people’s emotional states from a photo of their eyes peaks around age 40 and doesn’t decline until our 60s. One form of wisdom, the ability to guess people’s internal states from very little information, is as useful around the kitchen table as it is in the boardroom, and “the difference between a 20-year-old and a 40-year-old is just huge,” Dr. Hartshorne said.

The gains don’t end there. As our hair becomes silver (and sometimes falls out), our vocabularies continue to grow, peaking as late as age 70, said Dr. Hartshorne, adding that 20 years ago tests of vocabulary showed that it crested much earlier, at age 50. “Given the way we’ve chosen to define intelligence, people are getting smarter,” he said.

This is an encouraging sign. If humans continue to learn into their seventh decade, then at least one platitude is true. You can teach an old dog new tricks.

For a Longer Life, Get Social

Research suggests that more than our eating and exercise habits, our social lives can predict how long we’ll live

Personal independence is such an iconic American value today that few of us question it. In previous generations, retirees lived with family, but now that a large swath of older people can afford to live on their own, that’s what they choose. The convenience of digital devices means that we can now work, shop and pay our bills online, without dealing directly with other people. According to the U.S. Census, 10% of Americans work alone in remote offices and over 13% live alone, the highest rate of solo living in American history.

But is this go-it-alone ideal good for us? New research suggests that, even if you enjoy being by yourself, it just might kill you—or at least shorten your life.

A team led by Julianne Holt-Lunstad at Brigham Young University showed that living alone, or simply spending a lot of your time on your own, can compromise your physical and psychological resilience—whether or not you like your solitude. In a study published in Psychological Science in March, the team found that how much real social interaction you get is a good predictor of how long you will live.

If you fit into one of three categories—living alone, spending much of your time alone or often feeling lonely—your risk of dying within the next seven years is about 30% higher than it is for people who are otherwise like you. Based on a meta-analysis comprising 70 studies and over 3.4 million adults, the team’s findings reinforce a growing consensus: In-person interaction has physiological effects.

Scientists have long known that loners are likely to die well before their more gregarious neighbors. A landmark longitudinal study published in the American Journal of Epidemiology in 1979 followed nearly every resident of a northern California town for nine years; its results showed that people who not only had intimate partners but met regularly with others to play bridge or volunteer at church were twice as likely to outlive those who led solitary lives. Still, critics wondered whether social contact was the key. Perhaps the social butterflies were healthier to begin with, or the more isolated people had hidden problems, such as depression or disability, that cut their lives short.

Dr. Holt-Lunstad’s team controlled for these confounding factors. What’s more, they discovered that the effect isn’t always a matter of preference or state of mind. We used to think that subjective experience was all that mattered. You could be single or married, spend your days alone or in a throng of people; if you often felt lonely, the thinking went, your blood pressure would spike and your immune function would suffer.

The new research found, however, that objective measures of the amount of human contact you get are as critical to your survival as your opinion of your social life. “I’ve spent almost my whole career studying social support, and I absolutely know the strong effects that our perceptions have on our physiology,” Dr. Holt-Lunstad told me. “But there are other determinants of health that are independent of our perceptions. Even if we hate exercise or broccoli, they’re still good for you.” Our intuitions don’t always point us in the right direction either, she added. “There are things that we enjoy greatly that are bad for our health, like eating those rich, fatty desserts or that big, juicy burger. We take great pleasure in things that are not that great for our health.”

In fact, in a study from 2010 in PLOS Medicine, Dr. Holt-Lunstad showed that our social lives are more faithful predictors of how long we’ll last than our eating and exercise habits.

We are still left with the why question. One piece of the puzzle has to do with hormones. Oxytocin and vasopressin are secreted when we are near enough to hug someone—even if we just shake hands. These hormones reduce stress, kill pain and allow us to let down our guard, all of which contribute to long-term resilience. Real encounters can also switch on and off genes that control our immunity and the rate of tumor growth. But it isn’t all about neurochemistry. As Dr. Holt-Lunstad notes, “just having someone around to call 911 can be a matter of life or death.”