Pessimism about Social Mobility


By Susan Pinker

The expectation that every generation will be better educated, earn more and live in a nicer home than their parents is the essence of the American dream. The formula largely worked for Americans in the 19th and 20th centuries. Do we still expect it to? A paper published this summer in the Proceedings of the National Academy of Sciences set out to answer that question.

“I study how children are doing in relation to their parents,” said sociologist Siwei Cheng of New York University, the study’s lead author. “And America is not doing that well, compared to other Western countries.” In Canada, for instance, the chances of a low-income child entering the middle class are twice as great as they are in the U.S. The question that Prof. Cheng sought to answer is whether Americans still believe in the American dream. “Are Americans really that optimistic about mobility?”

Prof. Cheng and Fangqi Wen, a postdoctoral fellow at the University of Oxford, polled the attitudes of 3,077 American adults. Each participant was asked to consider the prospects of a child whose family’s income was in a specific percentile, compared to all American families. A computer spat out a randomly generated income rank, and the participant would estimate how much a child growing up in such a family would earn as a 40-year-old. The next step was to compare subjects’ perceptions to what up-to-date tax data tell us about the actual earnings of someone from such a family.

This comparison revealed a disconnect: Americans underestimate the future earnings of children from poor families and overestimate the future earnings of children from middle and upper class families. “The reality is that there is indeed a [mobility] gap, by international standards. But what people have in their mind is a larger gap, in terms of how rich and poor kids will do. They’re pessimistic about equality of opportunity,” said Prof. Cheng. To be precise, “the American public perceives the gap in economic outcomes between children from rich and poor families to be twice as large as it actually is.”

The researchers also discovered some surprising demographic divides. College-educated adults estimated a larger opportunity gap than those without a degree. Liberals were more pessimistic than conservatives, younger people more pessimistic than those over 30, and those earning between $30,000 and $100,000 a year more pessimistic than everyone else. In other words, middle-class, educated Americans see less reason for hope about social mobility than the rest of the population.

So much for the American dream.

The study doesn’t try to identify the sources of this pessimism, but we can hazard some guesses. It could be that the squeeze on middle-class jobs and incomes over the past generation has undermined middle-class faith in advancement. In addition, the suspicion of inequality in American society may lead people to assume that the children of the wealthy are getting a free pass, while the children of the poor must be even more hobbled by their reduced chances.

Whatever the reason, if Americans no longer believe that children who start near the bottom can make it to the top—or even to the middle—a good first step is to reconcile our attitudes with the data. There is more upward mobility in America than most of us think, but unless we want to start calling it the Canadian dream, there’s still a long way to go.




When Taking Risks Is the Best Strategy

Research on fishing fleets shows that in the face of scarce resources, trying new approaches can bring big rewards


One way to divide up the world is between people who like to explore new possibilities and those who stick to the tried and true. In fact, the tension between betting on a sure thing and taking a chance that something unexpected and wonderful might happen bedevils human and nonhuman animals alike.

Take songbirds, for example. The half-dozen finches perching at my deck feeder all summer know exactly what they’ll find there: black sunflower seed, and lots of it. Meanwhile, the warblers exploring the woods nearby don’t depend on this predictable food source in fine weather. As foragers, they enjoy other advantages: a more varied diet, less exposure to predators and, as a bonus, the chance to meet the perfect mate flitting from tree to tree.

This “explore-exploit” trade-off has prompted scores of lab studies, computer simulations and algorithms, trying to determine which strategy yields the greatest reward. Now a new study of human behavior in the real world, published last month in Nature Communications, shows that in good times, there’s not much difference between pursuing novelty and sticking to the status quo. When times are tough, however, explorers are the winners.

The new study, led by Shay O’Farrell and James Sanchirico, both of the University of California, Davis, along with Orr Spiegel of Tel Aviv University, examined the routes and results of nearly 2,500 commercial fishing trips in the Gulf of Mexico over a period of 2½ years. The study focused on “bottom longline” fishing, a system where hundreds of lines are attached to a horizontal bar that is then lowered to reach the sea bed. Dr. O’Farrell explained the procedure this way: “Go to a location and put the line down. Stay for a few hours. The lines are a mile long and have a buoy at either end. When they pull that up, they assess the catch, then decide if they will stay or move on to a different spot.”

Over two years of collecting data under various climate conditions, the researchers discovered that the fishermen were fairly consistent. “The exploiters would go to a smaller set of locations over and over, and go with what they know,” Dr. O’Farrell said. “The explorers would consistently try a wider range; they’d sample new places.”

The payoffs were clear. The explorers benefited from their prior knowledge of alternatives—and their ability to take risks—when the going got tough. For instance, while the study was under way, some prime fishing grounds were unexpectedly closed to protect their population of endangered sea turtles. Those who explored alternative sites had other options when their usual fishing grounds were suddenly off limits. Unlike the exploiters, “they didn’t have to start from ground zero to gain the knowledge they needed” when conditions changed, said Dr. O’Farrell. At the very least, they were more likely to continue to fish during an upheaval.

Similarly, the immediate impact of storms didn’t disrupt the explorers as much as it did those who cleaved to their routine. In the long run, there wasn’t a huge difference between the two groups, perhaps due to the sharing of information between fishing crews, said Dr. O’Farrell. But in challenging times, the study’s message was clear: “You can try new things in the face of uncertainty.”

Bystanders Who Intervene in an Attack

In 1964, the Kitty Genovese case taught the world that strangers wouldn’t come to a victim’s aid. New research suggests that, in fact, they usually do.


If you were assaulted in a public place, do you think anyone would intervene? Or would they just look down at their shoes and walk on by?

Most people expect very little help from strangers, especially in the big cities to which much of the world's population has migrated over the past century. Having once lived overwhelmingly in far-flung rural hamlets, Americans have long seen cities as anonymous, dangerous places.

In 1964, the fatal stabbing of Kitty Genovese made this sense of threat more palpable. Genovese, a 28-year-old woman returning from a night shift, was brutally attacked in the entrance of her Queens, N.Y., apartment building. Thirty-eight of her neighbors purportedly heard her screams and did nothing to help. Her story launched a new psychological term: the Bystander Effect, which refers to the idea that the greater the number of bystanders, the less likely people are to act as good Samaritans.

But there was a problem with both the term and the story. Many of the details of the Kitty Genovese story turned out to be false. She did die at the hands of a violent stranger, but subsequent sleuthing revealed that several bystanders did, in fact, try to intervene. And a study published last month in the journal American Psychologist confirms that bystanders aren’t as passive as we once thought. Not only will an observer often step forward in a real crisis, but the more people are present, the more likely the victim is to get assistance.

“It only takes one person to help the victim,” said Richard Philpot, the paper’s first author and a research fellow in psychology at Lancaster University and the University of Copenhagen. Working with three colleagues, Dr. Philpot broke away from previous approaches to documenting the Bystander Effect. Instead of simulating a violent emergency while groups of various sizes looked on, this new study analyzed closed circuit TV footage of real people interacting in public spaces.

In other words, Dr. Philpot’s team focused on genuine conflict—ranging from animated disagreement to physical violence—that spontaneously arose in public places in Amsterdam, Cape Town and Lancaster, U.K.

Four trained coders pored over footage from these cities’ surveillance cameras, culling 219 aggressive incidents from 1,225 video clips. The coders looked at the size of the crowd and zeroed in on any effort to calm the aggressor, to block contact between the two parties or pull the aggressor away, or to provide practical help to the victim.

What emerged was surprising, in more ways than one. Strangers intervened in nine out of 10 violent incidents. And the more people were around, the more likely it was that someone in trouble would get help. The consistency of the findings was remarkable: “South Africa was the only place where we saw weapons such as machetes, axes or knives,” said Dr. Philpot. “But victims were equally likely to be helped in conflicts there as they would be in the U.K. or the Netherlands.”

Some questions remain: How do people know when it’s safe to intervene, and would these findings hold in less populated places? But the new study is reassuring, at least for city-dwellers. There is indeed strength in numbers. It’s too late for Kitty Genovese, but there’s still time for the rest of us, who could be that one person in a crowd who steps forward to help.


Envy at Its Worst: In the Future Tense

A new study shows that we’re more jealous of our friends’ plans and prospects than their past accomplishments.


It is better to be envied than to be pitied, wrote the Greek historian Herodotus, and in our use of social media it’s clear that most of us agree. After all, why post selfies of yourself and your sweetheart lifting champagne flutes en route to Thailand if not to induce an eat-your-heart-out response in your friends?

Ubiquitous public displays of everyone’s happy moments—with the low points edited out—are one reason, according to a 2017 study, that most of us believe other people lead richer social lives than we do. Research shows that most of us think we are better looking, smarter, more competent and of course less biased than other people. But our perspectives do a 180 when it comes to our social lives.

When we hear about our friends’ plans to share a summer cottage on the seashore, for example, we may feel green with envy. We imagine that they’ll have a blast—the group dinners on the screened-in porch, swimming together at the pond, the bike rides to get ice cream. The idyllic possibilities are enough to make a working stiff with one week of holiday gag.

Now a new study, published last month in the journal Psychological Science, adds a counterintuitive twist to this familiar story: Other people’s plans for the future irritate us much more than experiences they’ve had in the past. At first this idea might seem strange. Don’t we feel envy for people’s accomplishments and possessions, such as their expensive cars? We do, but the study shows that jealousy stings even more when we think about the good things that lie in their future. In fact, our envious feelings ramp up as someone else’s fortunate occasion gets closer—and plummet once the event is over. Apparently envy is time-stamped, like an email or a tray of raw chicken from the supermarket.

In the study, researchers asked 620 participants to imagine their best friend in each of five circumstances: on a dream vacation, a dream date or in the ideal job, car or house. To make the exercise more personal, the participants were asked for the initials of their best friend. Then they had to rate, on a scale of 1 to 7, how they would feel during the days and weeks leading up to each drool-worthy event, as well as the days and weeks that followed. “Imagine the 10 days and nights of your friend in Maui next month. How do you feel about this?” said Ed O’Brien, one of the study’s lead authors and a professor at the University of Chicago. “Now roll back a year. How does the trip make you feel now?”

The researchers discovered that envy includes negative emotions, like malice, but can also be a source of positive feelings, like increased motivation. Each feeling has its own timetable: “While negative reactions decrease over time once the friend has achieved something we want, time has no effect on the positive form of envy, the motivation to try to do that sort of thing in the future,” explained Prof. O’Brien.

The finding that envy is more predictably triggered by other people’s future prospects than by their past achievements fits into a larger argument: The human brain evolved to have an emotional bias toward the future, according to American social psychologist Roy Baumeister and his colleagues. First we focus on what we want, and then a second, more cautious stage of thinking comes into play so we can figure out what we must do to get it.

So if we want a kinder social media world, we should stick to posting about events that have come and gone, rather than inciting envy with things we’re looking forward to. Got plans for a long, lazy summer frolicking with friends at the beach? For now, you might want to keep that to yourself.

Pollution that Discriminates by Gender

A new study shows that boys’ brains are more vulnerable than girls’ to lead exposure in early childhood


While in Australia last month I learned that female green sea turtles on the Great Barrier Reef now outnumber males by 116 to 1. Biologists blame it on the rising temperature of the sand on Australian nesting beaches: The warmer the sand, the more females hatch. In Sarnia, Ontario—known as Chemical Valley due to its 36 local petrochemical plants—emissions and runoff have halved the number of boys born in the area since the early 1990s, according to studies published in Environmental Health Perspectives.

Now a new study, published online last month in the journal Economics and Human Biology, shows that U.S. counties where lead in the topsoil exceeds the national average had twice the number of five-year-old boys with long-term cognitive problems. Five-year-old girls weren’t affected. Right from conception, it seems that environmental stress, especially pollution, discriminates on the basis of sex.

Edson Severnini, a professor of economics and public policy at Carnegie Mellon University, and his colleagues Karen Clay and Margarita Portnykh began with the United States Geological Survey’s recorded levels of lead in topsoil in 252 of the largest counties in the U.S. in 2000. They then turned to parents’ responses to a question on the 2000 census: Had their five-year-old experienced difficulties, for at least six months, with learning, memory, focus or decision making? The parents of over 77,000 children replied with a yes or a no.

We’ve long known lead to be dangerous, and adding the heavy metal to gasoline, house paint and pesticide has been banned now for decades. Nonetheless, we’re still living with lead’s legacy. Over the 20th century more than 6.5 million tons were released into the environment across the U.S., most of it still blowing around or sticking to soil particles. That is alarming because lead is a neurotoxin: It starves the brain—especially the frontal lobe of the developing brain—of protein and energy, and it doesn’t decompose.

To make matters worse, lead on painted windowsills and in garden soil tastes sugary. Innocently ingesting even tiny amounts of lead can translate to lower IQs and attentional and behavioral problems later on, researchers have found.

There is even evidence that higher levels of lead in the bloodstream can predict antisocial behavior and violence in adolescence and early adulthood, according to a 2012 study led by Tulane medical researcher Howard Mielke published in the journal Environment International.

The new Carnegie Mellon study reinforces the link between a child’s early lead exposure and an uncertain future. Preschoolers’ exposure to lead in their first five years of life increased their probability of compromised cognitive function, including a weaker ability to learn, solve problems and control one’s impulses. The study also adds two fascinating twists to the existing mound of scary data: Boys are twice as vulnerable as girls to early neural damage, and even levels of lead that are currently considered acceptable can exert a deleterious effect.

Our problems with lead aren’t history, this study shows. But it turns out that education can dampen lead’s harmful effects. “We actually show that if boys had some schooling, the [negative] effect was much smaller,” said Dr. Severnini. Short of restricting lead in fuel for small aircraft, which is still unregulated, and remediating contaminated soil where children live and play, we may not be able to control how much lead is still hanging around. But there is something we can do to protect our children’s brains, and it is called preschool.



Women Have Younger Brains Than Men

A new study of brain metabolism reveals a gender gap with important implications for the way we age


Women tend to live longer than men. This is one of the most robust findings in biological science, and the evidence isn’t hard to find. In the U.S., women outlive men by almost five years, on average, while the gap is as wide as 10 years in Latvia and Vietnam. Now there is fresh evidence that women not only have a longevity advantage; their brains seem to be more youthful throughout adulthood, too.

The new study, published last month in the Proceedings of the National Academy of Sciences, was led by radiologist Manu Goyal and neurologist Marcus Raichle, both at the Washington University School of Medicine. It shows that, when measuring brain metabolism—that is, the rate at which it uses glucose and oxygen to power its energy-hungry activities—the brains of adult women consistently appear about three years younger than men’s brains do. This brain-age gender gap mirrors the difference in longevity, and it may tell us something important about how sex differences in neural development affect how long we keep our marbles—and, ultimately, how long we live.

In the early 1970s, Dr. Raichle was one of the first neuroscientists to use PET scans to look at cognitive function in a living person’s brain. Now 82, he is still at it. In this study, the Goyal-Raichle research team deployed PET scans to assess how much energy is consumed by an adult’s brain and exactly where in the brain the demand is greatest. “Glucose is like coal. It burns up in the brain and produces energy,” Dr. Raichle said. “But when you burn up things in the brain you produce byproducts that the brain doesn’t want hanging around,” he explained. Glucose makes energy and also does the mopping up afterward. Thus, how much glucose is used to power the brain’s daily activities, including its cleanup functions, will tell you something about how vigorous and youthful that brain is.

In the new study, 205 healthy adults between the ages of 20 and 82 underwent PET scans while lying quietly in the scanner with their eyes closed. The researchers used the scans to assess blood flow to various neural regions, and also to track two forms of glucose uptake: one that burns oxygen (oxidative) and one that doesn’t (non-oxidative). Based on a study they published in 2017, the researchers already knew that our brains use glucose in both ways when we’re young, but that non-oxidative glucose consumption takes a nose dive as we age. They also knew that, after puberty, blood flow in the brain declines less in women than it does in men.

These persisting gender differences in brain metabolism meant that women’s brains often looked younger than those of men their age. The researchers used machine learning to detect distinctive patterns in the brains they studied. “When we trained it on males and tested it on females, then it guessed the females’ brain age to be three to four years younger than the women’s chronological age,” said Dr. Goyal. Conversely, when the machine was trained to see female metabolic patterns as the standard, it guessed men’s brains to be two to three years older than they actually were. That difference in metabolic brain age added up to roughly a three-year advantage for women.

These brain age differences persisted across the adult lifespan and were visible even when people’s brains showed the harbingers of Alzheimer’s disease. “These new findings provide yet more evidence, as if more were needed, of just how ubiquitous sex influences on brain function have proven to be, often showing up in places we least expect them,” said Larry Cahill, a neuroscientist who studies sex differences in the brain at University of California, Irvine. “The fact that we often struggle to understand what they mean—as happens in the rest of neuroscience—does not make them less important.”

Inheriting the Trauma of Genocide

Research shows that atrocities witnessed by Tutsi survivors in Rwanda can leave marks on their children, born years later

A century ago, we called it shell shock—the legacy of trauma that wrought havoc on the bodies and minds of some World War I veterans. Today, when survivors of terrible events experience flashbacks and fears that disrupt their daily lives, long after the actual threat is gone, it is called post-traumatic stress disorder, or PTSD. Whatever you call it, we now know that direct exposure to extreme deprivation, violence, dislocation or torture can transform not only those who experience it but also their future offspring.

Scores of studies have shown that the adult children of Holocaust survivors who suffered from PTSD are at high risk of developing the disorder themselves. Now, 25 years after the genocide against the Tutsi of Rwanda, an unusual collaboration between Rwandan and Israeli researchers has uncovered similar intergenerational effects. The atrocities witnessed by surviving Tutsis have left enduring marks on their adult children, the vast majority of whom hadn’t been born when the genocide took place in 1994.

The study, published last month in the journal Psychiatry Research, was led by Amit Shrira, a professor of psycho-gerontology at Bar Ilan University, working with Benjamin Mollov, a Bar Ilan social scientist, and Chantal Mudahogora, a Canadian researcher and a survivor of the genocide in Rwanda. The team studied 60 pairs of Tutsi survivor parents and their adult children, all of whom completed two questionnaires about their psychological state.

Not all families who share a brutal history are the same, the researchers discovered. According to their responses, the Tutsi parents could be divided into three groups.

The first group (33.3%) suffered from complex PTSD, a new diagnostic category that includes panic attacks, recurring nightmares and intrusive memories. They perceived themselves as helpless and had difficulty maintaining close relationships. The second group of parents (26.7%) had simple PTSD: They continued to relive the traumatic events and personal losses they had experienced and were haunted by a sense of threat. The third group (40%) didn’t have any clinical symptoms of PTSD. Though it is impossible not to experience distress after surviving a genocide, this group of parents seemed to be surprisingly resilient; their feelings of grief didn’t disrupt their lives to the degree that they merited a clinical diagnosis.

The more severe a parent’s symptoms, it turned out, the more severe were those of the adult child. “The children of parents with complex PTSD suffered the highest level of secondary traumatization, with symptoms related to the parental trauma,” said Dr. Shrira. “These children were born after the genocide, but they had nightmares about it and were more restless and hypervigilant than the children in the other two groups.”

Ms. Mudahogora noted that her own three adult children have been affected by her experience as a survivor. “I never grew up with any grandparents, so I miss that a lot,” said her son Chris Mucyo, a 23-year-old university student who was born after the genocide and participated in the study as a subject. “I have great parents, but it is a lot of pressure. We didn’t have a period of innocence as kids.”

That self-awareness is telling. Having parents who modeled coping skills while talking openly about their losses can make a big difference, said Dr. Shrira, who has observed the same resilience in the families of many Holocaust survivors. It is unclear why some survivors transmit trauma to the next generation while others do not. We know that the vulnerability of parents to trauma is likely passed on to their children through genetic and epigenetic means, but how they face adversity is also a factor, speculates Dr. Shrira. “After all, the transmission of trauma is also the way the story is told.”

Aggression in Boys Is a Family Matter

A new study of children shows that problematic behavior can be identified in infancy, if not before, by looking at their background and circumstances

In 25 years of clinical practice as a psychologist I’ve seen my share of aggressive boys. They ended up in my office because they kicked classmates, poked children with scissors or, in one unforgettable case, set another student’s collar on fire to get his attention. (It worked.) These boys were usually between the ages of 4 and 10, and most of them came from chaotic homes with few clear expectations, closely spaced siblings and an overwhelmed mother. The father was often absent, literally or psychologically. They had been referred by their teachers, who wanted the aggression to stop.

It was a tall order, and I was only successful some of the time. Now I know that intervention should have started much earlier. A new study published in JAMA Network Open in December suggests that children who are most likely to be aggressive throughout childhood and adolescence can be identified in infancy, if not before, by looking at their family histories.

The longitudinal study, led by Richard E. Tremblay of Université de Montréal, followed the development of 2,200 randomly selected babies born in Quebec in the late 1990s, using reports from parents, teachers and the children themselves. In a previous study, the research team found that many children try to use physical force during their first two years of life, by hitting, biting or kicking others. Normally, such aggression peaks between 2.5 and 3.5 years old, and then peters out as children mature and gradually learn society’s rules for social interaction—just in time for them to enter school.

But aggression persists in a small group of children, and the study suggests that they often share a similar home environment. “Children who show problems early are from very specific families,” said Prof. Tremblay. The predictors include one or both parents having been physically aggressive as children or having failed in school. This background is likely to keep families trapped in a dysfunctional cycle. “If you’re physically aggressive and you fail in school, the likelihood of getting a job to get out of poverty is almost nil,” Prof. Tremblay explains.

“These chronically aggressive families could be identified by obstetricians,” Prof. Tremblay says, “so interventions can start close to conception. Starting that early would have a much better impact than waiting until a child is in school.” Yet most aggressive children aren’t identified until they start kindergarten. Teachers are left to wonder, “Why didn’t the parents tell me this child has a problem?”

The answer to that question is one of the study’s most intriguing findings. Unusually aggressive boys were seldom identified by their mothers early on, the researchers discovered. Even if these boys were poised to hurt others and struggle in school, they had to wait for teachers to take note in order to get assistance. In contrast, mothers are more likely to get help for girls, who at any age are far less likely to be physically aggressive than boys.

Whether due to helplessness or denial, this blind spot can be remedied. Some states already offer support such as nurse-family partnership programs, which match vulnerable first-time mothers with registered nurses who visit them up to 60 times during pregnancy and in the child’s first years of life. The nurses offer solid, nonjudgmental advice on subjects like breastfeeding, communicating and bonding with infants and, if necessary, how to quit smoking or drinking.

Through this ongoing relationship, these mothers learn how to sustain a healthy pregnancy and respond to their developing child before he becomes intransigently aggressive, not after. Prof. Tremblay thinks that is wise. “You have to invest in prevention based on the known predictors in the family, and not wait until the second, third or fourth child is born.”

Resilient Teens in a Dangerous World

A study shows that strong executive function in the brain helps young people’s bodies resist the stress of living in a rough neighborhood

“That which doesn’t kill me makes me stronger,” Friedrich Nietzsche famously wrote. But the truth is likely to be the reverse: If an ordeal doesn’t break you, it is probably because you were stronger in the first place.

So says a paper just published in the journal Proceedings of the National Academy of Sciences by Greg Miller, a psychology professor at Northwestern University’s Institute for Policy Research, and his colleagues. By looking at the health of teenagers living in Chicago’s rougher neighborhoods, they explored why some young people are more resilient than others. Why is it that we don’t all react to adversity the same way?

The team already knew when they started that proximity to crime can damage the health of young people. On the night after a violent crime, for example, local teens’ sleep patterns are often disrupted and their cortisol levels spike, according to a study published in 2017. Even if they don’t witness it directly, many teens’ awareness of a local homicide or assault leaves a distinct biological signature in the form of an elevated heart rate and altered behavior.

These clues show that fear is literally getting under the skins of young people, engendering long-term cardiovascular risks. “Kids who live where there is violent crime have more health issues than [those] in higher income neighborhoods. But it’s not just about income. Violence per se is related to higher blood pressure, blood sugar, more obesity and all the precursors of diabetes and heart disease,” Prof. Miller told me.

Yet some children who live in the same areas seem to be shielded from crime’s corrosive effects. To find out what protects them, researchers recruited a diverse group of 218 eighth-graders in Chicago. The researchers tracked the incidence of crime in each teenager’s neighborhood by using data from local police departments. On average, the adolescents in the study lived in areas where the murder rate is 142% higher than elsewhere in the U.S.

The researchers brought the teens into the lab for two testing sessions. First, their weight, blood pressure, abdominal fat, blood insulin and glucose levels were measured—all of which can be signs of cardiovascular risk. The second time, the young people underwent brain imaging, with fMRIs to assess the strength and efficiency of certain resting state neural networks. The goal was to check for any differences between the brains of the children with cardiovascular risk factors and those of the hardier ones. Could the more resilient participants be identified by their brain scans alone?

Indeed they could. The study found that young people with a stronger, more efficient central executive network (CEN)—the brain areas that govern the regulation of emotion and the vigilance that accompanies fear—were also the ones who demonstrated fewer cardiovascular risk factors. “We used statistical measures to rule out the effect of the family’s income, their area’s level of pollution or segregation, or the availability of fresh food,” Prof. Miller told me. “The kids with low CEN connectivity had worse cardiovascular health.”

What this means is that participants whose brains allowed them to suppress intrusive, unpleasant thoughts and to reappraise threatening events were also physically more resilient. They were thinner, with lower biological measures of stress, and their hearts were stronger.

This was an observational study, so we can’t say for certain that a teenager’s brain architecture is what’s keeping their damaging physical reactions to violence in check. Perhaps some kids are just born resilient. Their more measured cognitive response to life’s risks comes packaged with the signs of robust cardiovascular health. I wouldn’t say that these kids have all the luck, but they do seem biologically equipped to face stress with equanimity. And that’s a good start.

When It Comes to Sleep, One Size Fits All

A massive new study shows that every adult needs 7-8 hours a night, or else their cognitive abilities will suffer

I’d always thought that our need for sleep, like our appetite for food, drink or social contact, was a personal matter: Some people need more, some need less. Age, lifestyle, work and metabolism combine to determine how much sleep a person needs to function, and if some people thrive on five hours a night and others require seven, chalk it up to different strokes for different folks, right?

Wrong. A new study of the sleep habits of more than 10,000 people around the world suggests that the amount of sleep adults need is universal. The massive survey, published in the journal Sleep, demonstrates that adults everywhere need 7-8 hours a night—no more and no less—in order to be mentally limber. When we habitually stint on those hours, higher-order cognitive processing—such as the ability to see complex patterns and solve problems—is compromised.

The lead author of the study, Conor Wild, a research associate at the Brain and Mind Institute at the University of Western Ontario, explained how a research team directed by the neuroscientist Adrian Owen recruited thousands of people for the online study. “We did a lot of social media advertising and radio interviews. And the BBC put together a two-minute spot on Dr. Owen’s sleep research,” Dr. Wild said.

This blitz prompted more than 40,000 people from 150 countries to sign up. Ultimately, two-thirds of them were eliminated due to technical glitches, incomplete questionnaires or extreme responses—such as reporting that they sleep zero hours a night, or more than 16.

The 10,886 adults who remained filled out a detailed online questionnaire about their backgrounds, medical histories and sleep patterns. How long was their average night’s sleep? How often was their rest interrupted, and how consistent were their nights?

Once their sleep habits were recorded, the participants completed a battery of 12 cognitive tests. Puzzle-like tasks assessed their spatial, verbal and short-term memories, as well as their capacity for deductive reasoning, sustained attention, planning and clear expression.

The findings that emerged were startling. Half the sample averaged less than 6.4 hours of sleep a night—a pattern that was associated with impaired problem-solving, reasoning and verbal acuity. Those who routinely slept six hours a night or less flubbed more questions based on spatial rotation or grammatical reasoning and left more tasks incomplete than those who got a full night’s rest.

Surprisingly, mere sleep deprivation—that is, one or two nights with little or no sleep—did not alter reasoning or verbal skills, though it did hobble short-term memory. This finding is reassuring, given that many professionals—think of hospital residents and airline pilots—have to make life-or-death decisions based on exactly this kind of erratic sleep schedule. If their short-term memory is compromised, they can always look up facts on their phones. But for split-second, life-changing decisions, they are on their own.

Regularly sleeping too little seems to be much more damaging than having one or two bad nights. Getting four hours of sleep or less for an extended period is equivalent to adding eight years to one’s age when it comes to test performance, the study shows—a significant decline. But a single good night can repair some of the damage: People who slept more than usual the night before they were tested ended up acing more of the cognitive tests.

So if you want to be articulate, solve a pesky problem, parallel park or organize an effective team—or even your closet—you’ll definitely need that nightly 7-8 hours of sleep.